Wednesday, April 22, 2009

Search Engine Optimization (SEO)

Search engine optimization, or SEO, is the art of placing your website in the first few pages of a search engine's results for a strategically defined set of keywords. In simple words, it means that your website appears on the first page of a search engine like Google when someone searches for your product or service.

Typically, the earlier a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website primarily involves editing its content and HTML coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines.

The acronym "SEO" can also refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems and shopping carts that are easy to optimize.

Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms and keyword stuffing that degrade both the relevance of search results and the user experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.

Why Search Engine Optimization?

# Major search engines command over 400 million searches every day, day after day. A well-designed search engine optimization (SEO) program helps you capture your piece of this pie, which you might otherwise be losing to your competition.
# SEO offers a much better return on investment than other forms of internet marketing such as banner campaigns and email marketing.
# Search Engine Optimization helps you capture targeted traffic... people who are already looking for the product or service you offer.
# Search Engine Optimization by an efficient SEO company is a long-term answer to your traffic woes. Once a website has been optimized for search engines, it can stay at the top for long periods of time.

Webmasters and search engines:
By 1997 search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.

Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEOs. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to discuss and minimize the damaging effects of aggressive web content providers.

SEO companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. Major search engines provide information and guidelines to help with site optimization. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website. Google guidelines are a list of suggested practices Google has provided as guidance to webmasters. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index and view link information.


Getting indexed


The leading search engines, Google, Yahoo! and Microsoft, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click. Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results. Yahoo!'s paid inclusion program has drawn criticism from advertisers and competitors. Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Webmaster Tools, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.
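
For reference, an XML Sitemap is just a list of URLs with some optional metadata. A minimal example, with placeholder URLs and dates, might look like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2009-04-22</lastmod>
      <changefreq>weekly</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>http://www.example.com/products.html</loc>
      <lastmod>2009-04-15</lastmod>
    </url>
  </urlset>

The file is uploaded to the web server and its location is submitted through Google Webmaster Tools (it can also be announced with a Sitemap: line in robots.txt).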

Search engine crawlers may look at a number of different factors when crawling a site, and not every page is indexed by the search engines. The distance of a page from the root directory of a site may also be a factor in whether or not it gets crawled.

Preventing crawling
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
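
As a small sketch, a robots.txt that keeps spiders out of a shopping cart and an internal search directory (the directory names here are only examples) could read:

  User-agent: *
  Disallow: /cart/
  Disallow: /search/

and an individual page can opt out of indexing with a robots meta tag in its head section:

  <meta name="robots" content="noindex, nofollow">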

Search engine optimization methodology includes:
1. Website Review:
To rank on Google or any other search engine, a site needs to be indexed first. Hence, the first step in the process of search engine optimization is to ensure that your site's pages can be crawled and indexed by the search engine spiders. In this step we analyze your code and identify possible spider stoppers in your website. These may include broken links, missing tags, a complicated link structure and other subtle factors that may have been overlooked when the site was designed.
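
As an illustration of one small part of such a review, the Python sketch below fetches a single page and reports links that fail to load. The URL is a placeholder; a real audit would crawl the whole site and also check tags, redirects and link depth.

  # Minimal broken-link check for one page (illustrative sketch only).
  import urllib.request
  from html.parser import HTMLParser
  from urllib.parse import urljoin

  class LinkCollector(HTMLParser):
      def __init__(self):
          super().__init__()
          self.links = []

      def handle_starttag(self, tag, attrs):
          if tag == "a":
              for name, value in attrs:
                  if name == "href" and value:
                      self.links.append(value)

  def check_page(page_url):
      html = urllib.request.urlopen(page_url).read().decode("utf-8", "ignore")
      collector = LinkCollector()
      collector.feed(html)
      for href in collector.links:
          target = urljoin(page_url, href)
          if not target.startswith("http"):
              continue  # skip mailto:, javascript: and fragment links
          try:
              urllib.request.urlopen(target, timeout=10)
          except Exception as error:
              print("Possible spider stopper:", target, "->", error)

  check_page("http://www.example.com/")  # placeholder URL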

2. Goal Analysis
Why are you investing in SEO services? What do you expect to gain from your SEO campaign? Do you have practical search engine ranking targets that you would like to achieve?

It is important to have goals clearly defined in an SEO campaign. These goals can be an increase in revenue from organic search engine traffic, an improvement in ROI, an increase in traffic, or simply rankings for branding. Keeping these goals in perspective, we customize and present the best SEO strategy possible given the time frame, resources and other practical constraints.

3. Competition Analysis

This step involves studying what your competition is doing. Websites of competitors that have undergone search engine optimization offer valuable keyword and optimization insights. Analyzing an already optimized site allows us to determine its lead over your site with regard to rankings. We then determine the SEO techniques being employed and the segments being targeted actively on search engines. After this step, we are able to tell you exactly what your competitors are doing and what you should do to beat them on the search engines.

4. Keyword Identification
Many online businesses, despite having great search engine rankings, either do not get enough traffic to their site or do not convert enough visitors. A major reason for this is that the keywords they are targeting may not be the keywords that are actually being searched for on the search engines. Keyword identification is a very important part of search engine optimization and involves researching keywords that will not only get great traffic but are also most relevant to your business.
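
There is no single formula for this step, but as a rough sketch, candidate keywords can be short-listed by weighing estimated search volume against relevance to the business. Every keyword and number below is invented purely for illustration:

  # Toy keyword short-list: weigh estimated monthly searches by relevance.
  candidates = {
      # keyword: (estimated monthly searches, relevance to the business 0..1)
      "shoes": (90000, 0.2),
      "running shoes": (20000, 0.7),
      "buy trail running shoes online": (900, 0.95),
  }

  ranked = sorted(candidates.items(),
                  key=lambda item: item[1][0] * item[1][1],
                  reverse=True)

  for keyword, (volume, relevance) in ranked:
      print(keyword, "-> score", round(volume * relevance))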

5. On Page Optimization

This step in the process of SEO involves the actual optimization of your web pages. Here, pages are optimized with regard to tags, link structure, images, body text, and other visible and invisible parts of the page.
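
As a simple illustration (the keyword, file names and copy are placeholders), the visible and invisible parts in question are ordinary HTML elements such as:

  <head>
    <title>Trail Running Shoes | Example Store</title>
    <meta name="description" content="Lightweight trail running shoes with free shipping.">
  </head>
  <body>
    <h1>Trail Running Shoes</h1>
    <p>Our trail running shoes are built for rocky, wet terrain ...</p>
    <img src="trail-shoe.jpg" alt="Blue lightweight trail running shoe">
    <a href="/trail-running-shoes/sizing-guide.html">Trail shoe sizing guide</a>
  </body>

The title, meta description, headings, alt text and descriptive link text all tell the spider what the page is about while remaining readable to a human visitor.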

6. Building Incoming Links
Good incoming links are often the difference between good rankings and great rankings. Incoming links can be reciprocal links or one-way links from directories, articles and news releases. Our link popularity campaigns are human-powered and completely manual.

7. Search Engine Submissions

Manual submission to search engines and directories is a process often referred to as search engine submission. Many companies advertise quick and simple software or services that submit your site to over 1,000 engines for a few dollars, but few tell you that the top 10 engines command over 85% of the internet's search engine traffic.

8. Analysis and Tweaking
Search engine optimization is a long-term solution to your traffic woes. Our comprehensive SEO services involve continuous fine-tuning of the website based on traffic trends and ranking trends. Search engines often change their algorithms... we tweak your website to compensate for those changes, enabling you to stay on top.
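
As a toy example of what this ongoing monitoring can look like, the sketch below compares two weeks of invented ranking positions and flags keywords that have slipped:

  # Toy ranking-trend check: flag keywords whose position dropped this week.
  # All positions are invented for illustration.
  last_week = {"running shoes": 8, "trail running shoes": 3, "shoe sizing guide": 15}
  this_week = {"running shoes": 12, "trail running shoes": 3, "shoe sizing guide": 9}

  for keyword, old_pos in last_week.items():
      new_pos = this_week.get(keyword)
      if new_pos is not None and new_pos > old_pos:
          print(keyword, "slipped from #%d to #%d - review that page" % (old_pos, new_pos))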

White hat versus black hat
SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and those techniques of which search engines do not approve. The search engines attempt to minimize the effect of the latter, among them spamdexing. Some industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.
