Lesson 7

Becoming an SEO EXPERT


Once you’ve finished this book, which is the first portion of this SEO course, you should consider yourself an Intermediate SEO.

So far, this book has taught you much. You now know all about keywords: what they are, how to find them, and where to place them. You’ve learned the basics and the importance of incoming links. You now know *more* about the nuances of ‘Link Popularity’ than probably any of your competitors. In fact, just knowing that some links are better than others, and how to tell the difference between natural and artificial link structures, puts you way ahead of some of the so-called “Pros” in the SEO business. Believe it! You also know how to evaluate the quality of an incoming link. You know there are certain types of Web pages you should avoid linking to because they can hurt your rankings. And you even know what the ultimate, very best link looks like: ideally, the only link on the homepage of an authoritative, high-PageRank site. Although such a perfect “10” link isn’t often realistically achievable, this standard gives you the litmus test against which you can evaluate all other links. You know about buying links: that purchasing some links can get you penalized by Google while others will help you rank better, and that where those links appear on the linking page matters!

You are now armed with at least 18 actionable linking sources and strategies:

  1. Links from mainstream directories like Yahoo and DMOZ.
  2. Specialty directory links within your niche market.
  3. Professional services links from suppliers and merchants with whom you do business.
  4. Links from Associations where you are an alumnus or a member.
  5. Links placed within your published ‘expert articles’ and syndicated to other sites.
  6. Press Releases that get indexed and build links.
  7. Links coming from testimonials you place on other people’s sites.
  8. Links from your forum comments you place on other sites.
  9. Links negotiated from learning who’s linking to your competitors.
  10. Reciprocal links, when they are on topic, make sense and look natural.
  11. Blogging to attract links.
  12. Buying abandoned Web sites with pre-existing links.
  13. Buying ads in Ezines and newsletters to build long-lasting links.
  14. Special ways to procure those coveted links from .edu and .gov domains.
  15. Providing specialized tools and resources to attract links.
  16. Using your affiliate program to gain links.
  17. Social networking to build links.
  18. Linkbaiting via social media – viral marketing.

You know the importance of hiring (or being) a good writer who can ‘write’ your Web pages to the top of the search listings, and how to smartly syndicate those articles without running into duplicate content issues.

You’ve learned:

  • That the number of links on a page matters.
  • All about run-of-site links: what they are and how they can hurt your ranking efforts.
  • The importance of maintaining consistency in your link structure.
  • How anchor text dictates what keywords Google and others *think* your pages are about.
  • Why getting links to your ‘deep’ pages is better than having links only to your homepage.
  • What types of links to avoid.
  • That PageRank measures a site’s importance as reported by the Google Toolbar.
  • That there’s a rough 80/20 ‘link balance’ factored into a page’s importance ratings.
  • That it takes MORE than having a great site with great content to dominate the top rankings: it takes calculated strategy and implemented tactics to build links to your great site with great content. Only then can you realistically expect to dominate the top rankings!

In Lesson 4 you learned about the indispensable SSI tool that evaluates the strength of a site in seconds. You learned how to spot the weak sites that you can beat in the rankings, as well as how to identify sites that are too strong to compete with for very specific keywords. You also learned there are safe and unsafe SEO practices, and that some so-called “brand name” sites are whitelisted, working under a different set of rules.

Lesson 5 gave you everything you need to know about choosing the right Domain Name. You learned that the right name can not only help your rankings but can also help you get more clicks from people who recognize the keywords they’re searching for within your URL. These are trusted domains from the SEPOV (search engine point of view). You also learned which types of domain names to avoid (.info, .biz, too long, too many dashes, double dashes) because they look spammy and the engines tend not to trust them.

In Lesson 6 you learned the details about how to build a search-friendly Web site while avoiding the mistakes that will handicap your rankings. You learned about a specialized file (robots.txt) and the importance of managing the spiders that crawl your site looking for content to index. You learned that it is VERY bad to change URLs or move Web pages, so planning your site architecture is crucial. And, you learned that if you DO have to move a site, then you’d better study the corresponding Advanced SEO Tutorial that teaches you EXACTLY how to do it so you don’t lose all of your hard SEO work to date.

Yes, indeed. Assuming you’ve studied your lessons well, you have all of this knowledge, and actually much, much more, at your fingertips to apply to your Web site or your SEO business — or both!

All of this knowledge does, indeed, elevate your level to *at least* Intermediate SEO! Now it’s time for you to move forward… to the Advanced SEO Level.

Becoming an Advanced SEO


As you know, Lessons 3, 4, and 6 of this book referred you to Advanced SEO tutorials. There are 22 of them, and they are now ALL available to you inside the membership area of Cadabra.co.za. These are the tutorials you will need to *master* before you’ll confidently call yourself an Advanced SEO. These 22 Advanced SEO tutorials contain the fine-tuning that will give you the confidence to take command of your own Web site’s search engine optimization efforts. Then, with a little experience under your belt, we suspect you’ll feel comfortable working on other people’s websites. These are the tutorials that, once mastered, qualify you to even *charge a fee* for your SEO consultation services if you desire to do so. You may be surprised to learn how much demand there is for truly good SEO services and advice these days.

Becoming an SEO Expert

What separates an Advanced SEO from a true SEO Expert comes down to four things:

  1. Experience you will gain by doing.
  2. Success you will gain over time after you begin doing.
  3. Competitive Knowledge – The knowledge and competitive intelligence that is typically gathered and compiled by the SEO ‘tools of the trade’ that are zealously used by the professionals. These are the tools that immensely accelerate the process of crunching the numbers while doing site analysis AND recognizing trends, as you build your own professional SEO intuition.
  4. Advanced Mastery – The additional mastery of PPC marketing is, arguably, the final feather in the Expert SEO’s graduation cap. Yes, it’s true that some SEOs focus only on the ‘organic’ rankings while others specialize in nothing but PPC. A great many SEOs do both; and they are able to successfully coordinate the two working in tandem. Regardless of the path you take, you’ll have the building blocks in place to excel in PPC marketing once you’ve built your foundation on ‘organic’ SEO strategies, tactics, and techniques.

Lesson 6

Site Architecture: Making Your Web Site Easy for Search Engines to Index

Website Architecture

Now that you know the importance of Keywords, Links, Domain Names, and Competitive Analysis, it’s time to understand how your Web site should be set up so that search engines will find it and list it within their index. After all, the search engines can’t rank your pages at the top of the search results if they don’t know about them. By the way, indexing is what search engine spiders are doing when they crawl a Web site to collect data about your Web pages. They store that data in a database called an index.

Processing and storing a Web page is referred to as indexing. Therefore, you must ensure your Web pages are as easy for search engines to access and index as possible. Believe it or not, many Web sites are inadvertently configured to be difficult for search engines to access. And some Web sites are actually blocking search engines while their owners wonder why their site isn’t doing better in the search rankings!

Little do they know it’s because of the indexing-roadblocks they’ve inadvertently placed in the way.

So, pay close attention because here’s where you will learn:

  1. How to avoid the common mistakes that keep Web pages from being indexed, and
  2. How to ensure that all of your important pages get indexed properly to give them a good chance to rank at the top of the search results.

Remember though, making your site search engine friendly, by itself, won’t propel you to the top of the rankings. A search engine friendly site is actually more about avoiding the mistakes that will prevent you from getting indexed or damage your search engine rankings. To achieve that top rank, however, you must totally understand the critical role that keywords, inbound links, overall site strength, etc. play as you use this understanding to your advantage when designing your site.

There are always two important points to remember about search engines and how they relate to your Web site:

  1. The first point is: the quality of your site counts. Search engines make their money through advertising. Showing ads to their users is their profit model, and the more users they have, the more money they make. The way a search engine gets more users is by providing the best search results. This means that, if your site is the most useful site to customers in your keyword category, then search engines want to rank you at or near the top of the search results. Indeed, their revenue stream depends on it.
  2. The second point to remember is: search engine spiders are really just computer programs. More precisely, search engines run a program called a spider that:
  • visits your Web site,
  • reads the text and links on your Web pages,
  • then decides what to do with your Web pages based on the information it reads.

They call this activity crawling your Web site. So, search engine spiders are computer programs that crawl Web pages. And, if you’ve ever used a computer, you know that computer programs break sometimes, especially if you overtax them. You may have noticed that your own computer starts to slow down and may even crash if you have too many applications open. It’s the same with a search engine spider.
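To make that loop concrete, here is a minimal sketch of a spider in Python. It crawls a toy in-memory link graph instead of real Web pages (all page names are hypothetical), but the steps are the ones just described: visit a page, collect its links, queue the new ones, repeat.

```python
from collections import deque

# Toy link graph standing in for a tiny Web site (hypothetical pages).
PAGES = {
    "/": ["/products.html", "/about.html"],
    "/products.html": ["/", "/muumuus.html"],
    "/about.html": ["/"],
    "/muumuus.html": ["/products.html"],
}

def crawl(start):
    """Breadth-first spider: visit a page, queue its links, repeat."""
    queue = deque([start])
    indexed = []          # pages the spider has "indexed", in crawl order
    seen = {start}        # URLs already queued, so we never crawl twice
    while queue:
        url = queue.popleft()              # grab the next URL from the list
        indexed.append(url)                # "index" the page
        for link in PAGES.get(url, []):    # scan the page for links
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return indexed

print(crawl("/"))  # → ['/', '/products.html', '/about.html', '/muumuus.html']
```

Notice the `seen` set: without it, the spider would revisit the homepage forever. That is exactly the kind of loop a confusing site structure (or session IDs, covered later in this lesson) can trap a real spider in.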

If your Web site is laid out in a confusing and disorganized fashion, or if the links between your pages are difficult for a search engine spider to find, your site is not going to be crawled as efficiently as you would like. This means that some of your pages will get missed. It also means your site won’t be crawled very often and your listings won’t be fresh. That puts you at a disadvantage when it comes to getting new pages indexed; and if your pages don’t make it into the index, they certainly can’t be ranked highly.

Remember, there are billions of Web pages on the Internet. Search engines have to make the most of their available time and resources to crawl all those pages. It’s your job to make sure crawling your pages is quick and easy for the search engine spiders. Otherwise, you risk having your Web pages ignored by the search engines.

Remember, this lesson is focused on making your site spider-friendly. The tactics and strategies covered here won’t rocket you to the top of the search engines (you’ll need to use incoming links and keyword strategies to do that), but they will help you avoid the mistakes that can nuke your rankings by locking you inside the starting gate. In other words, if it’s difficult for search engine spiders to crawl your site, you’ll be handicapped in the ranking race regardless of all your other good efforts.

Keep Your URLs Simple

Search engine spiders find the pages on your Web site by following links. They work in similar fashion to the way you use your browser—only much more quickly. They download a page, scan it for links, and store those links in a list. Once they’re done scanning the page, they grab the first link from the list and repeat the steps until they’ve followed all of the links one by one. Of course, this is a simplified explanation, but it essentially defines the process of how a search engine finds Web pages.

Many Web sites, especially e-commerce sites, use dynamically generated URLs.

These are Web addresses that look something like this (a hypothetical example):

http://www.example.com/products.asp?categoryid=105&itemid=1023&color=4

These dynamic URLs are automatically generated by pulling variables out of a database to match the product specifications a customer is looking for. Dynamically generated URLs usually contain lots of non-numerical, non-alphabetical characters, such as ?, &, +, and =.

For example, a site that sells Hawaiian muumuus might have a page with a dynamically generated URL along these lines (again, hypothetical):

http://www.example.com/muumuu.asp?categoryid=7&color=4&size=2

This is opposed to a static-looking URL, which is a bit easier on the eyes:

http://www.example.com/hawaiian-muumuus.html
Although most search engine spiders are capable of crawling these long and confusing, dynamically generated URLs, it is best if you can avoid using them at all. When all else is equal, a Web site with short, static-looking URLs is more likely to achieve a higher number of pages indexed by the search engines than a comparable site that produces dynamically generated (DG) Web pages. Dynamic URLs are often the source of duplicate content, which can affect your site’s ranking.

Many content management systems (CMS) with dynamic URLs make the same content reachable via multiple varied URLs. It is important to avoid getting those duplicate pages indexed if possible. You may need to make use of the canonical tag to tell the search engines which URL to index and which ones to ignore. For more information on the use of the canonical tag, read: Are You Using the Canonical Tag Correctly? However, there are times when the advantages of DG pages outweigh the SEO drawbacks. So, IF your site absolutely must rely on dynamically pulling content from a database to create its URLs, it’s still possible to have your URLs appear static by using a tool like mod_rewrite. mod_rewrite is a tool that rewrites a dynamic URL as a static URL on the fly. This is commonly done for SEO purposes to improve page navigation for spiders. When you are ready to apply this Advanced SEO tactic, be sure to study our extensive mod_rewrite

Advanced SEO Tutorial:

Getting Your Dynamic Sites Completely Indexed with mod_rewrite
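To give you a feel for what that tutorial covers, here is a minimal mod_rewrite sketch for an Apache .htaccess file. It serves the clean, static-looking muumuu URL from the real dynamic script behind the scenes (the file and parameter names are hypothetical; `RewriteEngine` and `RewriteRule` are standard Apache mod_rewrite directives):

```apache
# .htaccess: show visitors and spiders a static-looking URL,
# while the dynamic script actually builds the page.
RewriteEngine On
RewriteRule ^hawaiian-muumuus\.html$ /products.php?categoryid=7&color=4 [L]
```

A request for /hawaiian-muumuus.html is handled internally by the dynamic script, but the spider only ever sees the clean URL.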

By the way, if the tutorial mentioned above seems a bit too complicated for you, then share it with your web or tech people and have them handle the details of turning your dynamic URLs into search engine-friendly (and people-friendly) web addresses. This is something that you definitely want to get right! And, even though it’s a little complicated, it’s worth the effort, as it magically renders your complex, dynamic, ugly-looking URLs into simple links that search engine spiders and PEOPLE just love to follow. In essence, you get to eat your cake and have it too.
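Backing up to the canonical tag mentioned earlier: when the same content remains reachable at several URLs, one line in the <head> of each duplicate page tells the engines which version to index (the URL below is hypothetical; rel="canonical" is the standard attribute):

```html
<!-- Placed in the <head> of every duplicate version of the page;
     tells search engines which single URL should be indexed. -->
<link rel="canonical" href="http://www.example.com/hawaiian-muumuus.html">
```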

Meta Tags: Do They Matter?


Meta tags are non-displayed text written into the source code of your HTML document, intended to describe your page to the search engine for the purpose of cataloging the content of your page. A considerable amount of undeserved attention has been given to Meta tags, and an enduring myth has evolved in the process.

  • Meta Tag Myth: Meta tags are what propel pages to the top of the search engines (wrong!).
  • Meta Tag Reality: Meta tags, while potentially useful for describing your Web page contents to a web browser or search engine, have no appreciable effect on actual search engine rankings whatsoever. None!

There is a semi-absurd ongoing debate as to whether or not Meta tags should be included in your HTML document. Let’s put that debate to rest once and for all. Surprisingly, the answer is a resounding YES! Here’s why.

While it is true that Meta tags will not help your rankings, it is also true that the Meta description tag should absolutely be included in every Web page document that you want described on the search engine results page. That’s because the Meta description tag is used by many search engines as the summary description for your page when your page is listed in the search results. The contents of the Meta description tag often serve as the sub-headline and the sales description for your link! (Remember: your <title> tag is the headline that is displayed in the search results.) The Meta description tag, when displayed in the search engine results, helps the searcher decide whether or not your page is relevant to their search. It’s what compels a real person to click your link. After all, that’s the reason for being listed by the search engine in the first place!

If you omit the Meta description tag, then the search engine is likely to fabricate a description for your site based on arbitrary text gleaned from somewhere on your page. Here’s an example of a terrible, yet real life search engine results description we found when searching for Hawaii scuba diving:

  • Link Title: Scuba Diving Maui Hawaii
  • Summary description: click to go home

Now, we’re pretty sure that this company didn’t really want “click to go home” used as their page description, but that’s what they got because they failed to use a Meta description tag. Another possibility is that the search engine will omit the summary description entirely if it fails to find anything useful within your page to use as a summary. In either case, a potential site visitor is less motivated to click your link if you fail to properly utilize the Meta description tag.

Hence, in every case where you want a description for your link within the search engine results, be certain to include a relevant and enticing Meta description tag.

The following example illustrates the HTML source code located at the very beginning of a very basic Web page. Below you can see the Meta description tag, and its contents, highlighted:

<head>
<title>Cell Phone Accessories</title>
<meta name="description" content="The latest in cell phone accessories at the lowest prices for every known brand of cell phone on the planet!">
</head>

The only other Meta tag that you may hear discussed is the keyword meta tag, which is no longer observed or used by the major search engines:

<html>
<head>
<title>Cell Phone Accessories</title>
<meta name="description" content="The latest in cell phone accessories at the lowest prices for every known brand of cell phone on the planet!">
<meta name="keywords" content="cell phones, Leather Cases, Cellphone holders, Antennas, antennaes, chargers, batteries, face plates, flashing batteries, hands free head phones, headphones, range extenders, bateries">
</head>

We do not recommend wasting any time at all on the meta keyword tag. The only time you should be concerned with it is if you’re working on an old web site that has spam-laden meta keyword tags; if you encounter these, remove them, as they can still be seen as a technique used by spammers. Oh, and by the way, NONE of the other Meta tags have any effect on search engine rankings whatsoever! They never have and probably never will—no matter what you’ve heard!

How to Customize the Way Your Listings Appear in Google

Google now provides a way for you to customize how your listings appear in the search results. Previously, you were limited to just titles and descriptions, but now it’s possible to get star ratings, product images, prices, business addresses, and more included with your search results listing. For example, take a look at this cafe listing from Yelp.com:


You can see star ratings, number of user reviews and average price range for a meal. Google is displaying this data using a feature they call Rich Snippets—information extracted from indexed pages that have special tags embedded into their HTML code. Those tags come in two forms, microformats and RDFa.

While this might sound complicated, these formats are about as easy to master as regular HTML. And, although developers haven’t yet settled on a standard, the fact remains that you can use either microformats or RDFa (we find microformats a little easier). Then you simply denote certain data on your pages by wrapping it in tags with descriptive class attributes. For example, to create the listing above, you would wrap portions of your page’s data in tags that describe that data, as seen below.
<div class="hreview">
  <span class="name">Cafe Cakes</span>
  <span class="rating">4</span> out of 5.
  <span class="count">28</span> reviews.
</div>

Wrapping everything in a hreview div tag lets Google know it’s a review. Then you use name, rating, count and pricerange span tags to add the other information. So far, we’re seeing these Rich Snippet listings just for restaurants and cafes, but Google is working on rolling them out to more categories.

Google provides examples and tutorials on Rich Snippets for the following:

  • Reviews
  • People
  • Products
  • Businesses and organizations

Currently, business directories and other sites based upon reviewing and categorizing other businesses stand to gain the most from having Rich Snippets added to their pages. However, as Google expands this program, it’s likely to become relevant to many other types of Web sites as well. In general, listings that are enhanced with Rich Snippets can expect an increased click-through rate—so we highly recommend them.

Be Careful with Session IDs and Dynamic URLs

Session IDs are unique identifiers, often embedded in URLs, that allow a Web site to track a customer from page to page. For example, when shopping at an ecommerce site, session IDs are used to keep track of the items in your shopping cart. For search engine spiders, however, session IDs can cause a problem because they can inadvertently create a huge number of links for the spider to crawl. The danger is that the spider might repeatedly index what is essentially the same Web page over and over. Spiders can get trapped in a loop as each newly crawled page dynamically generates even more links to follow. They call this a ‘spider trap.’

Here’s how a system that uses session IDs can give the appearance of generating an endless number of pages within a single site. For example, a link with session ID tracking that looks something like this (a hypothetical example)…

http://www.example.com/products.asp?sessionid=4459DD

…is served to the spider when it first downloads one of your Web pages. That page is then processed, but when the spider returns to the site to download more pages, it finds another URL that looks like:

http://www.example.com/products.asp?sessionid=9B7F21
It’s actually the same page—only with a different tracking session ID variable. But to the spider it looks like a brand-new URL, so the spider can get trapped downloading the same page over and over again. This problem can also result in duplicate content getting indexed, which can lead to a reduction in ranking. Although Google is constantly striving to improve its ability to crawl session IDs, we recommend you avoid using them whenever possible. And when you must use them, you should avoid giving search engine spiders access to them.

The best plan is to not use session IDs until you actually need to track the state of your customer, such as when they add items to their shopping cart. You can also store your session IDs in cookies instead of your URLs; most web applications can be configured to store user state in cookies. And, once again, if this sounds complicated, then have your web or tech people handle this element of your Web site architecture. What YOU need to know is that the more dynamic variables you include in your URLs, the harder it will be for search engines to index your pages. Strive to keep your URLs simple and free of dynamic elements.
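As one illustration of keeping session state in cookies rather than URLs: if your site happens to run on PHP, a few standard php.ini session directives accomplish exactly that (the directive names below are PHP’s own; other platforms have their own equivalents, so treat this as a sketch for your tech people):

```ini
; Keep the session ID in a cookie instead of the URL
session.use_cookies = 1
session.use_only_cookies = 1

; Never rewrite links to embed the session ID into URLs
session.use_trans_sid = 0
```

With settings like these, spiders see one clean URL per page while real customers are still tracked from page to page.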

Sitemaps: What, Why, and How

First, let’s start with the simple fact that there are two different types of sitemaps. According to Bing, the difference between Sitemap and sitemap is:

Sitemap, the capitalized version, refers to the XML-based files created specifically for the search engine crawlers. This version of the Sitemap provides the crawlers the “most important pages and directories within their sites for crawling and indexing.”

sitemap, the lowercase version, is an HTML-based file that is for both the Web site user and the MSNbot. It’s essentially a simple, clean, and organized list of all the pages on your Web site.

An HTML sitemap is an on-site Web page that links to all the other pages on your Web site. It ensures that any spider crawling your site can easily and quickly find and index all of your site’s Web pages. This type of sitemap is for spiders first and foremost, but it can also be useful for Web site visitors. By linking your homepage to your HTML sitemap, you ensure that each page on your site is only one click away from your sitemap and only two clicks away from your homepage. This is the optimum Web site structure in terms of making Web pages easy for the search engine spiders to find.

As you now know, search engine spiders find new pages by following links from pages that are already in their index. Thus, if you want a spider to crawl a new Web page, it needs to find a link from a page that is already indexed in the search engine. However, unless you have a very small site, linking to every page on your site from your homepage would look messy and unprofessional to your customers. The HTML sitemap enables you to accomplish this objective cleanly and professionally. Your sitemap provides a list of links that Google’s spider can easily follow, helping you get new pages indexed without cluttering your home page with links. That’s assuming, of course, that Google has already indexed your sitemap. So, by placing a link on your home page to the sitemap, and then links from the sitemap to the rest of your important pages, you make all of your site’s pages easy to find and index.

We’ve mentioned the importance of linking to your HTML sitemap from your home page, but, for good measure, you should also place a link to your HTML sitemap on every single page of your site. That way, even if a search engine can’t reach your homepage for some reason, it can still easily access your sitemap and find all your other pages. By the way, this architectural element should be considered standard operating procedure, as the search engines themselves recommend you use a sitemap to ensure that all of your Web pages get indexed.
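An HTML sitemap needs nothing fancy; a plain page of links is enough for spiders and visitors alike. A minimal sketch (all page names below are hypothetical):

```html
<!-- sitemap.html: one link to every page on the site -->
<h1>Site Map</h1>
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/products.html">Products</a></li>
  <li><a href="/hawaiian-muumuus.html">Hawaiian Muumuus</a></li>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>
```

Link to this page from every page on your site and every other page is at most two clicks from your homepage.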

HTML Sitemaps for Large Sites

If you have a large site, you may be wondering whether it’s better to create one large HTML sitemap or several smaller ones. There are a couple of factors to consider:
First, the degree to which search engines will index pages and follow links on a page is largely determined by the quality of links pointing to that page (or to the site the page is on). If a site doesn’t have many incoming links, then it’s a good idea to make pages smaller and put fewer links on them. A good rule of thumb is to keep your pages under 101k of HTML code, and to put no more than 100 links on a page. Popular sites can easily get more of their pages indexed, but to be safe, use 101k of HTML code (don’t count the file size of images) and 100 links as the upper limits. Therefore, if your entire site is fewer than 100 pages, and you can create an HTML sitemap page smaller in file size than 101k, then it’s beneficial to use only one sitemap that points a search engine spider to the rest of your site.

There is an advantage to having only one HTML sitemap placed within the root domain: it enables the search engine spider to find and index all of your pages without having to traverse your site any deeper than two links beyond your home page. That’s one link from your home page to your sitemap, then one more from your sitemap to every other page on your site. This makes it easiest for the spiders to find every page on your site.

However, once an HTML sitemap approaches 100 or so links, or its file size approaches 101k (excluding images), then it’s time to start splitting it up into smaller sitemaps. We’d suggest linking to each sitemap from all your pages. Five sitemaps, for example, would require five links from each page instead of one. The end result would be that a spider would still only need to follow a maximum of two links beyond your homepage to reach all your pages. If, for some reason, it isn’t practical to place all five links to your five HTML sitemaps on the home page, then we’d suggest a single link on the home page that points to a master sitemap which, in turn, contains the five links to the five smaller sitemaps. This would require the search engine spider to travel three links deep into your site to locate and index all of your pages—which is still quite good. And again, link to your master HTML sitemap from all your pages, not just your home page.

Finally, we recommend that you avoid forcing a spider to crawl any deeper than three links beyond your home page to locate the rest of your pages. Using the site structure outlined above should allow you to easily accomplish that objective.

XML Sitemaps: How to Get Your Difficult-to-Index Pages Fully Listed


You should carefully note the difference between an onsite sitemap (HTML sitemap) and an XML Sitemap. Your Web site should utilize both, as both are an important part of helping your site get, and stay, indexed by search engines. The regular HTML sitemap (as explained in the previous two chapters) is simply an on-site Web page that links to all the other pages on your Web site. It ensures that any spider crawling your site can easily and quickly find and index all of your site’s Web pages. On the other hand, the XML Sitemap (aka a Google Sitemap, although it’s used by Yahoo and Microsoft as well) is a special file that provides search engines with specific directives about which pages to crawl and how often. Search engines are not required to strictly obey these directives, but they do tend to use them as guidelines. This type of Sitemap is especially useful for very large sites that want to get all their pages listed. A great example of a large site that NEEDS a good XML Sitemap is an eCommerce site that wants to get its entire list of product pages indexed and listed in the search results.
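For reference, here is what a minimal XML Sitemap looks like under the Sitemaps.org protocol (the URL and date are hypothetical; the urlset, loc, lastmod, changefreq, and priority elements are the protocol’s standard tags):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/hawaiian-muumuus.html</loc>
    <lastmod>2011-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- one <url> entry per page you want crawled -->
</urlset>
```

Large sites typically generate this file automatically from their database and resubmit it whenever pages are added.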

Please note that neither the HTML sitemap nor the XML Sitemap plays any role in where your pages will rank. Both are simply vehicles for getting your Web pages indexed most efficiently. Where your pages rank depends on your incoming links and other optimization factors. Bing also subscribes to the XML Sitemap protocol; you can submit your XML Sitemap in the same format that you use to submit it to Google by using their Webmaster Tools service. For more on the Sitemaps protocol and how it can help your pages get (and stay) indexed by the top three search engines, be sure to visit Sitemaps.org: http://www.sitemaps.org

Regardless, a Google XML Sitemap is really no replacement for clean and crawlable URLs, so a tool like mod_rewrite still comes in handy if you are attempting to simplify your dynamic URLs. If this sounds complicated, give it to your Web or tech people, who will probably tell you this is actually pretty simple once you’ve done it.

An important note: if your site is already ranked in the search engines, be very careful about changing your URLs. Carelessly modifying your URLs after your pages have already been indexed and ranked is one of the worst SEO mistakes you can make! If you fail to effectively tell the search engine where to find the new location of a page, then the search engine will assume the page has disappeared and will drop it from its index. Not good. So, if you do change any URL, you must redirect the old URL to the new location with a permanent (301) redirect. Visitors and search engines that are looking for the old URL will then be automatically sent to the new URL, sparing you lost search rankings while accommodating your site visitors.

Of course, in a perfect world, you'll never need to move a Web page or Web site. However, if you must, then this tutorial is critical to your success. Without it you risk causing grave damage to your Web site's rankings, especially if your pages are already doing well in the search results. You could easily lose all of your rankings if you get this critical procedure wrong. You have been warned! Note that if you're using mod_rewrite to rewrite your URLs, the 301 redirect can be added to your mod_rewrite code.
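On Apache servers, for example, a 301 redirect is typically just a line or two in your .htaccess file. Here's a minimal sketch (the file names and domain are hypothetical placeholders):

```apache
# Permanently redirect a single moved page to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Or, if you are already using mod_rewrite, attach the 301 there instead
RewriteEngine On
RewriteRule ^old-page\.html$ http://www.example.com/new-page.html [R=301,L]
```

Either form tells browsers and search engine spiders that the move is permanent, so link value and rankings are passed along to the new URL.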

How to Use Robots.txt for More Targeted Web page Indexing


Your Robots.txt file is a tricky-sounding name for a simple text file that is placed in the root directory of your Web site. Its purpose is to provide search engine spiders with crawling directions specific to your site. In other words, your robots.txt file tells search engine spiders which pages NOT to index. A common misconception is that a robots.txt file can somehow be used to encourage search engines to crawl a site. Not true! Most pages are eagerly spidered by search engines without requiring additional encouragement. As you are probably now noticing, an important part of SEO is identifying the elements that cause indexing difficulties for the spiders and eliminating them. So, why would you ever want to tell a search engine NOT to index some of your pages? Well, because search engine spiders function with limited time and resources when indexing sites. Your site is therefore better served by focusing those resources on getting your important content, product listings, and sales pages indexed.

Case-in-point: Chances are good that you do NOT want a search engine to index your shopping cart. There is typically no benefit to you when your shopping cart checkout pages show up in the search engine results. Therefore, you would use your robots.txt file to make sure search engines don't waste time indexing your shopping cart. That way they are more likely to spend their time on your site indexing your more important sales or informational content pages. Other pages you'll want to keep search engine spiders away from include anything in your cgi-bin folder, as well as directories that contain images or other sensitive company data. Whenever there isn't any benefit to having a Web page (or image) displayed in the search results, you should forbid the spiders from indexing it by placing the appropriate command within your robots.txt file.

That will not only help focus the search engine’s resources on your important pages, but will also provide the useful side benefit of protecting your site from hackers who may otherwise use search engine results to acquire sensitive information about your company or site. Search engine spiders are typically voracious about indexing anything they can find on the web, including sensitive areas like password files, so you must be careful. The robots.txt file can help you layer in some of the protection you need.
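Putting the above together, a typical robots.txt for an eCommerce site might look like this (the directory names are hypothetical; adjust them to match your own site's structure):

```text
User-agent: *
Disallow: /cart/
Disallow: /cgi-bin/
Disallow: /images/
```

Everything not listed on a Disallow line remains fully crawlable, so your content and product pages get the spiders' full attention.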

By the way, there's one more issue to be aware of that relates to the robots.txt file. A surprising number of sites have inadvertently set up their robots.txt files to prevent search engine spiders from crawling any portion of their Web site (oops!). For example, the following two lines, when added to your robots.txt file, are enough to keep all major search engines from ever crawling your site. In other words, the following command tells ALL search engine spiders to simply go away:

User-agent: *
Disallow: /

This has been an area of confusion for some people. They use the wrong command and then wonder why they can't find their site listed in the search engines. So, be very careful this doesn't happen to you! If you decide that you want to block a specific search engine spider, you should put the name of the spider to block on the User-agent line—NOT the asterisk. The asterisk (*) symbol is a wildcard meaning all. The Disallow line is where you put the directory that should not be indexed. The forward slash (/) indicates the root directory, in other words your entire site. As you can see, the robots.txt directive above is a total shut-out of all search engines from your entire site. On the other hand, entries like this:

User-agent: *
Disallow: /cgi-bin/

…should (we say "should" because it's technically optional for search engines to obey the robots.txt directives) prevent all URLs in the /cgi-bin/ directory from being crawled. Keep in mind that these directives are case-sensitive. If you want the spiders to crawl every Web page they can find on your site, there is no need for a robots.txt file. The only time you actually need to use robots.txt is if you want to restrict the crawler from some portion of your site. Google's Webmaster Tools provides a report which will tell you exactly what URLs Google has attempted to crawl on your site but was restricted from crawling by your robots.txt file.
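For example, to block only Google's spider from a hypothetical /print-versions/ directory, you would name that spider on the User-agent line. All other spiders, having no record of their own, would remain unrestricted:

```text
User-agent: Googlebot
Disallow: /print-versions/
```

Each spider obeys the record that names it specifically; it only falls back to the asterisk (*) record when no named record applies.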

You can access Google Webmaster Tools at: http://www.google.com/webmasters/sitemaps

To see this report, go to Google’s Webmaster Tools (sign up and register your site if you haven’t already), click the Diagnostic tab, then click the Web crawl link. Finally, click the report that says URLs restricted by robots.txt to see what pages Google is not indexing due to commands in your robots.txt file. Google’s Webmaster Tools also offers a special robots.txt debugger which allows you to test specific URLs to see if your robots.txt file allows or blocks spider access. If you’re having problems getting pages indexed, be sure to test those pages against your robots.txt file using Google’s tool and see if you have a statement blocking Google’s spider. If you do, Google will show you what line in your robots.txt the blocking statement is on.
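If you'd like to sanity-check your robots.txt rules before uploading them, you can approximate what Google's debugger does locally. Here's a minimal sketch using Python's standard urllib.robotparser module (the rules and paths are hypothetical, mirroring the examples above):

```python
from urllib import robotparser

# Hypothetical robots.txt rules, mirroring the examples above
rules = [
    "User-agent: *",
    "Disallow: /cgi-bin/",
    "Disallow: /cart/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A generic spider ("*") may not fetch anything under the blocked paths
print(rp.can_fetch("*", "/cgi-bin/search.cgi"))    # False
print(rp.can_fetch("*", "/cart/checkout.html"))    # False

# ...but the rest of the site remains fully crawlable
print(rp.can_fetch("*", "/products/widget.html"))  # True
```

This is only an approximation of each engine's own parser, but it will catch the classic "Disallow: /" mistake before it costs you your rankings.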

Be Careful When Using Frames, JavaScript, and Flash


There are a lot of myths and misunderstandings about the way search engines handle Frames, JavaScript, and Flash pages. The fact is that Web pages using these formats can, at least in theory, be optimized for search engines, but each presents its own unique challenges and difficulties. As a general rule, they're best avoided, since pages that don't use them are much easier to optimize for search engines. However, if you find that you must use them, or if you're optimizing a site that's already built around these technologies, here's what you need to know to minimize their disadvantages.

How Pros Use Frames and Still Rank at the Top

Ever since we can remember, the use of the frame (also used with frameset) tag has been a thorn in the side of SEOs. They’re tricky to work with and if not done correctly they can kill your site’s chance of being crawled correctly and therefore ranking well in the engines. Because of this we’ve taken the stance that you shouldn’t use the frame tag unless you absolutely have to. The alternative we’ve suggested is the iframe tag, which is similar in nature and has been applied without issue. However, these two tags are different from each other and we’ve decided to take a little time to re-evaluate them and their use within your SEO campaign. First off, frames are an HTML element that pulls content in from another URL to the URL of your choice. It’s like copying everything on a page to mirror it on another page. Sometimes there are solid reasons to use frames on your Web site. Perhaps you have a legacy site that will take too much time and energy to change over. Perhaps you’re doing it for your affiliate campaigns. Regardless of why, if you simply must use the frame or iframe tags on your Web site, here are some guidelines to assist you in overcoming the potential disadvantages.

How Spiders Index Frames

These days Google and Bing are very good about correctly indexing content within <frameset></frameset> and <iframe></iframe> tags. However, the very important thing for you to understand is how the engines interpret that content. They crawl and index the non-framed content on the site as one page and then crawl and index the framed content as an entirely separate page. That means they will not associate the framed content with the main page it's being presented on. Think of the framed content like a large image: the img alt attribute tells search engines, and your visitors when the image doesn't load, what the image is. Similarly, there is a special tag called the <noframes> tag, which is designed to tell users and search engines what the framed content is when frames are disabled. Basic use is something like this…

<noframes>Put your keyword-rich frame describing content here.</noframes>

When using the <noframes> tag, ideally you want it as high up on the page as possible, and it should contain text and/or links about the framed page (written using keywords pertinent to your site). This content will then be readable by the search engine spiders as well as by people whose browsers do not support frames.

Note: It is very important that the <noframes> tag is outside of the <frameset> or <iframe> tags so the search engines can find it.

We also feel it's important to remind you that we do not use frames at all. We feel that frames do not add anything to search engine findability — nor do they add to product sellability. We have even seen problems caused by the framed content getting indexed directly. When this happens, users land on a page that potentially lacks navigation or other necessary elements, making the page completely dysfunctional.

Understanding the frame tag compared to the iframe tag

We put together a sample template of a standard page using the frame tag to help display what areas will be indexed, and what won't.

<title>Title text here will be read by spiders, regardless of frames</title>
<meta name="description" content="Text here will be read by spiders that read meta tags, regardless of frames. It is important to note that you should include a META tag description summary of all the frames on this particular page.">

Also note that many search engines index the ALT attributes in the <IMG> tag. If your site mainly consists of graphics (bad idea!), you can also use the HTML ALT attribute to describe your page.

<meta name="keywords" content="Text inside the keywords tag is typically ignored by most search engines regardless of frames">

Text here will be read by most search engine spiders just as if you were not using frames — this is the place to load descriptive text including your keywords. Make sure that the text is in a readable format, as it will likely get used in the search results for the snippet or description text.

<frameset cols="25%,50%,25%">
<frame src="frame1.html" />
<frame src="frame2.html" />
<frame src="frame3.html" />
</frameset>

Text here will also typically get indexed by most engines, but results may vary. However, the content within the three files referenced above (frame1.html, frame2.html, and frame3.html) will not be indexed along with this page.

<noframes>Text here will also typically be read and indexed by spiders.</noframes>

Now we'll take a moment to explain and show an example of the iframe tag.

The use of the <iframe> tag is increasingly common for embedding dynamic information and all sorts of widgets onto a site. Facebook's "Like" button widget is an excellent example that uses the <iframe> tag to do its magic. What many don't realize is that these embedded iframe widgets typically don't generate an indexable link back to the widget owner's site – which is one of the main reasons they created the widget in the first place! However, if you set up the code as below, with indexable content within the iframe tag, that content will get indexed (including any links in that text).

<iframe src="http://www.facebook.com/plugins/like.php" scrolling="no" frameborder="0" style="border:none; overflow:hidden; width:150px; height:50px;" allowTransparency="true">Content and links placed here will get indexed by most engines, as this is visible text on the page. Anything that is pulled in using the iframe tag will not get indexed with the page. So if you want your iframe-powered widgets to generate a link back to your site, make sure to include that code in this area.</iframe>

In summary, the frame tag and frame-based Web sites are something to avoid whenever possible, even though there are ways to get some content indexed. The iframe tag, on the other hand, when used correctly, can be a good method of link building. Just be aware that the content pulled in by the iframe tag is not going to get indexed as if it were static HTML code on the page. All in all, the take-away from this topic is to avoid using frames on Web pages that you want indexed in the search engines.

1. Understanding JavaScript

While it is true that Google can now find and follow JavaScript links, it is also true that Yahoo and Bing cannot. Therefore, you should think twice about creating JavaScript links. If you feel they are absolutely essential to your Web site's overall design scheme, then pay close attention here to ensure you are setting them up properly. JavaScript links typically use a function called an onclick() event to open links when clicked. The onclick() event then calls JavaScript code which tells the browser what page to open. That code can either be on the same page, or it can be embedded in a separate file. Currently, if the code called by the onclick() event is on the same page, Google will process the code, crawl the URL listed, and pass anchor text and PageRank to that URL. However, if the code is in a separate file, then Google does not process it.

Here are some examples of code that Google can understand, with links that will pass both anchor text and PageRank:

<div onclick="document.location.href='http://foo.com/'">
<tr onclick="myfunction('index.html')"><a href="#" onclick="myfunction()">new page</a>
<a href="javascript:void(0)" onclick="window.open('welcome.html')">open new window</a>

Remember that, even though Google can process these links, JavaScript is not the ideal format for your links. Neither Yahoo nor Bing can read and process these JavaScript links. In addition, JavaScript links generally fail to work properly on mobile devices and screen readers.
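If you do need click handling, a safer pattern is progressive enhancement: put the real destination in the href attribute, where every spider can follow it, and layer the JavaScript behavior on top. A minimal sketch (welcome.html is a hypothetical page):

```html
<!-- The real URL lives in href, so all spiders can crawl it and pass
     anchor text and PageRank. The onclick handler only fires in
     JavaScript-capable browsers; "return false" cancels the normal
     navigation so the window.open behavior takes over. -->
<a href="welcome.html" onclick="window.open(this.href); return false;">open new window</a>
```

Browsers without JavaScript (and every search engine spider) simply follow the plain link, so nothing is lost.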

You should also be aware that using JavaScript to cloak (i.e., hide) links for PageRank sculpting, or to prevent paid links from being indexed, now requires that you move your code to an external file if you want to prevent Google from finding those links. Or, you can simply add the nofollow attribute to any links you'd like hidden from Google in order to facilitate PageRank sculpting.

Another option is to put the script contents in a remote.js file. Here’s how: In your .html page, reference the remote.js file like this:

<script type="text/javascript" src="javascript/remotefile.js"></script>

Then place your JavaScript code in the remote file (i.e., remotefile.js). The bottom line is that there are plenty of other ways to spruce up your pages. Besides, studies have shown that complex JavaScript and Frames in general tend to actually reduce sales conversions.

2. Macromedia Flash

Flash pages are those animated, eye-pleasing, motion-picture-style, high-end Web pages, and they cannot be easily optimized for search engines. So you put your site at a ranking disadvantage if your site's architecture relies heavily on Flash. Even though there are work-arounds and exceptions, in general, search engines have difficulty indexing Flash in any meaningful way. As you know, your keywords provide one of the most valuable elements for search engines to determine what your Web pages are about. However, it's difficult to impossible for search engines to reliably extract keywords from Flash files. This means that any part of your Web page that uses Flash will generally NOT lend itself to top rankings. However, you can still use some Flash on your pages as long as you observe these guidelines:

  1. Don't make your entire page one big Flash file. Make sure your page has abundant indexable content outside your Flash file.
  2. If you're just using Flash to animate part of your page, and the rest of your page is in normal HTML and contains your keywords, then search engines will know what your page is about by reading that HTML (even though they'll likely ignore the Flash). However, if most of your page is embedded in a Flash file, then it will be very difficult for a search engine to know what your page is about. That puts your Web page at a serious ranking disadvantage.
  3. Use the <noembed> tag. This is a good approach to take if you simply must create all-Flash pages. Flash programmers know that any link to a Flash file must be enclosed in an <embed> tag. HTML also contains a <noembed> tag. This is where you should put the HTML version of whatever Flash you're using on that page. Not only does this give the search engine something to read, but it also provides an alternative for those users who don't have Flash installed in their browsers. Although Google is getting a little bit better at indexing Flash pages, they still don't do it well. So don't count on Flash pages to put your site on equal footing with non-Flash pages. You'll be at a disadvantage, even with Google. Sure, there are sites like Oprah.com that use heavy amounts of Flash and do quite well in the search rankings. But that's generally due to brand recognition and the accumulation of tons of links that propel them to the top of the rankings in spite of how unfriendly their site architecture might actually be to search engines. If you've got an Oprah-sized brand, and you really want that animated homepage, then by all means take the Flash route.

The bottom line is that Flash pages will always be disadvantaged in the rankings. But, if you must use them, then use the methodology outlined above to make sure the keywords you want indexed can be found outside your Flash files.
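To make the <noembed> approach concrete, here is a minimal sketch (the file name intro.swf, the dimensions, and the text are hypothetical placeholders):

```html
<!-- The Flash movie itself; spiders will largely ignore this -->
<embed src="intro.swf" width="600" height="400"
       type="application/x-shockwave-flash">
</embed>

<!-- The HTML version of the Flash content: readable by search
     engines and shown to visitors without the Flash plug-in -->
<noembed>
  <h1>Custom Widgets and Accessories</h1>
  <p>A keyword-rich HTML summary of whatever the Flash movie
     presents goes here, along with any important links.</p>
</noembed>
```

The <noembed> block should mirror the Flash content faithfully; stuffing it with unrelated keywords crosses into cloaking territory, which the engines penalize.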

Lesson 6 Review

In this lesson you learned:

  1. The importance of designing search-friendly pages using the correct site architecture that, ideally, places every page on your site no more than two links deep, and three links max.
  2. How and why to keep your URLs simple.
  3. The importance of managing your Session IDs and Dynamic URLs in ways that do not confuse the search engine spiders.
  4. The ONLY correct way to configure your Meta tags, and the dangerous Meta-tag myths that inexplicably continue to survive.
  5. How to control the way your Web page displays in the listings with Rich Snippets.
  6. All about Sitemaps; both onsite HTML sitemaps and XML Sitemaps that help get your Web pages more efficiently indexed.
  7. How a simple robots.txt file can be used to keep spiders from indexing your unimportant, or potentially confusing Web pages. This enables the spiders to focus on indexing only your important pages.
  8. The pros and cons of using Frames, JavaScript, and Flash as any major component of your Web site's architecture.

By now you are rockin'! You've come a long way, baby, and just for fun, the next lesson, THE FINAL Lesson, gives you a peek at the pinnacle of SEO from the perspective of the SEO Expert.

How to Kill Your SEO Rankings

Google has created a lot of rules and policies over the years to make sure sites play by their rules. Sites that ignore them are ritualistically slashed from the search results on an ever-increasing basis. These almost monthly massacres are going after more and more sites that are all doing the same things wrong!

This article is a new twist on the top things you can do to become one of the many nameless victims left after the next algorithm update (1…2…Google is coming for you…3…4…lock the door.)


Don't be fresh and original. It's like that horror movie plot you've seen 100 times, when the pretty young girl runs screaming up the stairs instead of out the front door when the man in the mask shows up. Instead of writing your own original articles, just copy articles from various other sites or 'spin' your content into gibberish keyword-rich phrases that no human can understand.

If a Web site has a really good article, why try to compete with your own original content? You can just copy and paste it to your site and BAM! Content added. While you're at it, lots of Web sites sell PLR (private label rights) article packs that have been resold and 'spun' hundreds of times. You'll never have to write again if you post them to your site and through the article directories without revising any of the content. Cutting corners like this means you're guaranteed to be publishing duplicate content.

When this happens, your site will be flagged by the duplicate content filters on Google and ranked lower than the sites that you've copied from.

Another way to ensure those rankings will plummet into the pit of despair is by writing original content but never updating it. You’re busy and have more important things to do. Leave the content up there, even if the events took place four years ago. Not only will your customers love reading about things that have already taken place but Google won’t have any new content to crawl. Your rankings won’t increase and the spiders will visit your site less and less.

** Don’t let your site turn into a gnarly dried up corpse. Take some clues from a couple of the many articles we’ve written regarding updating regularly with quality content here and here.

Keyword stuffing. Again, it's that moment when the killer in the latest installment of a movie series has been shot, stabbed, and buried, and the last scene is that hand coming up out of the ground as the audience does a collective eye roll (Friday the 13th Part 13 – seriously?!).

There's nothing that readers love more than reading an article that uses the same keyword 30 or 40 times. For example, if your keyword is video marketing, be sure your content reads something like this:

“Video marketing is one of the best video marketing strategies to get hits on your Web site. When you find video marketing companies, make sure their video marketing strategies are…”

If that doesn’t do the trick you can develop a great looking site and add SEO copy at the bottom that provides absolutely no value to your readers. To try and lose a few extra spots in the ranking, make sure that you really cram those few paragraphs with as many keywords as possible. The search engine algorithms have been adjusted to look for these tactics, so you can certainly trash your rankings by following these steps.

** That’s why keyword stuffing was our first spamming technique to avoid like the plague in this feature article The Top 10 ‘On-Page’ SEO Spamming Tricks to Avoid Like the Plague.

Continue to think that panda and penguin are two animals at the zoo. Here’s a quick hint if you’re confused – Panda and Penguin are two of the latest updates to Google’s algorithm to make sure that Web sites are naturally integrating SEO. If you ignore all of the latest news on SEO and don’t change anything about your site just wait, perhaps both Panda and Penguin will get their fangs into you.

** Or if you’d like a silver bullet in your arsenal check out this month’s update to get links to all the most current topics regarding Panda and Penguin.

Make your site all about the Ads. Ads make you money, so the more ads the better…right? Wrong. Sites with excessive ads displayed above the fold (using more than 3 AdSense units per page? You may want to lower that) are being dropped from the rankings quicker than their owners can wonder what happened to the ad income.

** Hit the right balance of ads and content with hints from this byte How much content do you have on your home page? Is it enough?

Use IP Delivery: aka, Cloaking. IP Delivery, most commonly referred to as cloaking, is one of the most controversial and complex SEO strategies. We've covered it often over the years. The root of the controversy stems from the fact that IP delivery enables a site to serve one page to human site visitors and a different page to search engine spiders. When used in this fashion, the site is said to be "cloaking" (i.e., hiding) the real page – the one that human site visitors see – from the search engines.

** Search engines do not like being served a page that is different from the page that people are seeing. That being said, if you don’t understand the technology – don’t use it because when done incorrectly it’s like taking a sledge hammer to your rankings. It’s a slow recovery and the Googlebot is a fickle host who wants what it wants.

Use Doorway pages. Aggressive SEOs have been known to create large numbers of low-quality pages with the intent to rank well for as many keywords as possible. Often these so-called doorway pages are automatically generated using software designed to optimize each page around a specific long-tail keyword. The goal is to funnel visitors from these doorway pages into a smaller number of highly converting sales pages.

** Of course, it's fine to use content management systems and other software to manage your site as long as your actual content is being generated by humans. Obviously, using software to automatically generate your content and send visitors from one site to another is another great way to get the Googlebot to send your site into oblivion. This will also ensure you violate Google's Terms of Service, so you even risk being removed from the search index entirely. Ouch.

Use graphics and scripts. Instead of using actual text for your Web site, you can decrease your rankings easily by presenting everything on your site through the use of graphics. Hire the most expensive graphic designer you can find and focus only on the visual aspects of your Web site. You will have the most attractive Web site on the Internet that isn’t getting any kind of traffic and probably runs more slowly than the zombies from Shaun of the Dead.

** If you’d like to give your site a fighting chance against the invasion check out these resources for ways to incorporate things that add visual interest like well-optimized images and infographics without bringing your load time to a screeching halt.

Don’t use keywords. Keywords are for people who actually want to be found on the Internet. When people enter keywords that have anything to do with what your site is about, make sure that you don’t include these throughout any of your Web content or in any of the on page SEO spaces such as your title and header tags. You’ll save a fortune in keyword research and since the entire Internet runs on keywords, you never have to worry about appearing in the top results of a search engine. Your site can be like Big Foot, always rumored about but never seen!

** Or you can bring your site to live amongst us and target your keywords with our three part Keyword Primer Series Part One, Part Two, and Part Three.

Build your site around a brand name that isn't yours. If you infringe upon the trademarks or copyrights of another company, you may not only face a lawsuit but may also get your Web site completely wiped from the indexes. All you have to do is create an entire site about an already established brand – or your competitor's Web site – and poof! Rankings gone.

** Use your own IP (intellectual property) and you should be in the clear. Check with your attorney if you are unsure on anything.


Participate in link schemes. It’s like that moment when you notice an overly large clown peering out at you from a storm drain. Logically you know clowns are creepy and that no matter what he’s promising it’s not worth climbing into the gutter for. Same goes for links – if it’s too good to be true, chances are it is.

The worst place to start your link building is with one of the various link schemes. That would include the good old-fashioned link farm, cross-domain linking, obvious paid links, and of course linking to 'questionable' sites (online gambling, porn, or certain foreign sites). Google polices link schemes heavily, so your site will drop considerably in rank within a very short period of time. Even better, submit your site to dozens of link farms all at once to absolutely ruin your rankings and the scary clown wins!

** Punch that creepy clown in the face and review ways to mitigate your linking problems here or start right from the beginning with our advice in this byte.

Hide your text. What your readers don’t see won’t hurt you. And the search engines love to read all of that hidden text. What you want to do is make sure you have all of your keywords in row after row and then make sure the text color matches the background color so it blends right in.

For instance, if the background is white (bgcolor="#FFFFFF"), then the text font is also set to white as such: font color="#FFFFFF" (of course, this can also be done with CSS instead of font tags). The end result is text that blends invisibly into the background color of the Web page. It effectively appears invisible to people viewing the page, but not to the search engine spiders that index all of the text found within the HTML source code. Do this for every page and you'll be down in the rankings in no time at all. Because sometimes those things you can't see can still hurt you.

** Just avoid doing this; there is not really anything more to say, other than that the search engines do not like being fooled.


Keep ignoring your site's load time. That's right – keep ignoring the countless times we've lectured you on your site's load speed. If you don't know your site's speed, then you're no better than the girl in ALL of those movies who can't outrun the killer, who never even breaks a sweat as he strolls along. Quit tripping over things that aren't even there and act like you want to live.

** Google has made it 100% clear that your site's load speed is a ranking factor. It should load within 5 seconds, and if it doesn't, you need to start working on what's going on under the hood. Pick yourself up off the ground and use tools.pingdom.com/ to be sure your site is fast enough to survive. Google is serious enough about it that they've even added site speed tools to their Analytics. If your site isn't up to snuff, check out this byte to troubleshoot so that your site loads before the lunatic catches up.

Ignore errors on your site. You'll get around to fixing them…eventually. In the meantime, they will disrupt your site's performance, piss off visitors, and cause your site to index poorly.

** No big deal that both your Webmaster Tools accounts and our own Super Spider Tool will quickly lay out the issues on your sites in a matter of seconds.

As you can see it’s easy enough to ignore our advice and make a mockery of the SEO system, waste millions of marketing dollars and thousands of hours of intense online SEO work. If you’re doing even one of these 13 things we’ve outlined and you’re still ranking well then consider yourself one of the lucky survivors …for now.

Advanced Link Building Techniques

The #1 *Secret* ART Used By Top SEOs to Gather Links In Bunches!

Linkbaiting – posting content that attracts attention and links – is not a secret strategy. However, the ART of linkbaiting sure seems to be. In any case, it’s an art worth mastering because:

  1. Link Bait can be used as an extremely powerful form of viral marketing.
  2. Getting Links is the most expedient path to the top of the search results.
  3. Best of all, it’s a strategy Google tacitly approves of.

In today’s world of SEO, you just won’t find a better one-two-three Kung Fu punch! Nor is there any faster way to gain an (unfair?) advantage over your competition! But, before we get to the “ART” part, let’s spread out the canvas and look at the broad strokes.

6 Key Characteristics of an ARTful LinkBait Strategist

ARTful Linkbait Strategists do some or all of the following…

  1. They distill multiple sources of information into Condensed, Cumulative Reports – thus creating Valuable Resources. In some cases they develop the ability to generate these resources internally. In other cases they outsource the tasks. Either way, they post valuable lists, reports, or tools that are 100% informational, highly useful, streamlined, and easy to link to – all while presenting the material from their own perspective. Some of the best linkbait resources are simply lists and how-to guides. In our report: How to Use del.icio.us to Quickly Build Amazing Content-Rich Articles that Attract Links Like a Magnet! we discussed the details of creating such resources. This report is a step-by-step guide to dominating the search results and we highly recommend you read and act on it, if you haven't already.

    It isn’t necessary for your linkbait to be “amazing” to get links, but it must look comprehensive and be useful. Make a list of topics your customers are interested in and start cranking out articles that are loosely focused within your niche, with titles like…

    • 100 Tools and Resources for…
    • 50 Tips to…
    • 100+ Resources to Help You…
    • 75 Online Resources for…
    • 30+ Simple Things You Can Do to…
    • 25 Ways to…

    Be sure to mine sites like Delicious and Digg for ideas on article topics that people will link to. If your target audience doesn’t participate on Digg or other newer social networks, consider mining forums. Look at the posts that have the highest number of page views and replies, and note how frequently questions get asked about a certain topic. That’s a clue into the minds of the forum participants and an opportunity to solve a problem with a link bait answer. Ideally, we suggest you crank out one of these articles on a weekly basis.

    At its simplest, an article can be “100 Resources for…” followed by a list of 100 sites, tools, and articles you’ve found online related to that topic, with a short summary of each. You’ll be pleasantly surprised to learn how many people tend to link to such articles and resource lists.

  2. They systematically monitor the News. They watch for relevant stories that they can chime in on. By lending their expertise to the topic in a timely manner, they present a voice on the subject that tends to gather links. As an example, let’s suppose that we sell remote controlled model airplanes. It’s pretty simple to monitor Google News for key phrases like remote controlled aircraft and, on a weekly basis, we can review news stories within our niche and then write about what is topically hot. Here’s one story that caught my eye…
    Stanford’s “autonomous” helicopters teach themselves to fly
    Stanford Report – The aircraft, brightly painted Stanford red, is an off-the-shelf radio control helicopter, with instrumentation added by the researchers…

    Any one of a number of slants could be employed while slicing and dicing this story. For instance, how will this affect the sport of remote controlled aircraft for the hobbyist if, in fact, the flying can be done by a computer that learns? …you get the idea.

  3. They’re constantly looking for opportunities to present a Contrary Perspective on a controversial topic. They argue a position that contradicts prevailing wisdom. One controversial perspective could be that the price of gas hasn’t actually risen much at all. Instead, it’s the dollar that’s fallen, as evidenced by price increases across a broad range of commodities, which might include homes, gold, imported foods, and so forth. The fact that the Federal Reserve has kept interest rates so low for so long, and has been paying the government’s bills with printed money, could be used to support the premise.

    If you are a commodities broker or financial services specialist, this could be a contrary perspective for you to write about within this currently hot and controversial topic.

  4. They intelligently harness the power of Fear. This is my least favorite strategy (blush); however, there are certain industries that lend themselves to this style of social promotion. Topics like personal privacy, ID theft, radar detectors, child care, insurance and so forth have successfully harnessed the ‘power of fear’ to promote their goods and services since their beginnings. Once again, the News Feeds are an excellent source for stories that can be re-reported in the light of an “expert’s” opinion (like yours).
  5. They employ the universal language of Humor. Ok, if the iPhone in the blender didn’t seem funny to you, then maybe you could take the Jeff Foxworthy approach:
    • If you cringe whenever you see a site listed as ‘Untitled‘ in the search results above your listing, then you might be an SEO.
    • If your response to your spouse’s inquiry regarding your son’s whereabouts is: ‘he’s 404‘ …then you might be an SEO.
    • If you’ve ever spoken in terms of having to ‘301 redirect‘ your focus from one task to another, then you might be an SEO.
  6. And, in every case…They develop their social networks – This one was too big for a single bullet point, so read on…
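The news-monitoring habit described in point 2 is easy to partially automate. Below is a minimal sketch, assuming you pull entries from a feed such as Google News RSS (the URL pattern shown is illustrative, and the fetch is stubbed with sample data so the keyword-matching logic stands on its own):

```python
# Hypothetical Google News RSS search pattern (verify before relying on it).
GOOGLE_NEWS_RSS = "https://news.google.com/rss/search?q={query}"

def matching_stories(entries, phrases):
    """Return feed entries whose title or summary mentions any watched phrase."""
    phrases = [p.lower() for p in phrases]
    hits = []
    for entry in entries:
        text = (entry["title"] + " " + entry.get("summary", "")).lower()
        if any(p in text for p in phrases):
            hits.append(entry)
    return hits

# Stubbed feed data standing in for a real fetch.
sample_entries = [
    {"title": "Stanford's autonomous helicopters teach themselves to fly",
     "summary": "An off-the-shelf radio control helicopter with instrumentation "
                "added by the researchers..."},
    {"title": "Local bakery wins award", "summary": "Best sourdough in town."},
]

watched = ["remote controlled aircraft", "radio control helicopter"]
for story in matching_stories(sample_entries, watched):
    print(story["title"])
```

Run something like this weekly against your niche phrases and you have a steady queue of stories to editorialize on.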

Baiting the Linkbait Hook

It’s not enough just to publish your linkbait – you’ve got to get it seen by the people who are most likely to link to it. That’s where social media comes in. StumbleUpon is particularly effective for generating significant traffic to your articles.

Other social media sites such as Delicious, Mixx, Propeller and Digg work well too.

In any case, it isn’t realistic to expect your content to leap to the top of the social media sites UNLESS you become, and remain, actively involved. That means regularly submitting and voting on stories and building a network of social media friends.

We go into extensive detail on building a friends list in our StumbleUpon report. We highly recommend you study that report closely – while bearing in mind that all major social media sites have their own version of the friends system. Most importantly…

Understanding how “friends” networks operate
is critical to your social media success.

Another characteristic of ARTful linkbaiters is their disciplined approach to staying involved. At least a few hours each week are dedicated to maintaining their social media profiles whether done personally or delegated to company personnel.

We know of at least one instance where an ARTful linkbaiter invested the time to identify the active users within her niche. Then, once she had established a relationship, she contacted one of those power users directly and hired him to submit content on behalf of her company.

If you go this route, be sure to frame your offer delicately because some people are easily offended. It’s entirely possible they may report you to the social media site. In such a case there’s a chance your site could get blocked from that social media site. But, generally, if you spend some time on the site interacting with, and getting to know the people, you can get a pretty good feel for which users might be open to this kind of arrangement.

By the way, we’ve found that advertising for social media experts via traditional outsourcing sites like Elance or RentaCoder has always been a bust. So your best bet is to hire/train someone in-house with your second choice being the ‘hire an outside social networker‘ route. You may also find talented social media marketers via Jobs in Social Media, a site which we haven’t yet used, but looks potentially useful.

Here are some great example sites doing all the right things to become link magnets…

  • Techmeme’s Leaderboard – This lists the sources most frequently posted to Techmeme. Seems simple enough, but this is a link magnet! Since Techmeme added this resource, all the sites that appear on the list have linked to the page at one time or another. Some of the sites have multiple high PageRank links pointing here. Techmeme provided a simple way for sites referenced on this list to show instant credibility.
  • Vimeo – Similar to YouTube, this is a simple online video sharing site. They cleverly set it up so that, once the player is embedded, Vimeo instantly gets 3 links back to their site. In return, the user gets a great player that is easy to use and reliable. This widget/tool strategy is a solid one used by countless highly successful sites.
  • OK Cupid’s Blog – This is a highly creative blog that accompanies an online dating site. Its content is unique and compelling, on one of the all-time hottest topics – understanding the opposite sex! The blog manages to be both “linkbait” and a “link magnet,” standing out from the mob of online dating sites.

The LinkBait Magic at work…

You should bear in mind that, while your linkbait pages will see an immediate boost, improved rankings for your main landing and product pages won’t happen right away. But if you consistently apply the strategies outlined in this report, your home page and product pages will also rise in the rankings. Here’s how…

  • First, any link you get to any page on your site will help all pages on your site rank better. That’s because these additional links increase the overall link authority of your site.
  • Second, by linking these linkbait pages to your main product pages, the link juice flows from the sites linking to you, through the linkbait page, and to your product pages.
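The two effects above can be illustrated with a toy PageRank calculation. This is a deliberately simplified model, not Google’s actual algorithm, and the page names and link graph are hypothetical. Notice how a single external link pointing at the linkbait page lifts the product page it links to as well:

```python
# Toy PageRank iteration (damping factor 0.85) illustrating "link juice" flow.
def pagerank(links, iterations=50, d=0.85):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page starts each round with the base (1 - d) share...
        new = {p: (1 - d) / len(pages) for p in pages}
        # ...then each page passes d * its rank, split among its outlinks.
        for p, outs in links.items():
            if outs:
                share = d * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

# An external site links to our linkbait page, which links on to a product page.
graph = {
    "external": ["linkbait"],
    "linkbait": ["product"],
    "product": [],
}
ranks = pagerank(graph)
```

In this tiny graph the product page ends up with the highest rank of the three, even though nothing links to it directly from outside the site: the authority flows through the linkbait page.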

It’s realistic to anticipate that each piece of linkbait content you create (if well done and properly promoted) will average somewhere between 10 and 100 natural links from other blogs and Web sites. You’ll also get a bundle of links from the social media sites where you promoted your content, since every user who voted will have a link to your content in their profile.

And, even though links from social media sites themselves tend to be nofollowed, they will still provide a small rankings boost – especially in Yahoo and Bing, since neither of those engines discounts nofollow links as strictly as Google does.

Consistency and Persistence within your Social Network is Essential

By faithfully creating some form of interesting link-worthy content on your site every week or so and promoting it to a large network of social media friends, you’ll be (from the search engine point of view) continually building natural, unrequested links to your site’s content pages. This is far-and-away the single best strategy for ranking at the top of the search results.

You’ll often see increased search engine traffic right from the very first article you create and promote. And, even though Google says they don’t use the traffic data from the Google Toolbar to determine rankings, nearly every time we’ve created an article and gotten a surge of traffic from social media, that article ranked well and pulled in Google search traffic almost immediately. (go figure!)

So what’s the big secret?

Well, the sites that started LinkBaiting a year ago have significantly higher rankings today. And, if that’s you, then you know the secret. As with any strategy, you have to actually do it. And, since the hardest part is getting started, then I suggest you click yourself over to Google News right now, enter your keywords, grab a story and begin writing your ‘Editorial Opinion’ on that story today.

Tomorrow you can post it on your site and then begin building your social network of friends (if you haven’t already done so). Then shake, repeat, and blend the different strategies as you develop your ART – this time next year, you’ll have experienced the secret – and be the one who’s smiling BIGtime.



Optimizing Press Releases

5 Top Sites to Submit Your Optimized Press Release

By now you’ve learned why optimizing a press release for the search engines is important, how to optimize the release for both SEO and SMO maximum visibility and what the benefits of this optimization can mean for your business. Now comes the last step—the actual distribution of the press release.

There are literally dozens of both paid and free sites that will distribute a press release online these days. We’re going to look at the top 5, which control over 95% of the 2000+ releases distributed daily.

  1. Business Wire — A Berkshire Hathaway company in business since 1961, Business Wire is the largest newswire dissemination service in the United States. Its ability to target your release is quite impressive, and it has the latest in SMO bells and whistles (through a partnership with PR Web). Targeting can be by industry, geographic locale, or key demographic. Membership is required (at $200 per year), and releases are priced higher than most other services, at between $500 and $600 per release. Often this higher price also gets you exposure to the bigger media outlets.
  2. Market Wire — A close second in market share to Business Wire, Market Wire has over 3,000 corporate clients and distributes releases to over 60,000 journalists daily. Market Wire is also the exclusive provider of press releases to the NASDAQ and NASDAQ-listed companies. If you’re in the finance arena, Market Wire is probably the best choice for you. Market Wire has traditional SEO capabilities and has now improved its SMO features considerably. Membership is required (at $200 per year), and releases start at around $500 each.
  3. GlobeNewswire (formerly Prime Newswire) — Known for its large distribution network and its focus on finance and stock-related news and companies, GlobeNewswire has had a lot of industry firsts, including a free online clipping service, an online customer service center, and delivery of daily news alerts via HTML email. No membership is required. However, a fully SEO/SMO-optimized wire release is expensive here, requiring a $250 upgrade on top of their normal $550 targeted release for a total of $800.
  4. PR Newswire — This service averages over 1.9 million views on its Web sites each month. PR Newswire distributes between 700 and 800 press releases a day, and among its many SEO-friendly tools is a nice account-level keyword density analyzer that automatically flags your release with suggested optimization changes. PR Newswire requires a membership fee of $195, and $400 per release, which includes 30-day tracking, a clipping service, and a full suite of SEO/SMO and multimedia customization features.
  5. PR Web — This site is definitely the most user-friendly of the private press release distribution sites, used by over 40,000 organizations internationally, and our personal favorite. As this sample release from their site shows, they have integrated practically every SEO/SMO optimization widget possible within their releases. PR Web offers four categories of releases, ranging from their standard $80 basic visibility package all the way up to their premium visibility package, which runs $360 per release.
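PR Newswire’s keyword density analyzer is proprietary, but the underlying idea is simple to sketch. The function below is a hypothetical stand-in you could run on a press release draft before submitting it; the sample text is made up, and any target density range you apply is your own rule of thumb, not an official threshold:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the words in `text` accounted for by `keyword` occurrences.
    Handles multi-word keywords by matching consecutive word sequences."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    n = len(kw_words)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    return 100.0 * hits * n / len(words)

# Hypothetical press release draft.
release = ("Acme Hot Tubs announces new hot tubs for 2024. "
           "The hot tubs ship nationwide.")
density = keyword_density(release, "hot tubs")
```

A number this high would be a signal to vary your wording; a density check like this is just a sanity filter, not a substitute for readable copy.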

So which Paid Press Release Service Should You Use?

We recommend PR Web as your first choice in SEO and SMO press release distribution.

The recent declaration from Google aside, we still feel there is substantial value in the continued use of press releases. Although their link building power has clearly been diminished, their ability to generate NEW press mentions and social media signals is still invaluable. Finally, remember to optimize your press releases for both the search engines and the social media networks, and you will more effectively reach both your target audience and those news professionals looking for hot, viral content.

Lesson 5

Competitive Analysis

We all know that, before we optimize our pages for the search engines, we need to find out exactly how sophisticated the competition is, or isn’t. Our goal is to determine what the top ranking pages are doing within our keyword categories so we can look for weaknesses while quantifying their strengths. Then we duplicate their efforts, and go one better, to achieve our success in the rankings. One problem is that, ordinarily, such competitive analysis is time consuming. Another problem is that the time-consuming analysis needs to be repeated again and again for each site and each competitive keyword. Then, when compiling each metric one by one, it’s the kind of task that can make you want to tear your hair out (is that why so many SEOs are bald? …just wondering).

The Site Strength Indicator (SSI) tool compiles 9 critical competitive data points and delivers an overall site strength score.

In other words, in one click, and all at once! …we learn:

  1. Domain Registration Date: This refers to the age of a site. The older the site, the more of an advantage in the rankings it will have. Google likes sites that are aged. If the site is a year or two old, it has a slight ranking advantage over a newer site within this data-metric. Older sites tend to have developed more links over time and are inherently more trustworthy than a brand new domain name. If you click the date displayed, the SSI tool will display the WHOIS data for the domain name.
  2. Google PageRank: This refers to the Google Toolbar PageRank that rates the importance of a Web page in the eyes of Google on a scale that runs from PR=0 to PR=10. This metric is important because it tells you how valuable a link coming from a site may be. It also gives you a good idea of Google’s opinion of a site. Note: The PageRank displayed is for the URL that you entered; it’s not limited to the home page.
  3. Google Cache Date: This metric tells you exactly when Google last crawled a site. A recent date is an indication that Google thinks the site is important. On the other hand, if a site’s cache date is a month or more old, then Google probably thinks very little of the site’s overall importance. Very important sites are crawled frequently; even with moderately popular sites, some URLs are crawled daily. Unimportant sites are crawled sporadically, perhaps every couple of weeks or once a month.
  4. External Backlinks: Exactly what it appears to say. Although you shouldn’t expect this number to be perfectly exact, it will be relatively accurate. Basically, if one of the sites you are analyzing shows 19,576 links and the other shows 610, there is a significant difference in inbound links between the two sites. If you click on the Backlink result, you’ll expose a link to Majestic Site Explorer’s Backlink summary report, where you can see the URL’s total links as well as the links discovered in the last 30 days.
  5. External .edu Backlinks: Since .edu domains are restricted to accredited institutions of higher learning (typically U.S. colleges and universities), Google often trusts these links, considering them less likely to be commercially oriented. One way to boost a site’s ranking is to collect links from .edu domains. The more .edu links, the better.
  6. External .gov Backlinks: Likewise, .gov domains are restricted to U.S. government entities. Google also tends to trust these links. One way to boost a site’s ranking is to collect links from .gov domains. The more .gov links the better.
  7. Referring Domains: This value reflects the number of different domains linking to the site. The higher the number, the better. Unique links from many different referring domains count for more than multiple links from the same domains. A site that has 1000 inbound links from 500 referring domains is typically going to be far stronger than a site that has 1000 links from 10 referring domains.
  8. Referring .edu Domains: The number of unique .edu domains linking to the URL entered. This value works the same way as the Referring Domains Entry, except this is only the value for .edu domains.
  9. Referring .gov Domains: The number of unique .gov domains linking to the URL entered. This value works the same way as the Referring Domains entry, except it counts only .gov domains.

As you test the strength of sites using the Site Strength Indicator, you will also notice a Total Score is assigned to the site being analyzed.

Here are the scores explained:

  • 0 – 19: This site has a very limited search engine presence and is getting far less search traffic than it could if it increased its optimization. Building quality links is often the quickest way to improve search engine rankings.
  • 20 – 39: This site has made some progress in achieving rankings, but is still far below its potential. Targeting long-tail keywords is your best bet for traffic at this level. Link development should still be continued as well.
  • 40 – 59: While there is still a lot of work to do, this site is on the verge of breaking into the arena of major search engine presence. Creating some viral buzz could be enough to push it into the big leagues. A site at this level should have enough traffic to take advantage of UGC (User Generated Content) such as comments, reviews, or forums to further enhance its rankings.
  • 60 – 79: This site is a powerful presence on the Internet. In a small to medium niche it’s likely dominating, but it can still be beaten with the right optimization.
  • 80 – 100: This site is among the most powerful and authoritative sites on the Internet. If this is your site, then congratulations! If it’s your competitor’s site, they’re going to be very tough to beat.

The Total Score ratings, explained above, give you a pretty good idea of how easy or hard it will be to supplant a site that is ranking on the first page of the search results. Let’s suppose, for example, we want to place the site www.homeschoolviews.com somewhere on the first page of Google’s search results for the keyword homeschool. Let’s take a look at the sites ranking #1 through #10. We want to see if any of them are beatable in the rankings based on their lack of overall strength.
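The Site Strength Indicator’s exact scoring formula isn’t published; the sketch below simply encodes the five score bands described above, which can be handy when labeling a batch of competitor scores programmatically:

```python
def strength_band(score):
    """Map a 0-100 Total Score to the band summaries described in the text."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score <= 19:
        return "very limited presence - build quality links first"
    if score <= 39:
        return "some progress - target long-tail keywords, keep building links"
    if score <= 59:
        return "on the verge - viral buzz and UGC could push it into the big leagues"
    if score <= 79:
        return "powerful presence - still beatable with the right optimization"
    return "among the most authoritative sites on the Internet"

# Hypothetical competitor scores for the first page of results.
for s in (12, 45, 82):
    print(s, strength_band(s))
```

Scanning the top ten this way makes it obvious at a glance which ranked sites are soft targets and which are entrenched.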

Tips, Tricks, and Traps of the SEO Trade

Discussions regarding acceptable and unacceptable SEO practices often devolve into a polarized conversation that splits into two camps: White hat vs. Black hat SEO. However, we think it’s a silly conversation. That’s because, unless you are doing something illegal (which we do NOT recommend!), we think the more productive discussion should center around:

  1. What works vs. what doesn’t work
  2. Safe vs. dangerous SEO tactics

As an SEO, you need to know which SEO tactics are working. This is especially true if your competition is using them. Never mind that somebody out there is crying ‘black hat’ …that, alone, should be irrelevant to you. If it isn’t illegal, you can expect that your competition will use it against you to gain a competitive advantage. Bank on it! The REAL question should be: Is the SEO strategy safe? In other words, will your site be penalized or banned if you get caught applying a specific SEO strategy to your Web site’s pages? If the answer is YES, then we advise you to refrain from implementing that ‘unsafe’ SEO strategy.

Example: Let’s suppose you learn, while conducting competitive analysis, that your competition is hiding keywords under their images to artificially boost relevancy in the search results. They are also hiding links that direct spiders toward topically unrelated pages for indexing and PageRank purposes. And, as far as you can tell, it appears to be working, given that they are ranked on the first page, in the top ten of the search results. In addition, all of the pages their hidden links point toward are indexed and ranking well too. Now, because you have carefully studied our Advanced SEO Tutorial, The Top 10 ‘On-Page’ SEO Spamming Tricks to Avoid Like the Plague!, you know for sure that Google does NOT approve of such a tactic.
It clearly conflicts with Google’s Webmaster Guidelines, which specifically state that hidden text or hidden links should be avoided—including text behind an image. However, one must consider that there isn’t anything illegal about hiding text or links behind an image. So, if your competition is using this tactic to compete, then you at least need to know about it. Now you have some options to consider.

  1. Ignore it and take the safer SEO tactical road hoping that Google will eventually discover the “cheat” and penalize your competitor’s site accordingly.
  2. Report your competitor’s guideline infractions to Google and hope that Google evens the playing field by penalizing your competitor. (Don’t hold your breath).
  3. Match your competition tactic by tactic (regardless of Google’s acceptable use guidelines) knowing full well you’re engaging in an unsafe (although not illegal) SEO tactic.

So, what’s our recommendation? We prefer option #1, although we realize you may choose option #2 for reasons that are entirely understandable. We do not recommend option #3, for the simple reason that there’s no guarantee whatsoever that Google will refrain from penalizing your site while leaving your competitor’s site unscathed. Yes, we know—unfair! …but it happens, and there is no easy way to predict what Google will actually do in such circumstances.

You also need to know there is something called ‘white-listing’ a site. In essence, that’s when a site is considered so trustworthy by Google that Google never assesses any ranking penalty, regardless of whatever SEO tactics such white-listed sites may be using. Typically, such sites belong to brand name companies like Microsoft, Nike, Amazon, and so forth. So, if you’re competing against one of these so-called brand names, be aware that they may be operating under a less strict set of rules. Yes, we know—again, not fair. But it is the way it is. Make your adjustments and move forward.

Here’s the point. The profession of SEO is full of tricks and traps—and you absolutely must KNOW what each of them is in order to compete on the big stage. That’s why we have devoted an entire Advanced SEO Tutorial to the top 10 ‘on page’ SEO tricks that you absolutely, positively need to know about if you are to successfully analyze what your competition is doing to achieve top rankings. When you are ready, be sure to carefully study: The Top 10 ‘On-Page’ SEO Spamming Tricks to Avoid Like the Plague!

What’s In a Name?

Choosing the Right Domain Name

As you’ve no doubt deduced by now, choosing the right domain name can be critically important! Not only does Google currently give preference to keyword-rich domains, so do people when they’re choosing which link in the search results to click.

Another critical truth to factor into the equation is that you should AVOID changing the domain name of an established site. Once a site (domain name) is established and ranked within the search engines, it is NOT advisable to switch the domain name. That’s because switching from an established domain to a new domain can be a tricky process at best. At worst, you will lose all of the time and effort you’ve invested in promoting the site. In essence, you could find that you are literally starting over. Not good. This can often put your site at a ranking disadvantage for upwards of six months to a year. What’s more, Google places a high value on old sites—the longer a site has been around, the better. In fact, two of the most important ranking factors in Google are:

  1. The quality and age of your incoming links (i.e. the importance of the linking page combined with the length of time the link has been pointing at your site).
  2. The age of your site (the length of time your domain name has been online).

Compared to having an old and established site, your domain name is often way down the importance list of ranking factors. So, given a choice between a great domain name and a site that’s been aged, we’d advise the latter—the older site over the choice domain name. Although it is possible to move sites efficiently and properly (something we’ll discuss in the next lesson), our general advice is to *not* change a domain name unless you absolutely have to for legal reasons (such as a domain name containing someone else’s trademark), or unless the new domain is so amazing that you can’t pass it up. Otherwise, we advise that you stick with an established domain name if you already have it up and running with incoming links and indexed content.

But if you are starting fresh, then you should make every effort to get the best, most keyword-rich domain name possible. The ideal domain name exactly matches the primary keyword or short (2 or 3-word) keyword phrase you’re targeting. Examples of such domain names might include:

  1. http://www.HotTubs.com
  2. http://www.PetSupplies.com
  3. http://www.SalesTraining.com

However, in most cases, these keyword-exact-matching domain names are no longer available on the primary market. They are already being used (or held) by other businesses or by professional domainers: people whose business it is to buy and sell domains for a profit on the secondary market. (Note: domain names are not case-sensitive. We’ve capitalized letters in the above domains to make the keywords within them easier to spot.) This means that really good domain names are usually not available for cheap from a domain registrar but, instead, must be purchased directly from someone who already owns the domain. Oftentimes you can contact the owner of a domain directly via the domain’s WhoIs info or by using contact info on the site (if it exists). To check the WhoIs info (the background information on who owns a site), use a site like domaintools.com: http://whois.domaintools.com/. There you can simply enter the domain name you would like to check. You can also buy already-registered domains on domain auction sites. Some of the best include:

  1. http://buydomains.com/
  2. http://www.snapnames.com/
  3. http://www.enom.com/
  4. https://www.tdnam.com/ (GoDaddy.com’s Domain Name Aftermarket)

Bear in mind, however, that these resale domains can often cost several thousand dollars. And, while having a domain name that exactly matches your most sought-after keyword can definitely help boost rankings, it’s not absolutely essential to building a high ranking site. Instead, you might choose a short, catchy name that will be easy to market, both online and off. A few well known examples of domain names that have been turned into globally recognizable brands include Amazon, eBay, Google, MySpace, Facebook, Twitter and del.icio.us (now delicious.com). Besides being easier to build a brand around, the shorter name also provides the advantage of being easier to market through other media, such as radio, print, or television. Sometimes you’ll get lucky and find a domain that uses your keywords and is short and catchy. If so, grab it! You’ll have the best of both worlds. The drawback to the short, brandable domain name is that, since it lacks keywords, it supplies no search engine ranking benefit whatsoever. From an SEO perspective, it does nothing for your site’s rankings. Your other option is to buy a domain that incorporates your keywords but doesn’t exactly match the keyword phrase you’re targeting.

Examples might include names like:

  1. http://www.HotTubSource.com
  2. http://www.PetSuppliesCenter.com
  3. http://www.SalesTrainingExperts.com

While you may not get the often significant rankings boost you’d get if your domain exactly matched your most sought-after keyword, using a domain name that incorporates your keywords in some variation will still contribute to high rankings, for the following reasons:

Whenever another site links to you, you want them to use your keywords in the visible text of their link (the anchor text). Having your keywords in the anchor text of links pointing to your site is one of the most important aspects of high-ranking pages. Most of the time, when other sites link to you, they will use your domain name as the anchor text. If your domain name already includes your best keywords, it’s natural for them to use your best keywords in the anchor text. It also saves you the trouble of having to track down those links and ask the site to change the anchor text, a task that is awkward at best.

In regards to visitor click-throughs, having your keywords in your domain name can make a big difference. When browsing through the lists of links on search engine results pages (SERPs), our studies have shown that people are far more likely to click links that contain the keyword they are searching for in the domain name. Even if a competitor provides otherwise similar-to-equal enticements, if their link lacks the specific keywords, it is more likely to be passed over. This selective click-through behavior especially pertains to web users on slow Internet connections. Because of the sometimes excruciating delay between clicking and page loading, users on slow connections tend to study each URL more closely. They look for keywords and evaluate beforehand whether or not a link appears to be worth the time investment it takes for the page to load. As long as your keywords are somewhere in your domain name, search engines will generally assume your domain is relevant for those keywords.
The exception to the rule: excessively long and/or multiple-hyphenated, keyword-stuffed domain names. For example, a plastic surgeon specializing in breast enhancement in Beverly Hills might consider registering and using this as one of her domain names: www.cosmetic-surgery-breast-enhancement-beverly-hills.com

At one time, such a domain had a boosting effect on rankings. But that effect has since been radically diminished with respect to domain names with excessive dashes and characters. As you can see, the above domain name has 49 characters, including five hyphens (dashes). Too many characters and too many dashes make this domain name look unnatural, or spammy, from the SEPOV (search engine point of view). Therefore, we recommend no more than one hyphen at most, preferably none, and creating the shortest domain name possible based on what’s available and what makes sense to your potential site visitors in relation to your business. In a perfect scenario, your best domain name is your primary keyword or keyword combination. Whenever that isn’t possible, at least try to get your most important keyword inserted somewhere into your domain name.

The other drawback to domains with hyphens is that they’re harder for customers to remember. They can also be more difficult to convey by word of mouth, since people have to say “cheap dash art dash supplies dot com” instead of just “cheap art supplies dot com.” When we purchase domain names, we often buy both versions, with and without dashes separating the keywords. By purchasing both versions, you keep these alternative domains out of the hands of your competitors. It’s also a good idea to register common misspellings of your domain name, especially these days when annual domain fees are so affordable; about $10 a year at GoDaddy.com.
Misspelled and other alternate versions of your domain name can then be redirected to your primary domain name so that anybody who inadvertently goes to the “wrong” URL will still end up in the right place (don’t worry, we’ll cover URL redirection in the next lesson). However, you should avoid registering domains with double dashes separating keywords, like: www.art--supplies.com Not only does this make it even harder for your customers to remember your domain name, but search engines may not even index sites that use double dashes in the domain name. The reason is that this tactic has been abused in the past. Overall, buying the .com, .org, or .net version of a domain name that exactly matches your primary keyword phrase is one of the most effective strategies for building a potentially high-ranking site from the ground up.

Domain Names that Please Customers and Search Engines

When designing your site, striking the right balance between a site that search engines like and a site that your customers like can be tricky. Unfortunately, choosing a domain name falls directly into that same tricky category. Here are a couple of guidelines to help you choose:

If your intention is to build a long-term online business that will be marketed through a variety of online and offline media, then go with a short, catchy, trademarkable name that you can easily build a brand around. OR, if your intention is mainly to get ranked with the search engines, then go for the keyword-rich domain name. Regardless of your exact approach, getting your keywords into your domain name will always boost your search engine ranking, particularly when your domain name exactly matches the keyword or keyword phrase you’re targeting.

Even if your company name is your domain name, such as PlanetOcean.com, it’s still a good idea to register other relevant keyword-rich domain names pertaining to your goods and services whenever you find them available. For instance, we also own… This gives us the option of setting up specialized sites that place our services in the paths of people who are using the search engines to locate our specialty product or service. It’s like buying online ‘real estate’ for the future. There’s always a good chance the time will come when you’ll want to develop these vacant ‘properties.’ This strategy also keeps these choice domain names out of the hands of future competitors, making it harder for them to enter your market.

Choosing a Domain Extension: .com? .net? .org? .biz? .info?

When researching possible domain names, you should see what’s available in the .org or .net categories. These are given equal billing by search engines, and it may be easier to find the domain you want if you target those extensions. You are likely to find they are generally more available to register, or can be purchased from the owner for a lower price, than the .com version. If you’re outside the U.S., or targeting a market outside the U.S., country-specific domains such as .co.uk or .co.in are also a good choice. Such domains will typically give you an advantage in ranking for queries performed by people in those countries.

However, be aware that the advantage .com holds over all other domain name suffixes is one of adapting to your visitors’ habits. People tend to assume that a site’s URL will end in .com, regardless of the fact that search engines typically don’t care one way or the other. In other words, if you don’t control the .com version of your chosen domain name, you could be losing so-called type-in traffic (i.e., traffic that ensues when a searcher simply types the domain name directly into a browser’s address bar). If you can’t secure your chosen domain as a .com, then you may want to choose a different domain; whoever controls that .com is going to end up getting some of your traffic. However, if your goal is primarily to rank in the engines, and you don’t mind that some traffic might bleed off when people inadvertently enter .com by mistake, then .org and .net domains are just as good, ranking-wise, and they’re generally easier to obtain.

If your business is not based in the US, then it also becomes important to acquire a country-specific domain name extension. For instance, if your business is based in the UK and you want your site to be found within Google UK’s “pages from the UK” search feature, then you must have either a .co.uk domain extension, or else your site must be hosted within the UK.
However, even if you do have the country-specific extension, it’s still important that you also control the .com to avoid losing type-in traffic. You can then forward the .com version of your domain to your country-specific domain.

Occasionally you’ll see domains with the .info extension. Our advice is to avoid this extension because it’s been heavily abused by search engine spammers. Such .info domains can be registered for as little as $0.99, making them a prime target for spammers looking for cheap, disposable domains. You can still build a high-ranking site on a .info domain, and some people do, but our experience is that this domain extension has been tainted from the SEPOV and is best avoided.

An additional note about domain extensions: there is evidence that Google gives a ranking advantage to domain names with .edu or .gov extensions. However, these extensions are only available to recognized educational institutions or US government entities, respectively, and are off-limits to most of us (though they are extremely valuable sources for incoming links).


In this lesson you learned:

  1. The importance of having the right domain name.
  2. How to choose the right domain name.
  3. The importance of choosing domain names that appeal to both customers and search engines.
  4. How search engines view the different domain name extensions with respect to rankings.

Congratulations, you are close to completing the book portion of this SEO course. Only two more lessons to go! …now you’re ready to learn how Web Site Architecture affects your ability to rank well in the search engines.

Top Web Directories

Best General Directories

Directory Add Cost Comments
DMOZ Add Free Could take up to a year or more for your site to be listed.
Yahoo Directory Add $299/year for commercial sites, $600/year for adult sites Your site will also be listed in several international Yahoo directories.
Best of the Web Add $149.95/year or $299.95 one-time. As name indicates, was once an award site.
HotVsNot.com Add 6 months = $40, 12 months = $60, 3 years = $150, Permanent = $200, or free with a reciprocal banner ad. A top-tier directory with hundreds of thousands of backlinks, an aged domain, high traffic and very recent Google cache dates on interior category pages. It also has a growing user community and DOES NOT accept every site.
JoeAnt.com Add $39.99 one-time fee, free for editors Takes about a week to be listed either way.
GoGuides Add $69.95 per URL. Sites are instantly included. Money back guarantee if your submission is declined.

General Directories

Directory Add Cost Comments
Universal Business Listing Add $75 per year Excellent local search directory.
Rubber Stamped Add $39.95 one-time review fee, refundable if rejected. Nice, growing directory.
Skaffe Add One-time $49.99 fee, or free but only for submissions done over weekends. All submissions reviewed and not accepted will be assessed a $10 processing fee. Becoming an editor is not difficult, and is an easy way to get listed for free.
Massive Links Add Directory listing is $39.99 one-time with 4 links; full refund if you’re not accepted. Premium listing is $79.99 with 4 links and a bigger display above basic listings for one year; renew to keep the premium listing or it defaults to a regular directory listing. Non-profits are still free. Permits you to write your own keyword-rich link anchor text.
Starting Point Add $99 annual fee. Quality directory, many high-PageRank pages. Not widely known, so may be a link your competitors won’t have yet.
Gimpsy Add $49 to be reviewed within 72 hours, $20 refund if site is turned down. Free submit available, but can take months to be listed. Focused primarily on commercial sites. Can be difficult to find the proper category to submit to.
Web Beacon Add $39.99 one-time review fee, free for editors. Originally created from the GoGuides database, expanded since.
Site Snoop Add $15 one-time fee. Then you can add up to 4 additional URLs in your listing for an additional $5 per URL. Currently does not accept mortgage/loan sites; there tends to be a lot of spam in that industry, so this helps improve the overall quality of the directory.

Top Social Sites

Social Site Join Type Comments
Twitter Join Social Communication Share and discover what’s happening right now, anywhere in the world.
Facebook Join Social Networking Facebook helps you connect and share with the people in your life.
LinkedIn Join Social Networking Over 85 million professionals use LinkedIn to exchange information, ideas and opportunities
Google+ Join Social Networking Share updates, photos, videos, and more. Start conversations about the things you find interesting.
Digg Join Social News Still a top social bookmarking site. Hitting the front page can bring massive server-crashing amounts of traffic.
Delicious Join Social Bookmarking Links are nofollow but the site is still a good source for traffic and to connect with other social users.
StumbleUpon Join Social News Can be a great source of ongoing traffic.
Pinterest Join Social Image Bookmarking Now the #3 social bookmarking site behind only Facebook and Twitter. If you are in an image-rich industry, this is a must-use for everyday social marketing.
eBaum’s World Join Social Video High-traffic social bookmarking site perfect for link bait submission and blog post promotion.
Folkd.com Join Social Bookmarking A top global social bookmarking site that’s been around since 2006. Links are nofollow but can lead to good traffic if you hit the front page.
BuzzFeed Join Social News High-traffic breaking social news site. Provides an advertising option to drive traffic to highlighted blog posts or link bait pieces for initial viral buzz.
Reddit Join Social News Community groups on many varied topics.
BlinkList Join Social Bookmarking Bookmarking service, also allows you to save copies of whole web pages.

Top Blog Directories

Blog Catalog Add Free. Probably the biggest and most successful blog directory currently
Eatonweb Add $34.99 Good blog directory, even if you have to pay.
Blog Flux Add Free. Links to your blog homepage are redirected, but links to internal posts are direct.
Best of the Web Blogs Add $149.95/year or $299.95 one-time. Great directory, we recommend both a regular listing and a listing for your blog.
GetBlogs Add Free. Free, but they like you to link back to them.

Top Article Sites

Ezine Articles Add First 10 articles submitted are free. If they like your articles, they’ll let you submit unlimited articles for free.
GoArticles Add Free. Unlimited numbers of articles can be submitted for free. Links allowed in sig line but not within content. As of now they are no longer accepting payment for featured articles.
Article Dashboard Add $47 Unlimited numbers of articles can be submitted once you pay.
Buzzle Add Free. Unlimited numbers of articles can be submitted for free. Must apply to become an author. Buzzle.com no longer accepts marketing type articles including those with external links.
Article City Add Free. Unlimited numbers of articles can be submitted for free. No anchor text links, just URL links.
Idea Marketers Add Free. Unlimited numbers of articles can be submitted for free.
Article Alley Add Free. Unlimited numbers of articles can be submitted for free. Can take awhile for your application to be approved.
Article Gold Add Free. Article Submissions and New Registrations are being accepted again after a major site update. Sign up disabled until further notice.
Amazines Add Free. Unlimited numbers of articles can be submitted for free.

Listings on the Search Engines

Google+ Local Add Free You will need to do a search and locate your business’ local profile with Google. Once you’ve clicked through to its listing there will be a note: “IS THIS YOUR BUSINESS?” with a Manage this page button to click. You will then be asked to sign into your own Google account; if you don’t have one, you will need to register with them first. To learn more about managing and boosting your local listings with the major search engines, check out our book, Get on the map! The Complete Guide to Getting Your Local Business Online.
Bing’s Local Business Portal Add Free Locate your business within Bing’s search results using the URL we’ve listed in the Add column. Then you will be allowed to claim it once you’ve either logged in or registered with Bing. Be sure to read our guide, Your Complete Guide to Bing’s New Local Business Listing Portal, to get started on the right foot.
Yahoo Local Add Varies Free for a basic listing or $9.95 a month for an enhanced listing.

Internet Yellow Pages (IYPs)

SuperPages.com Add Free Be consistent with ALL the data across all your listings, especially when listing with any Internet Yellow Pages, because many other sites will pull your data from these places and if it’s incorrect it will negatively affect your local rankings.
Yellowbook Add Free You first need to register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Yellow Pages Add Free Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile. You will be asked to register for their site, or you can opt to log in with your Facebook profile. We suggest creating a separate profile for any business listing that you’re doing for clients. It’s even advisable to create unique login details for your own business in case you outsource the management at a later date.
dexknows Add Free Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile. You will be asked to register for their site.
Yellowbot Add Free Start with a search for your business using your business’s phone number at the bottom of the homepage. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile. You will be asked to register for their site.
White Pages Add Free This site is currently powered by Yelp.com with the map provided by Bing so if you have a listing with Yelp.com then you’re good to go. Start with a search for your business. If it does come up then you will be able to click through to Yelp to edit it as needed. If it doesn’t come up then click the Add link we’ve provided you to get it registered.

Best Local Directories

CitySearch Not taking new additions at this time. We will keep you posted. Free This site is a very important data source for local search; it feeds business information to many other sites. This site is powered by CityGrid, so if you have a listing with them then you’re good to go. Start with a search for your business. If it does come up then you will be able to click through to edit it as needed. If it doesn’t come up, what to do next is something we’ve contacted CityGrid about and are waiting to hear back. We’ll update you here as soon as we know more.
Yelp Add Free Yelp powers Bing’s Local Listings so this is a site you cannot afford to ignore! You first need to register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Merchant Circle Add Free You first need to register as a user and then as the owner you can add your local business to their listings. Always start with a search of their site for your business because they have most businesses already pulled in and unclaimed.
MojoPages Add Free First, you will need to register as a user. Then you can do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Local.com Add Free You first need to register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Kudzu Add Free First, register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
HotFrog Add Free You can create your profile right away then access it anytime to update your products and services, add images and announce special offers.
Manta Add Free You can add your company right away without going through the registration process. Be sure to search their site for your business first so that you don’t create a duplicate listing.
Insider Pages Add Free Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Best of the Web Local Add $9.95/month Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Zip Local Add Free First, register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Brownbook.net Add Free Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
City Data Add Free This directory is only for brick & mortar businesses, no online businesses. You must have a real street address and business phone number. Make sure your profile is as in-depth as possible and provides unique information about your business. Do not simply copy a description you already used elsewhere (for example, on your Web site). Add photos. Generic profiles that don’t provide interesting information about your business will be periodically purged without any warning.
Tupalo Add Free First, register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.

Lesson 4

Beware of OLD Information

Probably the biggest threat to your search engine ranking success is old information. Sure, by now everyone knows the key to high rankings is links! But there are an unbelievable number of places to get bad information about link building.

For instance, many seemingly reputable sources fail to report that some inbound links help more than others, while others can actually damage your ranking efforts. And if you think the key is reciprocal links (two sites that agree to trade links) then think again.

The sun has long since set on virtually every reciprocal link strategy. In fact, some reciprocal links are like two gunfighters who pull the trigger simultaneously only to succeed in wounding the ranking efforts of each other.

In the meantime, you should know that…
Building high-quality incoming links is the single most effective strategy for boosting your site’s search engine rankings.

Although it may also be the most challenging, it is clearly the most rewarding in terms of ranking well for your chosen keywords and staying ranked. The challenge for most sites is to accumulate enough incoming links to dominate their niche without tripping any one of the many spam filters that trigger ranking penalties. The problem is that a lot of search engine optimization (SEO) firms and SEO educational websites are still recommending outdated, potentially damaging SEO tactics.

Warning: many of the standard link building strategies that once formed the backbone of SEO campaigns are no longer effective. In fact, some of them are actually detrimental to your Web site’s ranking efforts. So, pay very close attention to this lesson.

Linking Basics

Anchor Text
A typical link looks something like this: Search Engine Optimization.

Here is the HTML code used to generate the above link within a Web page:

<a href="http://www.searchenginenews.com/">Search Engine Optimization</a>
|———————————————————URL———————————————————||————————Anchor text—————————|

The most important parts are:

The URL (Uniform Resource Locator): This is the web address of the Web page being linked to.
Anchor text: This is the visible text of the link (in this case it’s: Search Engine Optimization).
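These two parts can also be pulled out of the HTML programmatically. Here is a minimal sketch using Python’s standard-library HTMLParser; the LinkParser class is our own hypothetical helper, and the link it parses is the same example shown above:

```python
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Extracts the URL and anchor text from the first <a> tag it sees."""
    def __init__(self):
        super().__init__()
        self.url = None          # the href attribute (the URL)
        self.anchor_text = ""    # the visible link text
        self._in_link = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self.url = dict(attrs).get("href")

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

    def handle_data(self, data):
        if self._in_link:
            self.anchor_text += data

parser = LinkParser()
parser.feed('<a href="http://www.searchenginenews.com/">Search Engine Optimization</a>')
# parser.url         -> "http://www.searchenginenews.com/"
# parser.anchor_text -> "Search Engine Optimization"
```

This is the same split a search engine makes when it indexes a link: the URL tells it *where* the link points, and the anchor text tells it *what* the target page is about.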

As we’ve mentioned several times already, getting your keywords into the visible (aka anchor) text of the links that point to your Web pages is one of the most important rank-boosting strategies you can employ.

Inbound & Outbound Links

You’ll often hear reference to inbound and outbound links. An inbound or incoming link is a link that points to one of your Web pages. An outbound link is a link *on* one of your Web pages that points to someone else’s Web page. To illustrate, let’s say that A and B represent two Web pages hosted on different sites.

In the diagram above, Web page A links to Web page B. Thus, page A has an outbound link to page B, and page B has an inbound link from page A. We can also say there’s a one-way link from page A to page B.

Reciprocal Links

In some cases, Web site owners (or SEOs) may agree to “swap” links with each other. These are called reciprocal links; a term that refers to a situation where Web pages from different sites link to each other. Trading reciprocal links is hard work and their value has been increasingly diminished over the past few years. A few are ok, but to actively pursue them for the sake of just building a large quantity of links is a strategy of the past. Don’t waste a lot of your time on this; we’ll show you better ways to get better targeted, higher impact links.

Link Popularity: An Evolving Concept

Not long ago, search engines ranked pages based on the sheer number of links pointing to them. They called this Link Popularity. The reasoning was that good pages attracted lots of links. At that time, it was purely a numbers game—the more incoming links, the better. Unfortunately, this made search engines fairly easy to manipulate. It encouraged sites to acquire large numbers of links from low-quality sites or through aggressive reciprocal links campaigns. The goal, of course, was to manipulate the rankings.

Today, link popularity has evolved to place more emphasis on the linking site’s “importance” when viewed from the search engine point of view (SEPOV). Pure popularity is still pretty good, provided you aren’t trying to trick the engines, but getting links from important pages is even better than just getting lots of links. So, ideally you want the search engines to think YOUR pages are important.

Here’s how to do that:

  1. Get some links.
  2. Get a few more links from important pages.
  3. Use both 1 and 2 to attract more links.

As the cycle repeats itself, YOUR Web site becomes important from the SEPOV. Important websites, by definition, are either linked to by a lot of unimportant pages, or else linked to by other important Web sites and Web pages, or both. Getting links from important Web sites and Web pages is the quickest and most effective way to get your page highly ranked. In other words, pages that are linked to from important pages are themselves considered to be important! And the more important links you have, the better.


To give you an idea of what we mean, take a look at the following diagram:

Page A has three incoming page links. Page B, meanwhile, has just one incoming page link.

What page would you rather get a link from, page A or page B?

If you answered A, you’d be right. Page A has more incoming links, and therefore is viewed as more important (assuming, of course, that all of those links are from high-quality pages). Even though pages C and D both have just one incoming link each, page C’s link is from a more important site. Therefore, page C is going to rank higher than page D.

Although this is an extremely simplified illustration of what takes place over billions of interlinked Web pages on the Internet, it captures the basic idea. Not all links are equal.
Just a few links, coming from important pages, can do far more good for your ranking efforts than a bushel of links from unimportant pages.
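To make that idea concrete, here is a toy sketch of the kind of iterative importance calculation the search engines popularized. This is a simplified PageRank-style computation; the page names, the link graph, and the pagerank function are purely hypothetical illustrations, not how Google actually computes rankings:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    # Every page starts with an equal share of importance.
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                # Each page passes its importance along its outbound links.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Hypothetical graph: A is linked by three pages, B by only one;
# each of A and B links to a single page (C and D respectively).
graph = {
    "X": ["A"], "Y": ["A"], "Z": ["A"],
    "W": ["B"],
    "A": ["C"],
    "B": ["D"],
}
scores = pagerank(graph)
```

Pages C and D each have exactly one incoming link, yet C ends up with a higher score than D, because C’s link comes from the better-linked (more important) page A. That is the diagram above, in miniature.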

PageRank and the Google Toolbar

One of the measurements commonly used to gauge the importance of a page in the eyes of Google is known as PageRank. This measurement is displayed within the Google Toolbar once it’s installed in your browser. To download it, go to: http://www.google.com/toolbar. Although toolbar PageRank can be a useful measurement tool when gauging the importance of a site, it is often criticized for not being updated as frequently as some SEOs would like. The gap between what Google really thinks of a site and the score being reported by toolbar PageRank is a source of irritation for many SEOs.

Suffice it to say that, regardless of the spirited discussion that revolves around Google’s toolbar PageRank, the following facts prevail.

  1. Once installed, the Google toolbar will produce a PageRank (aka importance) score for every page you visit. For instance, Yahoo scores a very high PR=9, while WebMD scores an also-high PR=8. This means that Google sees Yahoo as a slightly more “important” page than WebMD. Therefore, an incoming link from Yahoo would carry a little more weight than an incoming link from WebMD.
  2. Although PageRank is an importance score, it should not be confused with, or misconstrued as, a ranking score. It is entirely possible for a low PageRank page to score high on a specific keyword search if that search is more relevant to the page with the lower PageRank.
  3. Remember, you want to be linked-to by important pages. That being the case, PageRank remains the best insight you have into ‘who’s who’ from Google’s point of view.
  4. Here’s a warning, though. If a page is showing a PageRank=0, or the PageRank bar is ghosted out, then you don’t want a link from that site. It could mean that Google doesn’t trust the site; perhaps it’s been caught spamming Google’s index. However, it’s also possible that Google simply doesn’t know about that Web page or Web site, or the toolbar value may be incorrect, so you may want to check the page again later. Either way, having a link from it, or linking to it, will NOT help your ranking efforts and could possibly hurt them.
  5. The graphic below shows what PageRank looks like on your browser interface.
  6. As we alluded to earlier, Google isn’t updating their toolbar PageRank scores as frequently as you (and others) might like them to. The result is occasionally inaccurate PageRank scores and long waits before you see your own toolbar PageRank score respond to your optimization efforts.
  7. If you happen to notice your own toolbar PageRank score slipping, this can indicate that it’s time to rethink your optimization efforts. There is a distinct possibility that you’re doing something wrong, from Google’s POV, and the ranking penalty they’ve assessed is being reflected in your PR score. If, on the other hand, you have a new Web page and its PageRank isn’t going up as fast as you’d expect, this could simply mean that Google hasn’t updated their toolbar PR scores lately.

Essential Strategies for Building & Structuring Inbound Links

One of the trickiest aspects of search engine optimization is the process of building high-quality incoming links. It’s also the single most effective strategy for improving your rankings. In raw simplicity, the more inbound links a Web page has, the more popular it is.

Search engines like popular pages.

The challenge for most sites is to accumulate enough incoming links to appear relevant to the engines without tripping any one of the many spam filters and penalties that are applied to sites that cheat. So, the secret to getting it right is to take the search engine’s point of view (SEPOV) when building your incoming link structure.

The key point to remember is that search engines like natural link structure; they hate artificial link structure. Therefore, you must know the difference between

Natural & Artificial Links.

  • Natural links have anchor text that varies; artificial links have anchor text that is suspiciously uniform or even identical (search engines call this over-optimized).
  • Natural links increase gradually as referring sites add links one by one over time; artificial links can sprout in great numbers all of a sudden.
  • Natural links are only occasionally reciprocal; artificial link profiles are often made up of an unnaturally high percentage of reciprocal links.
  • Natural links are not purchased or sold. Search engines specifically hate it when links are bought or sold to influence search rankings, and they are continually on the hunt to penalize sites engaging in link buying if they can catch them.

Furthermore, sites with natural links do not link out to other sites that attempt to manipulate the search rankings. They refrain from linking to link farms, web rings, link brokers, mini-nets, or any network of sites whose primary purpose is to exchange links to increase link popularity.

As you might suspect, Google is getting very good at detecting these so-called link schemes. Google knows how to find these isolated nodes (i.e. Web page groups that link to each other but lack inbound links from outside trusted sites).

Be forewarned: if you participate in link schemes, you are courting disaster in the search rankings. Maybe not right away, but almost certainly, eventually. And Google, with all of its resources for storing information, remembers! …YOU, your site, your company name and so forth. Getting busted by Google *can* mean that nothing you touch will ever rank well in a competitive search again. So, be very careful.

As previously mentioned, sites designed around natural links don’t usually swap links, so their outgoing links tend to point to pages that are known by the engine to be in good standing. Oftentimes these pages have been indexed for many years and may even be white-listed, a term used to describe trusted sites that never get penalized no matter what they do.

Natural links tend not to be reciprocal. Artificial links, however, rely heavily on link exchange tactics, suggesting that the sole purpose of the link is reciprocity—having little or nothing to do with adding value for the site visitor by way of providing worthwhile content.

Keeping these facts in mind, you should strive to build the most natural-looking incoming-link structure possible. From a search engine’s point of view (SEPOV), the best kinds of links are unrequested links. The engines are looking to bestow high rankings only on those pages that people voluntarily link to because of their great content – not because some site owner, webmaster or SEO firm has managed to arrange a lot of link swapping or link buying.

Choose Your Links Wisely

While it’s true that almost any link from any legitimate Web site will add something of value to your Web page’s popularity, it’s best to get links from authoritative (aka, important) Web pages. Pages that attract such links are then also considered important. And these so-called important pages can usually be identified via Google’s toolbar PageRank scoring system.

The higher the Web site’s PageRank, the better the link. Directory examples would include sites like Yahoo and DMOZ. Others like PBS.org, National Geographic, CNN, or ZDNet would be outstanding authoritative site links regardless of topic since each has been assigned a PageRank of 9 (PR=9) or better on Google’s ten-point PageRank scale.

Get Links from Pages that Match Your Topic

Your next best option is to acquire links from pages that are trusted. Trusted pages are pages that have been indexed for a while and have already been assigned a Google PageRank—usually PR=5 or better.
It helps even more if these pages are on-topic—i.e., they match the topic of your page. Links from on-topic, trusted pages can give you a significant boost in rankings. However, if you do gather links from pages below PR=5, then the on-topic factor becomes even more important.

Count the Number of Links on the Referring Page

Another point to remember is: the fewer the links on the referring page, the better. Ideally, the referring page would have only one link and it would be to your page. Of course, that’s rarely practical. But having your link on a page that hosts a hundred other links is almost pointless because the value of your link will be divided by the number of links on the page—a condition we call link dilution.

While easier said than done, the ideal would be to get your incoming links from popular, on-topic, trusted sites that score PR=6 or better but have very few outgoing links.
Now, short of the ideal, bear in mind that every link you can get is likely to help you somewhat—and if you can control how those links appear (in terms of incoming URL format and anchor text), you’ll be in even better shape.
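
As a rough illustration of link dilution, the classic teaching intuition (this is a simplified model for illustration only, not Google’s actual formula; the `link_value` helper is hypothetical) is that a page’s link value gets split evenly among its outgoing links:

```python
# Simplified model of link dilution: a page's link value is split
# evenly among all of its outgoing links. Illustrative only -- this
# is the teaching intuition, not Google's actual algorithm.

def link_value(page_score: float, total_links_on_page: int) -> float:
    """Value passed along by each link when the page's score is split evenly."""
    return page_score / total_links_on_page

# A link from a strong page with few links beats a diluted link:
print(link_value(6.0, 2))    # one of only two links on a PR=6 page -> 3.0
print(link_value(6.0, 100))  # one of a hundred links on the same page -> 0.06
```

Under this toy model, a single link on a sparse page passes fifty times the value of the same link buried among a hundred others.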

Use your Site Strength Indicator (SSI) Tool to check a page’s incoming links and overall PageRank: http://www.searchenginenews.com/ssitool/

Maintain Format Consistency of Your Incoming Link URLs

Even though each of the URLs below will land the site visitor on the same Web page, these are technically SIX different URLs!

http://your-site.com
http://your-site.com/
http://your-site.com/index.html
http://www.your-site.com
http://www.your-site.com/
http://www.your-site.com/index.html

…That’s right.

And, if those who link to you use six different URL formats to point visitors to your home page, then your PageRank is being diluted by a factor of six – not good!
You simply must do everything in your power to standardize your incoming URL format in order to consolidate your PageRank. This is often referred to as choosing the Canonical URL—the preferred URL you want to have indexed in the search engines.

Doing so will produce the maximum relevancy-boost possible from your incoming links.
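
To make the consolidation idea concrete, here is a minimal sketch (the `canonicalize` helper is hypothetical, written for this example, and assumes the www, trailing-slash form is your preferred URL) that collapses common homepage variants into one canonical form:

```python
from urllib.parse import urlsplit

def canonicalize(url: str) -> str:
    """Collapse common homepage variants to one canonical URL.
    Simplified sketch: forces the www host, strips index.html,
    and ensures a trailing slash."""
    parts = urlsplit(url)
    host = parts.netloc
    if not host.startswith("www."):
        host = "www." + host
    path = parts.path
    if path.endswith("index.html"):
        path = path[: -len("index.html")]
    if not path.endswith("/"):
        path += "/"
    return f"{parts.scheme}://{host}{path}"

variants = [
    "http://your-site.com",
    "http://your-site.com/",
    "http://your-site.com/index.html",
    "http://www.your-site.com",
    "http://www.your-site.com/",
    "http://www.your-site.com/index.html",
]

# All six variants collapse to a single preferred URL:
print(sorted({canonicalize(u) for u in variants}))
```

In practice the same consolidation is achieved on the server side (for example, by redirecting the non-preferred forms to the canonical one), but the mapping is exactly this.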

Get Your Keywords into Your Anchor Text

It’s very important to get your keywords into the text of the link (anchor text) that other sites are using to point visitors your way. True, this may be difficult with directories unless the name of your company includes your keywords. Regardless, the boost in keyword relevancy is significant enough that it’s worthwhile to contact everyone who is using keyword-poor anchor text to link to you. In such cases, ask them to upgrade your link by specifying a more keyword-rich text link. This will provide better value to their site visitors and better rankings for you.

If you happen to be selling model airplanes, then anchor text such as airplane models or model airplanes will be infinitely more valuable to your relevance efforts than anchor text simply saying click here. From the SEPOV, the former states the theme of your page while the latter gives the engine no clue whatsoever what your page is about.

A word of caution: it will look more natural from the SEPOV if the anchor-text links that are pointing at your site are NOT identical. Strive to maintain slight variations as would occur if the sites that are maintaining them were generating the anchor text independently. Of course, the nature of your business and the name of your company might dictate the range of options available to you. However, do everything in your power to ensure the text being used to point visitors and engines to your site looks natural from the SEPOV.

This strategy can make a HUGE difference. Generally speaking, from the SEPOV, it’s the anchor text that determines the theme (topic) of your Web page.

Go for Deep Links

Deep links are links that point to pages within your site—your subpages—and NOT to your home page. From the SEPOV, a lot of deep links indicate that a site has a lot of valuable content. On the other hand, if all of your incoming links point to your home page, then the engines think your site is nothing more than a ‘one trick pony’ …offering very little content and, therefore, not an important site.

From the SEPOV, important sites have lots of deep links. In fact, many important sites have more links pointing toward their deep pages than to their home page. Therefore, you should strive to make at least 20% of your incoming links point to deep pages. And, if you can manage to get even 70-80% of your incoming links to point at deep pages, that’s even better in terms of making your site look important from the SEPOV.
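
The 20% guideline can be checked mechanically. A quick sketch (assuming you already have the list of URLs your inbound links point to; `deep_link_share` is a hypothetical helper):

```python
def deep_link_share(link_targets: list[str], homepage: str) -> float:
    """Fraction of inbound links pointing anywhere other than the homepage."""
    deep = [t for t in link_targets if t.rstrip("/") != homepage.rstrip("/")]
    return len(deep) / len(link_targets)

# Hypothetical inbound-link targets for a model-airplane site:
targets = [
    "http://www.your-site.com/",           # homepage link
    "http://www.your-site.com/models/",    # deep link
    "http://www.your-site.com/kits/b17/",  # deep link
    "http://www.your-site.com/",           # homepage link
]

print(deep_link_share(targets, "http://www.your-site.com/"))  # 0.5
```

A result of 0.5 means half the inbound links are deep, comfortably above the 20% floor suggested above.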

Beware of the nofollow Attribute

See to it that your incoming links from off-site pages do not include the rel="nofollow" attribute (often referred to as a tag) within the source code of the link; nofollow renders the link useless to your ranking efforts because Google doesn’t credit your page for having that incoming link.

When you are ready, you have a couple of Advanced SEO Tutorials available to you that explain and expound on this important topic in detail at:

Don’t Get Involved with Run-of-the-Site Links

Avoid run-of-the-site links. These are links where every page on another Web site links to your homepage. When you have, say, 1000 incoming links all originating from subpages within that one external Web site, it appears to Google that your link count is artificially inflated.
Link brokers are notorious for selling you run-of-site links. While you should avoid link brokers in the first place, you should be especially cautious that another Web site does not link to you from all of their pages. This will not only look artificial to Google, it will also look like you purchased the links—something that Google does not like (unless the links are nofollowed
which means they won’t help your rankings anyway). With the recent updates to Google’s algorithm you have to be more careful than ever to avoid link schemes.

Avoid Link Farms, Web Rings, & Site Networks

Focus your efforts on collecting all the links you can from authoritative sites. Avoid getting links from any site that may look remotely like a link farm or web ring (i.e. Web page groups that link to each other but lack inbound links from outside trusted sites). Linking back to these unnatural linking structures can get you penalized. So always be very careful about what Web pages you link back to. Even if you aren’t penalized, any benefit that would otherwise be derived from your incoming link will most likely be diminished due to Google’s spam-related opinion of the page.
And, in terms of building page relevancy, there is rarely, if ever, any benefit to linking back to sites that are insignificant, untrusted, or suspected of behaving badly in terms of SEO protocol. It can even hurt you.

Be Careful Who You Link To!

Getting links from off-topic, and perhaps even untrusted, sites may not be your first choice, but at least it won’t likely hurt your rankings; they might even help a little. However, beware of getting yourself into a link exchange relationship with these sites; and remember that you should NOT link back to them.

Currently, the rule is that incoming links won’t hurt you, but outgoing links to sites that behave badly can. In other words, if you’re swapping links, be sure you do so carefully because linking to a site that has been penalized for policy infractions (i.e., search engine spam) can cause your site to be penalized as well. To help you avoid such a scenario, here are the cautionary steps you should take before linking to another site:

1. Search for their domain name on Google and Bing. If they’re not listed on either engine, that’s a bad sign. Linking to them could get your site penalized and possibly banned. Besides, even if they aren’t a so-called “bad” site, linking to a site the engines don’t know about won’t help you in the rankings anyway. However, if they are listed, you can proceed to step two.

2. Determine who is already linking to them. Go to Google and type in link:www.thedomain.com to see if Google will display which sites are linking to them. Note that this tool is regarded as being very incomplete, but Google may display some of the links it is aware of. Smaller high-quality sites often won’t return any listings; this is normal, as Google heavily restricts what information it will display. The more incoming links they have, the more ranking boost you’ll receive if you can get a link from them. And the more important the sites linking to them, the better the ranking boost for your own site. Their PageRank score is also one indicator of how important Google thinks the site is. Beware of linking to sites or pages with a PR=0 (zero); this could mean that Google has penalized them. Granted, this warning may not apply to very new sites, but links from them won’t help you right away anyway. Regardless, if a site has been around for a while and lacks any PageRank, then you should be wary of linking to it.

3. Avoid linking to sites with controversial topics. Examples of such sites would include gambling, adult, pharmacy, or loan/debt sites (unless you happen to be in one of these industries and the topic matches the content of your page).

Remember: You probably won’t be hurt by who links to you. However, your ranking efforts can definitely be hurt by who YOU link to.

Remember Your Primary Goal: Profits!

Of course, our biggest assumption is that you’re optimizing your Web site with profits in mind. That being the case, you’ll want to always focus your efforts on strategies and relationships that will generate the most revenue relative to effort. Therefore, look first for link relationships that will produce traffic that fits the profile of your customer market. While it’s true that incoming links from just about any site provides a slight boost to your page popularity (leading to better search engine rankings), such links all too often fail to produce targeted traffic which is what you really should be looking for. This is one of the many reasons a link from a topic related site is immeasurably better than a link from an off-topic site.

The Best Place to Start Getting Links

You should begin by getting links from directories by submitting your site. You can find a complete list of the top directories here. The Yahoo Directory (not to be confused with the Yahoo search engine), for example, charges a $299/yr review fee. And, considering that you get a link from what Google considers a very important site, it is worth it. This is especially true if your Web site is just getting started building its web presence. As a bonus, when you get listed in Yahoo’s paid directory, you’ll often get additional links coming from Yahoo’s international locations—which are all considered important links.

Once again, our Guide to the Major Web Directories provides an updated list of directory sites and citation resources that are helpful for Google Place Pages. You will find that some of them charge a fee but may very well be worth it in exchange for the trusted inbound link they provide. To add your site, look around on the main page of each of these directories for a link that says something like Add URL, Suggest URL, Add Your Site, or Suggest a Site. Follow that link to get details about exactly how to add your site to their directory.

By the way, to avoid unnecessary delays in getting listed, be sure to submit your site to the proper category within each directory. Submitting your site to the wrong category can result in a ridiculously long delay or simply not getting listed at all. Remember that the directory editors receive an enormous number of site submissions. So, save yourself some grief by carefully considering exactly which category your site belongs in before submitting.

A Worthy Directory: DMOZ

A listing in DMOZ is a worthwhile and trusted link if you can get listed. That’s because they supply the results for the Google Directory (not to be confused with Google’s main search engine). However, it can take up to a year to get your site approved. So, we recommend you submit it and forget it. Check back every few months. If you get in, great! If not, oh well! Don’t stress over it. The editors at DMOZ are volunteers and they’re swamped. But, if you’re lucky enough to get listed, be happy to know it’s free.

The reason for starting with directories is that the submission process is relatively quick, the effort relatively easy, and each directory link validates your site in the eyes of the search engines. You get an incoming link from a trusted site and another new source of targeted traffic. However, once you get listed in a few of the major directories, the relevancy boost levels off pretty fast. In other words, it isn’t important to get listed in all of them. Just get listed in a few and that will give you about as much ranking boost as you can expect to get.

Acquire On-topic Links

Links coming from pages that are topic-related to yours are considered good (i.e., relevant) links in the eyes of the search engines. If your site is about model airplanes, and another site is about airplane history, then the two sites share a common topic, namely airplanes. Each of these sites would benefit from having a link from the other even though the links are reciprocal. These are what are known as on-topic links.

Once again, directories can be great places to get links whenever the directory is on-topic. So, if your site is about model or radio control airplanes, then you should seek out directories that specialize in hobbies, scale models, airplanes, or radio operated toys and so forth. For a list of topic-specific directories, visit the Internet Search Engine Database at: http://www.isedb.com/html/Web_Directories/Specialty_Directories/

By the way, when getting listed in topic-specific directories, be sure they provide a direct, static link to your site. In other words, you do NOT want a dynamic link—one that is processed or created on-the-fly by some tracking software the directory has running on their server. Although this is not a concern with the major directories because they all tend to use static links, many smaller or niche directories like to create their links dynamically. Although this will add to your traffic count, it does nothing to help your search engine ranking efforts. That’s because engines fail to see the connection between the dynamic link and your site’s actual URL.

Link outside the Box

Figuring out where to get your incoming links is like solving a puzzle. It takes a little creativity while following known formulas and patterns. Ask yourself: who else has a site that might benefit from linking to me?
Suppliers you do business with or professional organizations you’re involved in might be willing to list you on their referrals page. Legal advisors, accountants, or financiers you do business with might also like to list you as a client or maybe showcase your business in their online portfolio. Your employees may have blogs or personal homepages that could link to you, and so forth.

Here are a few more ideas to help spark that creativity… Many online business owners write articles about topics related to their sites, then offer to let other sites use them as content in exchange for a link back to the author’s site. You’re probably an expert in the business you’re in, and therefore an authority on certain subjects that lend themselves to interesting reading, worthwhile information for people interested in related products and services.
Whenever you can do so easily, we recommend that you swap links with a partner company that you closely do business with—whose services complement your own. Look for business partnerships with other websites that are useful to your own customers and whose customers are useful to you. Look for compatible (but not competing) businesses, and then form a partnership where you link to each other actively through mutual promotion. Not only can this bring in new traffic and boost your PageRank, but you may also develop important business relationships this way.

Press releases can be a good way to gain relevant links to your company’s site. Again, be creative—chances are good there’s a number of reasons (product launches, staff additions, promotions, partnerships, new services, etc.) you can find to release news about your company to the press. The engines quickly pick up press releases and the links contained within them are typically trusted. They also tend to remain on the Web for a good long time.

A Pro’s Guide To Optimizing Press Releases for Today’s Online Markets

Another interesting way to promote your own site is to submit testimonials, along with a link to your site, about products you are really enthusiastic about. If the testimonial is well written, the company will often post it on their site. Sometimes they will even link back to you.

One of the more under-utilized “secrets” for gaining incoming links is to participate in topic-related forums that allow a text link to your site within your forum signature. Look for topics you are knowledgeable in and begin posting—asking and answering questions. Assuming you make legitimate contributions, you’ll find that your participation will be a welcome addition in spite of the plug for your site. Most forum software, but not all, uses nofollow tags on the links. When this is the case, they don’t tend to help rankings, but they can be a good source of traffic.

However, many forums (and blogs) are now providing dofollow links if your contribution is worthwhile. For how to tell the difference, see these two (previously mentioned) Advanced SEO Tutorials:

Bear in mind that whenever you’re successful in getting another Web site to switch their link from your competitor’s site to yours, you gain twice: once for gaining a new link, and a second time for reducing your competitor’s incoming link count.
If the link is an especially good one (authoritative site in good standing with great incoming links, few outbound links, and high PageRank), then discreetly PAY them if you have to. Offer them a better deal than the one they have (if any). Do whatever it takes to get those quality links! Convincing another site owner to switch a link from your competitor to you is one of the single most productive link-building tactics you can use.

By using your imagination and dovetailing the nuances of your own business into the mix, you’ll no doubt discover an abundance of opportunities for gaining legitimate incoming links.

The Problem with Reciprocal Links

Our best advice when speaking of reciprocal links as a link-building tactic is to be conservative because, when done badly in the eyes of Google, it is viewed as an artificial linking pattern—something that search engines are getting increasingly sophisticated at detecting.
When you think about it, it makes sense that having a high percentage of reciprocal links would look like an artificial linking pattern because natural links are not typically reciprocal. If Yahoo lists a site in their directory, that site doesn’t routinely link back to Yahoo; that’s one of the reasons a link coming from Yahoo looks “natural” to Google.

Of course there are plenty of exceptions, but, regardless, the engines are looking for pages that rank well due to popularity based on content—and they want to avoid sites where it appears the site owners (or hired SEOs) have put a lot of effort into swapping links.

So, look at your incoming links from the search engine’s point of view (SEPOV). If CNN runs an article about how great your company is, and your company’s site links back to the CNN article, does that look normal from the SEPOV? …sure it does. Besides, CNN is an authoritative, important site. That reciprocal link looks like part of a natural link structure from the SEPOV. And, your site’s page can expect a substantial boost in ranking.

On the other hand, if your site (with its PageRank=4 or 5) is linked by Joe Blow’s homepage with a PR=1, 2, or 3 and you link back to Joe’s page, you shouldn’t expect much, if any, boost in your rankings. In fact, it’s entirely possible the two links are discounting each other based on an assumed link exchange arrangement that looks contrived because neither page is “authoritative” from the SEPOV.

Now, if you had, say, 50 similar link arrangements, and the links were all on-topic, and none of the pages involved had tripped the spam filters, then your page should get a reasonable boost in rankings. Still, you’d fare better simply by getting a single link from an authoritative site like CNN, Yahoo Directory, DMOZ, ZDNet, and so forth.

Evaluating the Quality of a Link

Much of effective link building focuses on topical relevance. In other words, pages linking to each other should, ideally, cover similar subjects. When the two pages are related in
subject or topic, the link will look more relevant from the SEPOV. This rule applies both to the sites that are linking to you and to the sites you’re linking to.

As ranking algorithms become increasingly advanced, search engines are evaluating Web sites in terms of neighborhoods of related sites. By linking to, or being linked from, an unrelated
site, you’re venturing outside of your neighborhood. Sometimes search engines view this suspiciously unless, of course, the page that is linking to you is considered “important.” Otherwise, the most valuable links come from Web pages that feature content related to your site. Links to or from off-topic pages are less useful, and, in some cases, too many of these off-topic links can even be harmful to your site’s search engine ranking efforts.

Stated another way, you should find your linking neighborhood and live there. The more sophisticated the search engines become, the more important it is to get links from important sites within your niche. This trend is likely to stick around for the foreseeable future. Acquiring on-topic links is key to establishing your site as an important destination from the SEPOV.

And, although it’s been clearly established that links from high-ranking, on-topic Web pages have become the gold standard of link building, there are other factors to consider when judging the quality of a link. For instance, traffic to the linking page is an important factor when you remember that links are not just for improving your search rank, they’re also an important source of targeted traffic.

The more people who visit a page that links to your site, the more traffic your site is likely to receive. If people like what they see, this can result in more links, more traffic and, ultimately,
more sales.

You should also bear in mind that Google and others are putting a lot of effort into tracking people’s web browsing habits. Our research indicates that highly trafficked sites are receiving a
preferred ranking status.

One way to get a rough idea of a page’s traffic is the Alexa service. On the Alexa siteinfo page, you can enter your target site (the page you may want a link from) into the Site Lookup field and get a snapshot of their Traffic Rank, Pageviews, and so forth, as seen in the screenshot below.

Bear in mind that Alexa can only track the Web browsing habits of people who have the Alexa toolbar installed. Since Alexa tends to attract a fairly tech-savvy audience, the Alexa numbers
probably don’t represent an entirely accurate picture of the relative traffic to a site. Regardless, it’s a useful tool to begin gaining a comparative idea of a site’s traffic numbers.

Accessibility to Search Engines

Obviously, Web pages on external Web sites are good link prospects only if they are already known to the search engines. If your link-prospect’s site is architecturally flawed—and therefore inaccessible to the search engines—then their pages may not be indexed. In such cases, a link would be absolutely worthless to you, at least in terms of improving your search engine positioning.

To see if a search engine has indexed a Web page, simply select a unique phrase from that page and enter it (“surrounded by quotes”) into the search field. The quotes say, “Look for this exact phrase.” If the engine has indexed the page, then it will most definitely show up for that search.

Assuming you find the page is indexed, your next step is to see how often it is crawled by that search engine. Google has a little link that says Cached next to each of its search results.
Clicking that cache link will show you the date Google last spidered that page.

The more recent the cached date, the better. A recent cached date indicates that Google crawls this page regularly. As such, it also means that you’ll get credit for that link sooner.

If your link is already on the page, check to see that your link is in the cached version of the page. If not, something is amiss, and it’s likely the page employs some trick to prevent search
engines from seeing its outgoing links. Next, make sure that link is a direct link to your Web site. If the link is a JavaScript link, or a redirect, or a nofollow link, then it’s of little value to you, ranking-wise. Usually, placing your mouse over the link will cause the destination of the link to appear in the bar at the bottom of your browser (i.e., the status bar), but there are ways to manipulate this feature so it’s always best to check your link within the page’s HTML source code.

Let’s say that your site is called www.your-site.com and you’re being linked from a site called www.their-site.com.

A direct link in the HTML source code would look like:
<a href="http://www.your-site.com/">Click here!</a>

A JavaScript link in the HTML source code would look like:
<a href="javascript:void(0)" onclick="window.open('http://www.your-site.com/')"
onmouseover="status='http://www.your-site.com/'; return true;"
onmouseout="status=defaultStatus; return true;">Click here!</a>

A redirected link in the HTML source code would look like:
<a href="http://www.their-site.com/redir.php?r=http://www.your-site.com/">Click here!</a>

And a nofollow link in the HTML source code looks like:
<a href="http://www.your-site.com/" rel="nofollow">Click here!</a>
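
The four link styles above can be told apart programmatically. Here is a rough, regex-based sketch (illustrative only; a real audit should parse the HTML properly, and the `redir` substring check simply mirrors the redirect example shown above):

```python
import re

def classify_link(html: str, your_domain: str) -> str:
    """Rough classification of how a single <a> tag points at your site.
    Illustrative only -- a real audit should use an HTML parser."""
    if re.search(r'rel=["\']nofollow["\']', html):
        return "nofollow"    # passes no ranking credit
    if "javascript:" in html:
        return "javascript"  # engines may not follow it
    match = re.search(r'href=["\']([^"\']+)', html)
    href = match.group(1) if match else ""
    if "redir" in href and your_domain in href:
        return "redirect"    # indirect; credit goes to the redirecting script
    if your_domain in href:
        return "direct"      # the kind of link you want
    return "unknown"

print(classify_link('<a href="http://www.your-site.com/">Click here!</a>',
                    "your-site.com"))  # direct
```

Only the “direct” case is worth much to your rankings; the other three mostly deliver traffic at best.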

Incidentally, click here is horrible anchor text. Ideally you’ll want your keywords to appear in the visible portion of any text link pointing to your site. Getting your keywords into the anchor text of your incoming links is one of the most powerful page ranking strategies available to you. Besides, it’s unlikely you’ll want your pages found for the keywords click here.

However, be advised that a small number of click here links isn’t terrible. In fact, incoming link anchor text that is all identical indicates to a search engine the possibility of artificial link manipulation, which could result in a ranking penalty for your site.

Since your objective should be to achieve a natural looking incoming link structure, it’s a good idea to mix up your incoming link anchor text a bit. A few sporadically placed click here links make your incoming link structure appear more natural and diverse than a network of 100% keyword-rich incoming anchor text links.
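
One way to sanity-check your own link profile is to measure how dominant your single most common anchor text is. A sketch (the `top_anchor_share` helper is hypothetical, and any threshold for “suspicious” is a judgment call):

```python
from collections import Counter

def top_anchor_share(anchors: list[str]) -> float:
    """Share of incoming links that use the single most common anchor text.
    A value near 1.0 suggests a suspiciously uniform (over-optimized) profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

natural = ["model airplanes", "airplane models", "Your-Site.com",
           "click here", "great model kits"]
artificial = ["model airplanes"] * 5

print(top_anchor_share(natural))     # 0.2 -- varied, looks natural
print(top_anchor_share(artificial))  # 1.0 -- identical, looks artificial
```

The varied profile spreads its anchors out, while the uniform one concentrates everything on a single phrase, exactly the pattern the engines treat as over-optimized.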

By the way: The powerful effect of anchor text can be dramatically demonstrated by searching Google for “click here.” You’ll see the Adobe Reader download page grabs the top spot for that search term, even though the words click here aren’t found anywhere on the page itself.

Instead, Adobe’s page ranks #1 purely on the power of the anchor text within the incoming links that point to their page.

Incoming and Outgoing Links

Remember what we said about keeping your pages in their own topically relevant neighborhood? The same advice applies to the pages that are linking to you. Check the incoming links of those potential link partners, as well as the other pages they link to. The more a page stays in its own neighborhood, the more valuable a link from that page will be to you.

And, of course, the fewer outbound links the page has, the better. That means more of the page’s link equity is focused on your inbound link rather than being diluted by having links to several other pages. In other words, a best case scenario is for the linking page to have only one outgoing link, the link to you. That way your page would be getting all of the link equity available from that link. Seldom, however, can you get this best case scenario; so go for a link on a page with the fewest outbound links possible.

Buying Links

Occasionally we suggest that you consider buying links. While this can be a very good way to quickly acquire relevant links, there are some guidelines that you must follow or else you risk hurting your rankings and possibly being penalized.

1. Avoid buying links from known link brokers. Google discounts or negates purchased links whenever it knows they are purchased. Therefore, unless you are buying the links strictly for traffic, don’t. Even so, if the link broker is on Google’s radar, having a link from them could result in your site being penalized in the rankings.
2. Whenever buying links for the purpose of ranking better, be discreet. Avoid link brokers or anyone else who is known to be part of a link network. Instead, make a private arrangement and keep the purchase details confidential. If Google believes the link is natural, then you are likely to get a rank-boosting link-credit for it.

When discreetly purchasing a link from another Web site, you should always ask to see the traffic stats for the page your link will be on. The more traffic, the better the link from both a ranking and a traffic generation perspective.

  3. Pay attention to the location of the link. Ideally, you’ll want your link worked into the content of the Web page. This gives your link the opportunity to be surrounded by lots of relevant keywords (a concept known as keyword proximity) while also increasing the likelihood that someone reading the page will see and click your link.

How to Buy Links without Getting Penalized by Google

It is possible to boost your link popularity by purchasing links but you have to be smart about it. As we’ve already mentioned, you should avoid buying links from known link brokers. Such brokers typically sell links from sites whose topic is unrelated to yours, and these links tend to be placed in a page template that results in having a link from every page on that particular site. As you now know, these are called run-of-site links and they are assumed by the search engines to be purchased links.

It’s actually much better to get a single link from the homepage of a site than it is to get 100 links from every single subpage of a site. To accomplish this, it’s best to contact the owner of the site directly and discreetly. Offer to pay them to place your link on their homepage or on one of their high-traffic subpages. Remember, you’re not just paying for the link popularity; you’re also paying for the traffic the link will send you.

Don’t pay for links on pages with a PageRank lower than 3. Lots of links from unimportant Web pages won’t necessarily hurt you, but they won’t provide enough of a ranking boost to justify paying for them. One PR=7 link (especially if it’s on-topic) is worth dozens, probably hundreds, of PR=1 or PR=2 links. Links from low-PR pages are rarely worth purchasing.

If your link is relegated to a sidebar, try to get it placed where it’s likely to be seen. The Google AdSense optimization page describes the locations on a page where a link is most likely to be clicked. Google says…


“Certain locations tend to be more successful than others.” This heat map illustrates these ideal placements on a sample page layout. The colors fade from dark orange (strongest performance) to light yellow (weakest performance).

“All other things being equal, ad placements above the fold tend to perform better than those below the fold. Ads placed near rich content and navigational aids usually do well because users are focused on those areas of a page.” To clarify, above the fold means the part of a Web page that is visible without the reader having to scroll down.

By the way, you’ll have to keep those paid links active for at least 3 or 4 months to see significant improvement, since most search engine optimization efforts require about 6 months before they begin to bear fruit. Obviously, this requires patience. But it’s worth it, because once you get to the top you tend to stay there.

The type of page your link is on also plays an important role. A link on a so-called links page, alongside a hundred other links, is pretty close to useless. You may get some mileage out of it, but it would pale in comparison to most other types of links. A much more preferable link would be from an article that mentions your site and then links to you from within the article body.

Buying Ads in Ezines and Newsletters to Build Links

Perhaps the best kinds of links to buy are newsletter links. Many sites send out a monthly or weekly newsletter to subscribers. These newsletters are typically archived on the site that publishes them. This means that not only do you get the targeted traffic from the original publishing of the newsletter, but you also get a permanent link from the archived version of the newsletter, a link that tends to improve with age.

There is strong evidence to indicate that Google puts a time filter on new links, in order to diminish the overall boost people expect to see from purchasing links. This means that you may not receive the full effect of a link until it has been online for several months.

If you were to buy an expensive, high-PageRank link on the homepage of a popular Web site, you may have to continue paying to keep that link up for several months before you see any kind of ranking boost. This is why purchasing links in newsletters is good, since such links tend to get archived and hang around without requiring payment to keep them online. As those links mature, they become even more valuable.

Article Links and the Value of a Good Writer

One of the most powerful link-building strategies that search engines love involves content articles.

The strategy is twofold:

1. Write articles with valuable content for your own site that others will link to.
2. Write articles with valuable content for other people’s sites, with the agreement that each article links back to your site.

And, if you lack writing talent, then you simply must hire a good writer. Sounds simple, doesn’t it? And it is. Providing great content is clearly the most effective part of a search engine optimization campaign, and every online business needs at least one good writer. We’re not talking about hiring someone on the cheap from Elance.com to write boring “filler” articles for $5 a pop about stuff everyone already knows.

And we’re definitely NOT talking about purchasing some automatic article-generating software to crank out reams of useless and meaningless “content.”

We’re talking about hiring someone who understands your industry and has a flair for creating interesting, useful information that people want to read and link to. Someone who can create articles that build a buzz! …buzz-worthy articles that can be printed on your own site to attract links and are syndicated out to other sites in exchange for a link back to you. Someone who can build a blog that generates a passionate audience of regular readers who can be funneled directly into your sales process. You need someone who can establish your business’s identity in the forums and user groups where your customers are congregating. In short, someone who is contributing real value to the global conversation taking place on the Internet and who is strongly identified as the online face of your business.

When done right, this ‘social media’ approach to building your company’s online presence is so impressively effective that most people who are doing it won’t tell you about it. They don’t want to lose their market advantage! 😉

Writing Your Way to a Higher Search Rank

Plan for ‘good writing’ to become an essential part of your online business plan—just as important (maybe more so) as your webmaster or your bookkeeper. The first way you can utilize good writing to boost your search engine rank is to produce articles that can be featured on your site. Leveraging your Web site and content to provide information of value, like articles about your industry, how-to articles and tutorials, or interviews with prominent industry figures, will increase your site strength. Find a problem plaguing your customers or other businesses in your industry, solve that problem, and then write about your solution. All of these can make excellent articles, provided they’re interesting and useful to your audience (i.e. your current and potential customers, and businesses in your own or related fields).

Good articles are link magnets. As an SEO firm, here’s the question we hear most often from new clients: Why isn’t my site ranking higher?

In almost every case, the problem is that the site doesn’t have enough incoming links. And the reason the site doesn’t have more incoming links is because the site is boring and doesn’t provide anything worth linking to. Well-written articles go a long way to solving this problem.

Syndicating Your Articles to Build Links

While featuring your articles on your site is a great way to attract incoming links, another option is to distribute those articles to other sites in exchange for a link back to your own. However, you should be aware that there are right and wrong ways to approach this strategy. Done correctly, you can reasonably expect to build links to the exact pages you want, with the perfect anchor text placed in the links back to your site.

However, the problems to avoid include: Low quality links, duplicate content problems, and drawing unwanted attention to your site’s optimization efforts—something that Google frowns upon as an artificial link-building tactic.

When you are ready for this tutorial, it will teach you how to find the best article distribution partners, an easy way to produce lots of high quality articles, and how to use Article Directories to your advantage. Be sure to carefully study the tutorial before you embark on Article Syndication as a method for building links.

Using Forums to Reach Customers and Build Links

Another way to utilize your ‘hired writer’ is to make them the public face of your company in Internet forums. This is a great way to increase your company’s exposure and to interact directly with potential customers. Like writing articles, this can also be quite labor-intensive. In order for it to pay dividends, you (or your writer) will need to become established as an authority (preferably a moderator) in the forum before you can expect to have much influence.

You’ll get a lot more mileage out of forums by contributing helpful, informed advice than you will by promoting yourself or your business. Once you establish credibility, you’ll find that forums can be a fairly reliable source of targeted traffic for your business.

While it’s important to establish yourself in the most widely-read forums that relate to your industry, it’s also nice if the forum lets you place a link back to your company’s site within your signature line. Not only does this make it easier for people to find your site, it also builds a small amount of link equity along the way.

Even smaller forums often rank very well for many keyword phrases if a topic thread generates conversation in the form of user posts. Google absolutely loves to see that participation and even goes so far as to note the number of posts and authors a thread has.

Don’t overlook relevant forums as a way to reach your customer base; just make sure to observe the forum rules and culture before you post.

Blogging Your Way to Top Rankings

While it is true that blogs (aka, web logs) are easier to get highly ranked than just about any other type of Web site, it’s also true that keeping up with the demand for new and fresh information is far from easy! Blogs are a huge amount of work and anyone who underestimates the nearly superhuman effort it takes to maintain a blog AND run a business is in for a rude surprise.

It’s just another reason why it’s a good idea to have a dedicated writer on staff. In addition to writing articles that build links to your site, that person can be writing and promoting your company’s blog while you’re busy running your business.

Blog Stands For Better Listings on Google

Unless you’ve been hiding under a rock, you know that blogs have clearly changed the way people communicate and do business online. Companies global and local, with every kind of product and service, are growing their business with the speed and reach of the Internet by using blogs for these reasons:

  • Blogs attract customers who are searching for information.
  • Blogs invite customers to learn about a company’s services, products, and expertise.
  • Blogs introduce customers to people who work for the company.
  • Blogs start customer conversations.

Savvy business leaders are using blogs to talk to customers and clients and to turn those conversations into long-term business relationships. Beyond the profitable people connections, blogs are the new location! location! location!

The constantly changing, relevant content on your blog is the perfect formula for the ultimate high-ranking page in Google or another search engine—here’s why:

  • Blogs that are updated frequently are indexed often by search engines.
  • Highly indexed blogs increase visibility and sales lead generation.
  • Blogs are content-rich and relevant, which attracts inbound links and raises PageRank.
  • All of those frequently updated pages give search engines more content to index, serve to searchers, and hang ads on.
  • Blogs can attract an intensely loyal readership, which is social proof of their business value.
  • While a conventional Web site might have a limited number of pages, great business blogs might have hundreds. Blogs with hundreds of pages get hundreds of listings in the search engines.

To make the most of your business blog:

  • Describe the questions and experiences that are meaningful to your audience to entice them to read your posts.
  • Discuss current industry issues. This attracts even more visitors, who will look to your site for frank, expert information, and it establishes a trust relationship.
  • Feature interviews with leaders and customers who work in your field. These ‘experts’ will bring their friends and audiences to your blog.
  • Share ideas, and create easy lists and cool widgets that help make your customers’ lives easier. This creates a community of loyal readers.

Use the power of consistent and regular updates.

Using the Power of RSS to Build Subscribers

Make sure your site feed is clearly visible. If you don’t have a subscribe link clearly visible on your blog – get one! The easier you make it for readers to subscribe to your blog, the more subscribers you’ll get. (Also see: Help, I don’t know RSS! What do I need to know to set up my own feeds?)
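
Beyond a visible subscribe link, many blogs also add RSS autodiscovery: a small tag in the page’s head that lets browsers and feed readers detect the feed automatically. A minimal sketch (the title and feed URL below are placeholders for your own):

```html
<!-- RSS autodiscovery: place inside the page's <head> section.
     The href is a placeholder; point it at your blog's real feed URL. -->
<link rel="alternate" type="application/rss+xml"
      title="Example Blog RSS Feed"
      href="http://www.example.com/feed/" />
```

Most blog platforms, including WordPress and Blogger, emit a tag like this for you automatically; it’s worth confirming yours does.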

Free Blog Software and Hosting

The free hosted services work if your purpose is private journaling, a short-term launch site, or early practice before you get serious about blogging. Free services are fabulous for journals and hobby blogs. However, they offer serious limitations for a business blog. The two most popular free services are Blogger and WordPress.com.

The benefits:


  • Blogger blogs tend to get indexed almost immediately.
  • It requires no technical expertise. You can be up and blogging in ten minutes or less.
  • You can even use your own URL, such as http://yourname.com, which is good practice.
  • Google also allows people to use Google AdSense and their other analytical tools on Blogger.


The downfalls:

  • The number of themes is limited, and soon you’ll find that many others are using the same theme you chose.
  • Finding service help when the system or your blog is down can be a problem.
  • Spammers have found the ease of setting up a Blogger blog a boon to their business practices, so a large percentage of Blogger blogs are simply set up to attract traffic for highly popular keywords. They also republish content from legitimate blogs without permission.
  • Because of this spamming history, visitors tend to cast a skeptic’s shadow over businesses that use Blogger software straight “out of the box.”

Submit to Blog Directories
Embrace Email
Start a Post Ideas Log
Embrace Guest Blogging
Add Social Proof
Rotate Content Formats
Join a Mastermind Group

There is nothing wrong with getting help with your blog from the pros. Finding an environment where like-minded blog owners come together to collaborate, learn, and network in a members-only environment is a great resource for new and experienced blog owners alike. One of the most popular is ProBlogger.com, where members share and exchange tips and advice. Just as you would offline with your local Chamber of Commerce, seeking out like-minded blogging professionals within your niche is a surefire way to jumpstart your blog.

Distributing Press Releases to Build Links

Press releases, once they are initially syndicated by Google News and other news agencies, can send you a quick burst of traffic. After that, these same press releases will often be archived in a number of locations, where they typically remain online and indexed in the search engines for years. This, in turn, provides you with mature links that continue to contribute to your Web site’s overall web presence and, thus, your search rankings.

Before you begin doing online Press Releases, be sure to study this Advanced SEO Tutorial.

Buying Abandoned Web Sites to Gain Links Quickly

It’s been estimated that nearly half of all small businesses started in the U.S. fail within the first four years. In the brick-and-mortar world this means closing up shop and letting someone else use the real estate, but in the online world those failed businesses can hang around for years while the site owner keeps them online in hopes that business will improve.

Many such businesses could be ripe for purchase at rock-bottom prices. Sure, their (lack-of) profit model might not be enticing, but what about the links they’ve already accumulated? In terms of initial cash outlay it might be a bit more expensive than other link-building tactics, but when measured in time savings it could be a bargain.

One of the best ways to find those abandoned sites is to do a search in Google for outdated copyrights, such as: “copyright 2009” + your keywords. Nothing says a site has given up trying like an out-of-date copyright footer at the bottom of each page. (Hint: to give your customers the impression you’re up-to-date, current, and still in business, keep your copyright footers updated to the current year.) Obviously, you can also search for “copyright 2010”, “copyright 2006” (include the quotes for exact match), or any other year that’s not current.
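
The year-by-year searching can be scripted. This sketch simply prints the exact-match queries for you to paste into Google by hand (the keyword is a placeholder, and note that automated querying of Google violates its terms of service):

```python
# Build exact-match Google queries for spotting stale copyright footers.
def stale_copyright_queries(keywords, current_year, lookback=5):
    """Return '"copyright YYYY" keywords' for each non-current year."""
    years = range(current_year - lookback, current_year)  # excludes current year
    return [f'"copyright {year}" {keywords}' for year in years]

for query in stale_copyright_queries("widget repair", 2024, lookback=3):
    print(query)  # e.g. "copyright 2021" widget repair
```

Run it once per niche keyword and work through the printed queries manually.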

Other good searches to find abandoned or underperforming sites that can be purchased for cheap include “temporarily down for maintenance” or “under construction”. Just make sure the domain name hasn’t already expired on the site you’re thinking about purchasing. Once a domain expires, Google wipes the slate clean, thereby reducing to zero the value of any incoming links or PageRank the site may have acquired.

Why Would You Want to Buy an Existing Web Site?

Established Links – Why rent links when you can own the entire site? Older domains tend to have a much higher number of quality inbound links, developed gradually over time. As we’ve discussed in previous reports, the older the links pointing at your site are, the more powerful and more trusted they’ll be in the eyes of the search engines.

Redirect PageRank – Instead of buying a site with the intention of making it your primary site, another highly-effective strategy is to buy a site with the intention of pointing some of its links at your primary site. In some cases you may want to 301 redirect the entire site.
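
As a sketch of that last option, assuming the acquired site runs on Apache with mod_alias enabled (your server and domain will differ), a single line in the acquired domain’s root .htaccess file will 301-redirect every page to the matching path on your primary site:

```apache
# .htaccess on the acquired domain: permanently (301) redirect
# every URL to the corresponding path on your primary site.
Redirect 301 / http://www.your-primary-site.com/
```

The permanent (301) status code tells search engines the move is final, which is what allows link equity to be passed along, unlike a temporary (302) redirect.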

An Investment – If you continue to cultivate and improve the site, you can often resell it for far more than you paid. Other times, buying a site might simply be the only way you can get that great domain name you really want to use for your new venture (and starting that venture out with a site that’s already got great links doesn’t hurt either).

Where to Find the Best Site-Buying Opportunities

Once you have decided to go on your quest to acquire that perfect site, you need to know where to start looking. Luckily, we have compiled a list of some of the best places to buy existing Web sites:

There really are countless forums devoted entirely to buying and selling sites.

Here is a list of 10 variables we highly recommend that you consider when deciding on the value of a site.

  1. Cost: How much would it cost to build an equivalent site?
  2. Recurring costs: How much does the site cost to run (hosting, specialty programming and maintenance, software licenses, etc.)? This information is very important to know.
  3. Time: How much time would it take you to build such a site?
  4. Existing Income: Is the site making money? Is there an existing business model? (Find out their business’s expenses, such as software licensing, hosting, advertising {PPC, etc.} and maintenance.)
  5. Established Links: How many links does the site have? Where are they from? Is the site owner in control of any of those inbound links?

    Don’t get caught in a trap where you pay top dollar for a well-linked, high-PageRank site, only to find out the previous owner controls those links and doesn’t intend to keep them pointed at the site after the sale.

  6. Traffic Count: What are the site’s true traffic stats? It’s not at all inappropriate to inquire about page views, unique users, rate of growth over time, and be sure to see some server logs to back those numbers up.
  7. Visitor Loyalty: Are the site’s users going to bail when the site owner sells? If you’re looking at a blog or a forum, be very aware that the person hosting the site is likely the key reason the users are there, and this also extends to other key members of the site whom people are following. It is very likely that when the site owner and other key members leave, many of their fans will follow them. If you want to attempt purchasing a site like this, make arrangements for the previous “personalities” to stick around for a reasonable amount of time after ownership changes, and you might want to consider a non-compete agreement.
  8. Keyword Strength: What keywords does the site rank for? Perhaps more importantly – what terms could it rank for with some development work?
  9. Technology: How up to date is the site? Is it based on HTML code from the 1990s, or is it up to date, using CSS design and modern practices? Make sure to factor in what it will cost to bring the site up to date with design, standards, and software.
  10. Resale Value: If you were to turn the site around and sell it, what could you potentially get for the domain?

Don’t overlook the value of a directory listing in DMOZ and Yahoo – there are domains out there that are grandfathered in at Yahoo (and other directories) that do not pay listing fees.

How to Analyze a Domain

Once you have a list of sites with potential, it’s prudent to spend time analyzing each one to make sure it hasn’t been banned or penalized.

  • Does the site rank for its domain name, text in the title, and different chunks of text found on its pages? Penalized sites will often have trouble ranking for easy search phrases such as the site name.
  • Look at the history of the site in the Wayback Machine to get an idea of what kind of site it has been over time. You might discover that the site was previously something vastly different than it is now.

Negotiating the Sale

If you’re buying a site through a site-buying forum, then the process is very clearly defined. However, things can be a little murkier if you’ve found a site in the search engines and are contacting the owner directly via their contact form or their whois data.

Here are a few useful guidelines to follow when negotiating the sale of a site:

  • Don’t make your initial offer more than 75% of the max you’re willing to go. Leave yourself some room to negotiate.
  • Don’t come off as overly sophisticated about the process. If the owner gets the impression you’re an experienced buyer of Web sites, they will likely hold out for a higher price.
  • Always use a reliable escrow company (such as Escrow.com) to handle the transfer of funds. This protects both you and the seller from fraud. For very expensive purchases, consider having your attorney draw up a purchase contract.

How to Safely Transfer a Site to Your Ownership Without Tipping Google Off

Search engines, specifically Google, have publicly stated their desire to prevent marketers from buying domains and taking over their old links for the purpose of gaining PageRank. It’s quite likely that they monitor domain registration information and keep historic records of this data. As an ICANN-accredited domain name registrar itself, Google has access to databases of public whois information and can easily monitor changes within this data.

Google also watches for changes in the domain registration information. If they detect the site has new ownership, they may reset the site’s inbound link credit and PageRank. Google’s goal is to prevent people from capitalizing on a site’s existing incoming links and trust by simply purchasing that site. They’re much less likely to do this if the site continues on the same topic and provides the same information and resources that it was originally given credit (links) for.

If a domain expires, it is almost certain that search engines will wipe clean the PageRank and incoming link credit. Typically they will treat the domain name as a brand new one when it is reactivated. If the site is reactivated and no other changes are detected (content, hosting, ownership) then it’s possible that the site may retain its links.

Once you’ve found a domain to buy and have settled on terms with the seller, here are the steps you’ll need to take in order to transfer ownership to you:

  • First, you NEED access to the email account listed in the whois registration data for the admin contact. Once you have control of that email account, you can approve moving the site to other registrars, switch it to new DNS servers at a new hosting company, and modify the registrant contact information. Without control of the administrative contact email address, you do NOT have working control of the Web site.
    If it’s not possible to get that email address transferred to your control (perhaps because it’s the previous owner’s personal email), then have the old owner change the admin email address to an account you do control. Ideally they’ll create an email address for you under that domain (such as admin@domain.com) and give that to you. Remember, you do NOT own the domain until you are listed as the email contact and have control of that email account. Plus, if you are planning on buying and selling sites in volume, you may want to use a different email address for this activity than the one you use for your other sites.
  • Next, we do know that Google will not reset a site’s PageRank just for changing ISPs, so the first step you may want to take after gaining access to the admin email account is to move the site to your own server. If you establish an email account such as admin@domain.com, as mentioned above, all you have to do is update the DNS server information to point to the new hosting account you wish to use.
  • Finally, change the registrant contact information. Here’s the point where things get a bit tricky. Ideally, you want to delay changing this information, as it can tip off the search engines that the site has changed hands. If you decide to move the site to a new Web host, try to avoid changing the registrant contact information for several months. Move slowly through all of these changes to avoid raising any yellow flags with the search engines.

How to Obtain Valuable Links from .edu and .gov Domains

While it’s tricky to conclusively prove, there is ample evidence that Google places high value on links coming from .gov and .edu top level domains (TLDs). Because these TLDs are only available to government sites and accredited educational institutions, links that originate from them are somewhat exclusive and difficult for the average person to obtain.

Therefore, these links are much harder to manipulate and that’s why they are trusted and valued more than links coming from non .gov and .edu domains.

In fact, unless you’re creative, it’s almost impossible to get links from these domains. However, thinking creatively opens up all kinds of possibilities. The .edu domains, especially, offer several opportunities.

First, do you have services you can donate to a school in exchange for a link back? Web design and SEO firms are especially suited in this regard, but many businesses can find some way to contribute either products or expertise to a school or their Web site.

Second, most schools give their students a small amount of web space in order to host that student’s personal Web page. There are literally millions of these personal student Web pages available, and getting a link from any one of them can help your site in the search results. In fact, if you attended college, you may find that your school will make one of these pages available to you as an alumnus.

You could also consider offering a special student discount on your site. Students can receive this discount if they link back to your site from their personal Web page hosted on an .edu domain.

Beyond that, your potential for acquiring links from college students is limited only by your imagination and promotional creativity. (Pizza for links, anyone?)

Providing Tools & Resources to Attract Links

It is interesting to note that the homepage of blogging resource provider WordPress.org is a PageRank=9! That’s largely because of the huge number of sites that use the WordPress blogging software. You see, the software cleverly embeds a ‘powered by WordPress’ link in its web interface. Multiply that out over millions of blogs and you get a recipe for a very powerful incoming link structure that builds steadily and looks perfectly ‘natural’ to the search engines.

Your business may not be able to duplicate the extraordinary success of WordPress, but their example illustrates a powerful way to accumulate incoming links: build popular web-based software and distribute it with an embedded backlink in the user interface. The alternative is to build a tool that’s a popular resource and host it on your own site in order to attract links.

Using your Affiliate Program as Link Magnet

Starting an affiliate program can also be an excellent way to build incoming links. As you may know, many affiliate programs use special modified URLs to track which affiliate is responsible for which sales. Amazon, for example, has an enormous number of affiliates linking to them with URLs that carry an affiliate tracking ID.

Obviously, each of those affiliate URLs is an incoming link to Amazon. In this case, however, the link is going to a subpage that Amazon probably doesn’t care about ranking highly. To get better mileage from such an incoming link, you should consider redirecting the link to pass PageRank and link equity to your site’s homepage.
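
One hypothetical way to do this, assuming an Apache server with mod_rewrite and an affiliate program whose links carry an `aff` query parameter (your program’s parameter name will differ):

```apache
# If the incoming request carries an affiliate tracking parameter
# (e.g. ?aff=12345), 301-redirect the visitor to the homepage so the
# link equity from those affiliate URLs is consolidated there.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)aff= [NC]
RewriteRule ^ http://www.example.com/? [R=301,L]
```

In practice you’d capture the affiliate ID for commission tracking before redirecting; this fragment only illustrates the redirect itself.
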
When you are ready to implement this advanced SEO tactic, you can learn how it’s done by studying these two very detailed Advanced SEO Tutorials:

Social Networking to Build Links

There’s an often-overlooked social networking aspect to link building. If you’re well-known and have connections within your industry, or one that is closely related, then you should use those connections to get links to your site. Hypothetically, if you had a site that sold bodybuilding supplements and your uncle was Arnold Schwarzenegger, you might consider asking him to link to your site from his own.

Of course, this tongue-in-cheek example just illustrates the fact that who you are and who you know can also play a significant role in link building. Remember to seize this opportunity whenever you’re attending industry conferences or any type of networking function that might lend itself to your link-building efforts.

Discover Who’s Linking To Your Competitors

The smart marketer knows that competitors have already blazed the trail. In many cases, finding pages that rank well for your keywords, and emulating your competitors’ link-building strategies, is the place to start building your own links. In other words, your competitors’ links could be your links too. And, sometimes, they could be your links instead!

By integrating the strategies outlined above, and offering content to the sites that are linking to your competitors, you’ll soon find your pages are also in the mainstream. In due time, persistence tends to pay off royally. Sooner or later, you’re likely to outrank the very competitors from whom you ‘borrowed’ your link-partner ideas. In the next lesson, on Competitive Analysis, we’ll talk in more detail about how to implement this strategy.

Linkbaiting via Social Media: the fastest & safest way to acquire links in bunches

We’ve saved the best linking strategy for last. Linkbaiting is the ONE strategy we have not yet addressed, and it is arguably the most effective at getting lots of links quickly. If there is a problem with linkbaiting, it’s this: the links you get are rarely on-topic.
Nevertheless, Google likes these kinds of links—even when they are accumulated quickly! In a nutshell, this tactic involves creating content that is so compelling that average people feel they simply must tell others about it by linking to it via blogs, forums,
email, and so forth. You place content on your site that literally “baits” people into linking to it.

Think YouTube. Use humor, outrage, demonstration, and even news.

There are seemingly endless examples of YouTube linkbait strategies that have been immensely successful in acquiring links by the thousands!

  1. Provide tools and resources that are so valuable they excite people to tell others about them.
  2. Publish how-to lists, top-ten lists, and the like that are so useful people feel compelled to share them.
  3. Write about a controversial topic in a way that is either so shocking or so supportive that people are outraged or impressed enough that they simply can’t help themselves; they have to share it by placing a link to your content.

The term most often applied to this style of incoming link generation is ‘viral marketing’, which we define as:
The online version of word-of-mouth. Viral marketing relies on self-propagating or self-replicating methods to circulate within multiple networks, aiming to reach a large audience in a short amount of time. It is usually spread using email, blogs, and
other social networking or marketing channels.

To fully grasp how this versatile and powerful method for acquiring legitimate links quickly works, you must study all four of these critically important Advanced SEO

Tutorials that are located inside the membership area of SearchEngineNews.com:
LinkBaiting: The #1 *Secret* Art Used By Top SEO’s to Gather Links In
How to Promote Your Link Bait – 10 strategies that work!
How to Create Content, Build Links and Increase Search Rankings by Marketing
with the Digg Effect
Pinterest is HUGE! Here’s How to Harness its SEO Power.

Finding the Right Link Balance

You’ll want to avoid letting your incoming link structure get too homogeneous. Incoming links that come from only one type of site, or that all point to your homepage, or that all carry identical anchor text are telltale clues that could cause a search engine to flag your site for an unnatural link structure. Ranking penalties could then follow.

You should strive to achieve a general 80/20 link balancing act. That means:

  • 80% of your links come from topically relevant sources; 20% come from unrelated or marginally related sources.
  • 80% or less of your incoming links go to your homepage, with at least 20% (more is better) going to subpages deeper within your site.
  • 80% of your links have your keywords in the anchor text; 20% use something less optimized, like “click here” or your domain name, as the anchor text.
  • 80% of your links are one-way links; 20% are reciprocal.

Of course, these numbers are only general guidelines. The point is that you don’t want your
site to appear overly optimized, so it’s important to balance your link ratios.
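For the more technically inclined reader, here’s a quick way to sanity-check those ratios. This sketch uses a small, made-up list of links; in practice you’d export your backlink data from whatever link-analysis tool you use.

```python
# Audit a (hypothetical) link profile against the 80/20 guidelines above.

def link_ratios(links):
    """Return the share of links in each category the guidelines mention."""
    total = len(links)
    return {
        "on_topic": sum(l["on_topic"] for l in links) / total,
        "to_homepage": sum(l["to_homepage"] for l in links) / total,
        "keyword_anchor": sum(l["keyword_anchor"] for l in links) / total,
        "one_way": sum(l["one_way"] for l in links) / total,
    }

# Made-up sample data -- each entry describes one incoming link.
links = [
    {"on_topic": True, "to_homepage": True, "keyword_anchor": True, "one_way": True},
    {"on_topic": True, "to_homepage": False, "keyword_anchor": False, "one_way": True},
    {"on_topic": False, "to_homepage": True, "keyword_anchor": True, "one_way": False},
    {"on_topic": True, "to_homepage": True, "keyword_anchor": True, "one_way": True},
]

for name, share in link_ratios(links).items():
    print(f"{name}: {share:.0%} of links")
```

If any share drifts far past the 80% mark, that category of link is a candidate for diversification.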

Is Having a Great Site Enough?

Some “experts” claim that having the best site in your category will be enough to attract all the links you’ll ever need. While keeping a straight face, they righteously profess that time spent on building links is better spent on improving your Web site. At best, this is only partially true. The other side of the discussion states that if people don’t know about your site, it doesn’t matter how good your content is.

And our years of experience have taught us, that’s a fact!

Clearly, there are many, many great sites out there that provide top-notch content but get very little traffic because of poor search engine positioning. People would likely link to them IF they knew about them. But, without links, neither the search engines nor people will find them in the first place. And, thus, the links never happen. To paraphrase an old saying: it takes links to get links!

As your site improves its search rankings, you’ll sooner or later reach a tipping point where you’ll acquire links without even asking; simply because your site is more visible. Sites on the first page of the search results for any competitive keyword can often acquire large numbers of links without even trying. But, if you’re link-poor, it can be an extraordinary challenge to get a leg up without actively seeking links to get the link building process started.

Here’s Your Link Building Roadmap!

When you are ready to embark on your link building campaign, here is one of the most important Advanced SEO Tutorials for you to study and then implement. It literally maps out your Link Building Strategy!

Your 6-Step Link Building Task Planner
And, while this may be the shortest chapter in the book, the Advanced SEO Tutorial listed above may be the one that produces your best results. Be sure to study it and then implement the strategies step by step.

Final Review

Congratulations! You have just finished the largest and most important lesson of this course. In this lesson:
1. You have been warned to avoid OLD link building and SEO tactics in general because old information that teaches outdated SEO strategies and tactics can truly mess up your chances of ranking at the top of the search results; possibly forever! Beware.
2. You have been taught the Link Basics. The definitions of Anchor Text, URLs, Inbound, Outbound, and Reciprocal Links.
3. You’ve learned that Link Popularity is an evolving concept and that today’s search engines place more ranking-value on some incoming links than on others, basing that value on Web page importance.
4. You’ve become familiar with Google’s imperfect, but widely used, Toolbar PageRank scoring system of rating the importance of Web pages.
5. You’ve learned the difference between natural and artificial link structure and why it’s important to keep your incoming links looking natural.
6. You’ve learned the importance of getting links from on-topic pages and choosing your links wisely.
7. Now you know the importance of link equity and getting your links on high traffic pages that do not already have a lot of outbound links.
8. You’ve been warned to avoid ‘run-of-site’ links.
9. You’ve been taught the importance of maintaining consistency in the format of your incoming link URLs.
10. You’ve learned the importance of getting your keywords into the anchor text of your inbound links.
11. You now know the importance of getting deep links.
12. You’ve been introduced to the nofollow link tag and have been warned to look for it when evaluating the quality of a link.
13. You have been warned to avoid linking to link farms, web rings, & site networks; and generally, to be very careful who you link to.
14. You have been directed to the best place to start getting links; Directories like Yahoo, DMOZ, etc.
15. You’ve learned the importance of getting ‘on-topic’ links.
16. You have been given a host of alternative, frequently overlooked yet effective potential incoming link sources.
17. You now know the problem with having too many reciprocal links and how to be smart about building these types of links.
18. You’ve learned how to evaluate the quality of a link; and why it’s important that you do so.
19. You have been taught the rules you must apply whenever you buy links.
20. You know a good writer is essential to building your company’s online presence.
21. You’ve learned how to write your way to higher listings with article marketing.
22. You now know about building links with Article Syndication.
23. You’ve learned how forums can be used to reach customers and build links.
24. You’ve been shown how a blog can be used for better search rankings.
25. You know that Press Releases can be used to build quality links.
26. You know you may be able to gain links quickly by buying abandoned Web sites.
27. You now know how to buy links without getting penalized by Google.
28. You’ve learned how Ezine and Newsletters Ads can be used to build incoming links.
29. You know the value of links coming from .edu and .gov domains and have great strategies for getting them.
30. You’ve learned that providing tools & resources are great ways to build links.
31. How your Affiliate Program can be used to attract links.
32. The often overlooked link building aspects of social networking.
33. The importance of knowing who is linking to your competitors and why that can help you get your own links from their list.
34. You’ve been taught the incredible strategy of LinkBaiting via Social Media and Viral Marketing; the fastest and safest way to acquire links in bunches.
35. You have learned the proper link balance and the dangers of having too many of any single type of link.
36. Why just having ‘good content’ isn’t the key to top rankings; the critical importance of adopting a proactive approach to link building.
37. And you have been directed to Your Link Building Roadmap—likely to be the most productive tutorial in the Advanced Section of this SEO course.


Lesson 3

All About Keywords

The secret to building a high-ranking Web site can be boiled down to three simple steps:

  • Targeted Keyword List
  • Search-Friendly Site
  • Get Links
  1. Targeted Keyword List: Assemble a smart list of relevant search words (aka, keywords) that your target audience is using to locate your products and services; and then strategically insert those keywords into the proper locations within your Web pages.
  2. Search-friendly Site: Build your site so that it is easy for search engines to locate and properly index.
  3. Get Links: Accumulate the right incoming links coming from the right places.

Regardless of what you may have heard, 95% of professional SEO (aka, search engine optimization) is really all about focusing on these three basic steps.

What are Keywords?

The singular term keyword is actually misleading. You’ll almost never be optimizing your Web pages for a single keyword because single keywords are typically too general. Single keywords are also highly competitive—in fact so competitive that it is unrealistic to expect that your Web pages can score at the top of the search results for a single keyword search. But, that’s ok because you don’t need to, nor do you especially want to. The search terms that convert best to sales are typically very specific key phrases comprised of two to five words. Although this is sometimes called a keyword phrase, it is most typically called a keyword.

For example, hotel is a keyword. But it would do you no good at all to score at the top of the search results for any single keyword like hotel. That’s because such generic keywords are far too general. When we search Google using the keyword, hotel, the search results give us a list of hotel directories featuring hotels located all over the world. This is what’s known as an untargeted search because the search results we get are not actually very useful.

On the other hand, for example, let’s say you own the Manago Hotel in Captain Cook Hawaii. Some of your target keywords would be: hotel Hawaii Captain Cook or hotel captain cook Hawaii—both of which reflect the location of the Manago Hotel situated in the little upcountry town of Captain Cook, on the Big Island of Hawaii, in the state of Hawaii.

Another keyword possibility could be, affordable accommodations captain cook. Yet another keyword possibility could be, big island affordable accommodations. Notice that, in each of these cases, our keyword is actually a keyphrase.

This is almost always the case.

So, get used to thinking of each of your unique keyphrases as a keyword. Using our example above, our “keywords” are actually four different keyphrases:

hotel Hawaii Captain Cook
hotel captain cook Hawaii
affordable accommodations captain cook
big island affordable accommodations

Of course, there are many more keyword (i.e., keyphrase) possibilities which we could potentially target, but you get the idea.

The Importance of Keywords

Keywords are the cornerstone of search engine optimization (SEO) and search engine marketing (SEM).
Every aspect of crafting a Web site-for-profit revolves delicately around carefully chosen and strategically placed keywords.

Behind the scenes of every top-ranking sales page is a company’s systematic campaign to win dominance in an escalating battle over specifically targeted keywords. The stakes can be high. Billions of dollars have already been earned, and billions more are waiting to be tapped. Clearly, keywords are big business. There’s much to be gained by getting them right. Be systematic and select carefully! While the effort required can be great, the rewards for mastering the skill of keyword selection are substantial.

The key to keyword success boils down to two simple items:

  1. How to find them, AND
  2. What to do with them once you have them!

We’ll get to more on that shortly, but first, some background on sales and money keywords.

Finding the Money Keywords That Trigger Sales

As you now know, keywords are the search words and phrases people use in their online searches. Be aware though, there are three different types of searchers.

  1. Academic Information Searchers
  2. Product or Service Research Searchers
  3. Buyers that already know exactly what they want and are searching to make a purchase right now!

To clarify, sometimes the searcher’s motives are purely academic, even scientific. For example, they may be looking for information on a medical condition or a geographic location for a school report, or perhaps even a political or science answer—something that does not involve a commercial transaction of any sort at any time, now, or in the future. We call this an academic search.

Another type of searcher is a person who is interested in making some sort of purchase at some time in the future; doing research in preparation for making a purchase decision at a later time—maybe 5 minutes, 5 hours, 5 days, or even 5 months later. We call this kind of search activity, research prior to making a purchase.

The searcher you really want is a buyer—a consumer who has completed their research and is ready to get involved now with your product or service. In other words, they are ready to pick up the phone and call to make an appointment or, perhaps, place an order for your product(s). They are ready to move forward with a decision that is likely to involve a commitment or purchase of some sort. They have done their research, however brief or long, and they know what they want to purchase.

Of course, a Web site is well advised to build Web pages and use keywords that are geared for all three instances. But, if your site is to successfully close the sale, you must realize that it’s only in the latter case—when a searcher becomes a buyer—that there’s any statistically significant chance for an online company to make a sale in an unbroken buying process that looks like..

So, while all three searcher-types involve keywords, only one type consistently converts to sales. That’s why today’s professional SEMs (search engine marketers) spend the extra effort necessary to identify the keywords that customers are using when they are ready to BUY their product or service.

Professional SEOs know the difference. They know to focus their efforts on determining exactly which keywords people use to buy and which ones they use to research.

Get relevant, get tactical and get granular to get sales.

So, your job is to reverse engineer the keyword buying process for your market, making sure your pages score well in the keyword searches your customers are using to buy. Once you’ve covered that base, then you can build your informational funnel pages to help snag even more buyers.

How to Find All the Right Keywords

Start by making a list of every possible search term that people might use when searching for whatever you’re selling. There’s a good chance you’ll easily come up with a list of twenty or so before you start to run out of ideas.

That’s the point at which you should resort to the following tips and tools that’ll help you continue the brainstorming process of building your raw keyword list.

Be specific. When selecting your keywords, you want to avoid stand-alone words that are too general, such as travel. There are a couple of reasons for this. First, you will face very stiff competition. Big-money sites like Expedia and Orbitz have already spent enormous sums of time and money to secure top positions for such general keywords. Knocking those sites out of the top results can be extremely difficult, if not impossible. Besides, it would also be unproductive from a sales-conversion perspective because people who buy things don’t typically search using only these general keywords.

General keywords like “travel” are so broad they could apply to all kinds of products and services—travel guides, travel insurance, travel accessories, and travel tours are just a few of the possible key phrases associated with the keyword travel. Unless you happen to sell every product and service related to travel, you shouldn’t waste your time and resources bringing traffic to your site that isn’t likely to buy what you are selling.

For example, let’s say you sell travel packages to Europe. Obviously, you want to attract European travel package buyers. But, rather than targeting the keyword travel, a much better keyword would be travel Europe or European travel packages. By targeting these much more specific keywords, you’ll bring a far more targeted prospect to your site—one that will be much more likely to find what they’re looking for and to actually buy from you.

However, when looking for keywords that are specific to your business niche, bear in mind that sometimes keywords can be too specific.

A rule of thumb is that you shouldn’t optimize your Web pages for keywords that none of your potential customers are using.

Instead, you should focus on keywords that are in the mainstream. Fortunately, there are free keyword tools available (that we’ll tell you about in a minute) to help you determine how many people are searching for any given keyword each month.

Put Yourself in Your Customer’s Shoes. Ask yourself…

What problem does my product or service solve for my typical customer?

Sometimes the difference between a company that succeeds and one that fails is simply a matter of talking to its customers and asking the right questions. A while ago we published a critically important article in this regard—The Missing Link to Writing Effective Ad Copy—it’s included in the Advanced SEO Tutorial portion of this course and you should study it.

Learning how to interview your customers can be the X-factor, the magic bullet, the missing link between failing miserably and succeeding spectacularly. These days, people who shop online are abundant and growing in number. It isn’t hard to gather an informal group and watch as they attempt to locate a product or service within your company’s sales niche. If you’re selling a consumer product or providing a professional service, then friends and family could help in this regard. Sit down with them at a computer, ask them to find your products or services, and see what searches they perform. You may discover a keyword or group of keywords that you and your competitors have overlooked.

Remember to keep the customer’s perspective in mind. Don’t make the mistake of assuming you know what customers call your products. Do the necessary research to find out what keywords customers are actually using to locate your products or services.

Learn to speak like your customers. Real people don’t generally use insider terms of the trade (aka, jargon) when searching. So, unless you’re selling to insiders within your own industry, you should avoid using industry trade terms. Think about words and phrases that real customers, not industry insiders, would use in a search.

On the other hand, if you are selling to industry insiders, then by all means, jargon away! Reading trade magazines is a good way to become familiar with industry catch phrases. You can also scour the indexes and glossaries of books about the business you’re in. Be sure to also browse the Internet forums that are dedicated to whatever specific industry, product, or service you’re targeting.

Glean Keywords from your Web site’s Referral Logs
Probably the most overlooked source of keywords is your Web site’s referral logs. This can be an indispensable source of feedback regarding what keywords your site visitors are using to find you.

Referrals coming from search engines will include the keyword query that a searcher used to find your site. People often search using some very creative search queries—terms that you and your competitors might never think to optimize for. Once again, this can give you a leg up on the competition, even in competitive fields, by enabling you to capitalize on overlooked highly targeted keywords.

Check out Your Competition
Once you’ve acquired a small list (shoot for about 30 keywords), start entering those keywords into searches on Google and Yahoo. Scrutinize the Web pages that are coming up in the search results—these are your competitors. Scouring their pages can help you uncover the keywords your competitors are actually targeting, some of which you may have overlooked.

You can also view the source code of your competitors’ Web pages to determine what keywords they’re optimizing for.

  • If you are using Internet Explorer (IE), then, in your browser’s menu, click View, then Source.
  • If you are a Firefox user, use Ctrl+U to view the source code.

Once you see the source code, inspect the title tag which looks something like this:


<title>Baby Strollers – The best strollers and infant supplies for your baby</title>

Notice the keywords sandwiched between the opening and closing title tags. This title tag is where Web pages generally place their money-making keywords.

A word of caution is in order here: There are court cases where the use of a competitor’s company name, product names, or trademarks, when used as keywords, is being interpreted as trademark infringement. Bear that in mind when scanning your competitor’s pages to brainstorm new keywords.
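If you find yourself checking many competitor pages, this step is easy to script. The snippet below pulls the title out of a page’s HTML with a simple regular expression; the HTML string here is just an illustrative stand-in for a page you’d fetch yourself.

```python
# Extract the <title> text from a page's HTML source.
import re

def extract_title(html):
    """Return the text between <title> and </title>, or None if absent."""
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

# Hypothetical page source, echoing the example above.
page = ("<html><head><title>Baby Strollers - The best strollers "
        "and infant supplies for your baby</title></head><body>...</body></html>")

print(extract_title(page))
```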

Cover All Your Keyword Variations

Look for variations on keywords you think might be successful. This includes plurals, synonyms, merged words, and keywords separated by hyphens.

Misspellings…Sometimes targeting common misspellings of your keywords can be an easy source of traffic. For example, one estimate says that 20% of Britney Spears related searches are misspelled (why are we not surprised?). In some cases, you may even find the misspelled or non-grammatical version of a keyword gets more searches than the keyword itself. For instance, let’s say that you’re optimizing your page for the keyword children’s clothing. Your keyword research shows there are actually more searches for the non-grammatically correct version childrens clothing when compared to the proper children’s clothing. Here is an instance where you should consider optimizing for both versions of the search term.

Never mind which one is actually correct. Your customers are always right. Whatever search term they are using to seek your product is functionally correct.

Of course, one must also take into account that Google and other engines have factored-in the reality that many people are lazy spellers. That’s why they offer their Showing results for: feature, as in…


…where they provide the option of clicking a link that leads to a corrected version of the search term’s results. Our research shows that most people actually click this corrected link, since it is so conveniently found above the rest of the search results. Even so, there is traffic to be had from common misspellings of search terms. Whenever your offerings lend themselves to it, consider optimizing companion pages that glean traffic from bad spelling and other typical grammar mistakes, provided you know the terms involved are keywords that buyers use.

Keyword Variations
Plurals and Synonyms – Many search engines utilize a process called word-stemming to identify plural versions of a keyword. In theory, this means that a search engine should recognize charity and charities as being the same keyword. In practice, however, the search results for singular and plural versions of a keyword are rarely ever the same. This means that you should optimize for both versions by working them into the visible text on your Web pages.

The same can be said for common synonyms and descriptive terms. For example, a site selling auto parts would ideally optimize for variations on the keyword auto parts, such as car parts and automobile parts. In addition, they should also optimize for the various qualifiers (like best price, high quality, lowest priced) that buyers tend to use when searching.

Here’s an example of text that works all of the related synonyms with typically descriptive terms into a single paragraph focused on selling car parts…

Looking for the best price on car parts and accessories?

You’ve come to the right place.

We’re your vehicle’s one-stop source for the lowest priced auto parts and accessories.

If we don’t have the high quality automobile parts you’re looking for, no one does!

Merged and Hyphenated Words – Be aware that some keywords may be commonly merged or hyphenated. An example of a merged keyword would be webhost versus web host. In some cases, both the merged and unmerged versions will garner about an equal number of searches. In other cases, one will far outpace the other.

Hyphenated keywords, such as e-commerce versus ecommerce, should also be taken into account. Again, keyword tools are available to help you determine which variation is the more popular. Remember, search engines will treat them as different keywords. So, if your research suggests you should target both hyphenated and un-hyphenated keywords, be sure to work them both into your webpages and your Web pages 😉
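Generating the spaced, merged, and hyphenated forms of a term is mechanical, so it’s worth automating. A minimal sketch:

```python
# Expand a multi-word term into its spaced, merged, and hyphenated variants,
# each of which should be checked separately in your keyword tool.

def spacing_variants(term):
    words = term.split()
    return {term, "".join(words), "-".join(words)}

print(sorted(spacing_variants("web host")))
```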

Keep in mind that Google and Bing are both quite good at understanding that hyphenated and non-hyphenated words often mean the exact same thing, and the search results are often very similar regardless of the variation. Still, you may see a difference in search volume, and therefore an opportunity may exist for the keywords you’re working on should you address both variations.

Be Descriptive

Once you’ve covered all the variations of what you expect to be your most important keywords, begin adding descriptive terms to augment your existing terms.

For instance, cheap, low cost, affordable, or inexpensive can go with most consumer products, as can superlatives like best or cheapest.

Sometimes, using reverse descriptive words (words that describe the opposite of what your product does) can work to your advantage. For example, if you’re selling fast Internet connections, then slow Internet connection is at least as good a keyword as fast Internet connection, since a person typing the query slow Internet connection has a problem they’re actively seeking a solution for.

Use Action Words
Try to recreate in your mind’s eye how your typical customer conducts their various searches. It’s likely that many will use action words in their searches. Words such as buy, find, or purchase are examples of actions words that are widely used by buyers. Depending on your market, it may be well worth appending these types of words to your primary keywords as such:

  • Buy Droid Razr
  • Find Droid Razr
  • Purchase Droid Razr
  • Best Price Droid Razr
  • Free Shipping Droid Razr
  • Low Price Guarantee Droid Razr
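Because these combinations are formulaic, you can generate them in bulk rather than typing each one. A small illustrative helper (the action-word list below simply reuses the examples above):

```python
# Append common buyer action words to a primary keyword.
ACTION_WORDS = ["buy", "find", "purchase", "best price", "free shipping"]

def action_variants(keyword):
    return [f"{action} {keyword}" for action in ACTION_WORDS]

for phrase in action_variants("Droid Razr"):
    print(phrase)
```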

Many searchers will also phrase their queries in the form of a question. For instance, the query, where can I buy a cell phone, actually receives a fair amount of traffic. As you grow your keyword list, consider using questions for which your site provides an answer in the form of a solution to their problem.

Target Local Markets
If your product or service is geographically relevant, then be SURE to mention the location in the text at every opportunity.

For instance, if your motel is in the little town of, say, Port Angeles, WA, then a normal sentence might begin as: The Uptown Motel boasts an unlimited panoramic view….
A better, keyword laden sentence would be:

“The Uptown Motel in Port Angeles boasts an unlimited panoramic view…” Do this even when the reader already knows it’s in the town of Port Angeles.

When you’re selling to a local market, it helps to be familiar with local idioms and unofficial place names. For example, Philly vs. Philadelphia, Big Apple vs. New York, or Big Island vs. Hawaii.

But don’t leave out official place names. If you sell mobile homes in San Diego, make sure you optimize for california mobile home and san diego mobile home, in addition to just mobile home. You’ll also probably want to pull in traffic from surrounding cities and counties, so you could add mission beach mobile homes, la jolla mobile homes, etc…

Break out a map and add those relevant place names to your keyword list.

Use Keyword Tools to Complete Your Selections
Once you’ve assembled your basic list, you’ll need to determine relative keyword popularity. You must know which keywords are the most popular as compared to other related keywords.

For example, if you sell coffee, you need to know if French Roast is more popular than Dark Roast, if decaffeinated is more popular than caffeine free, and so forth. Keyword tools are there to help you determine these differences.

Long Tail keywords are those 3- and 4-word phrases that are very, very specific to whatever you are selling. You see, whenever a customer uses a highly specific search phrase, they tend to be looking for exactly what they are actually going to buy. Very specific searches are far more likely to convert to sales than general, generic searches, which tend to be geared more toward the type and depth of research that consumers typically do prior to making a buying decision. That buying process typically looks like this:

  1. Consumer becomes aware of a product.
  2. Consumer seeks information about that product in preparation for possible purchase.
  3. Consumer evaluates alternatives to product (features, pricing, etc…).
  4. Consumer makes their purchase decision.
  5. Consumer pulls out their credit card and completes the transaction.
  6. Consumer then evaluates the product after buying it and decides if they want to keep or return it.

Using the above six step process as our model, you can probably already see that you want to target the consumer who is somewhere around step 4.

Where do I get Wicked Cool Keyword Research Done?

These tools will provide you with ALL of the information you’ll need to sharpen and hone your keyword selection process.

Google AdWords Keyword Tool

The Google AdWords Keyword Tool is one of the easiest and cheapest ways to quickly assemble several hundred highly relevant keywords. Although originally designed to be used by AdWords PPC advertisers, this great tool is also available to anyone doing keyword research for organic search.

There are tons of features this tool offers, but we find the following four keyword research options most useful…

  • Keyword search volume (monthly number of searches by region).
  • Search volume trends over time.
  • Ad position and cost estimates.
  • Advertiser competition.

How to Estimate a Keyword’s Potential Traffic.

Of course, not everyone who searches for your keyword will click your listing. At best, even if your site occupies the top position, you can only expect about 40-50% of those searches to yield a click. So, if a keyword is receiving 4400 monthly searches (and assuming you think you can rank at the top), you’d divide the monthly searches by 30 (4400/30 ≈ 147), then multiply by 50% (147 × 0.5 ≈ 73). At best you can (very theoretically) hope for a traffic count of maybe 73 visitors a day generated by that keyword.
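Here’s that back-of-the-envelope arithmetic as a tiny helper, with the 50% click share kept as an adjustable, admittedly optimistic assumption:

```python
# Rough best-case daily traffic estimate for a top-ranked keyword.

def estimated_daily_visitors(monthly_searches, click_share=0.5, days=30):
    """Monthly searches spread over the month, times the assumed click share."""
    return int(monthly_searches / days * click_share)

print(estimated_daily_visitors(4400))  # roughly 73 visitors a day
```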

Google Trends

The purpose of Google Trends is to show how demand for different keywords fluctuates over time. It also differentiates what keywords are most popular by countries, subregions and cities. This enables you to more accurately target your advertising when using a PPC platform like Google AdWords or Facebook Social Ads.

Google Trends can provide data for a single keyword or enable you to compare multiple keywords. Entering a single keyword displays a chart of that term’s search interest over time.

Google Insights for Search

Google Insights for Search is basically Google Trends on steroids. It builds upon, and greatly expands, many of the demographic research features found on Google Trends. So why use both?

Well, Trends still offers useful information that Insights for Search doesn’t, like the Also visited feature. And they each tend to find keyword data that the other misses. So if you want to get the full range of (ahem) insights into your keywords, it’s useful to run your research on both tools.

Yahoo Clues

Yahoo Clues was launched in November 2010 as an alternative to Google’s Insights for Search and has had major updates as recently as June 29th, 2011. Yahoo Clues provides keyword trending data with a rich feature set for user demographics. It’s also a very good resource for finding related search terms.

YouTube Keyword Tool

YouTube has a great keyword research tool that people often overlook. Of course the data it provides will be more oriented towards video content, but it’s still a great resource to peer into the mind of your customer and understand better how they search for information. With the huge amount of traffic YouTube receives, it’s a very rich source of data for search engine marketers!

The tool offers many different search features, including:

  • Information based on a specific language or country.
  • Ability to search by Video ID to find keywords related to a specific video.
  • Search by Demographic which allows you to drill down by Gender, Age Range, Location and even Special Interests.

A search for the keyword welding generates a 100-line report listing the top related searches done on YouTube.

Targeting Campaign!
These keyword tool tutorials will give you all of the information you’ll need to precisely target your Web pages to the right audiences using the right keywords. Consider them a CORE part of the Advanced Section of this SEO course.

Keyword Placement: The Location of Your Keywords Counts

There are numerous places on your Web page where you might place your keywords—and some page locations are much more effective than others. We’ll show you how keyword placement can make a big difference in terms of ranking well within the search results.

Title Tags

The most critically important location to place keywords is within your Web page’s HTML title tag. Search engines consider the keywords found in the title tag to be extremely important. These are the keywords that literally tell the search engines what your Web page is about.

Therefore, you should always place your most important keywords within the source code of your Web page’s <title> tag. You should also avoid wasting valuable space with words like your company name, unless your business is so well known that people use your company name as their primary keyword while searching for what you sell (like eBay, for example).
Another mistake that we commonly see in title tags is something like Welcome to our Home Page. This is pointless since nobody will be using that phrase to search for your site.

It’s hard to overemphasize the importance of keywords within the <title> tag located within your Web page’s source code. Here are two important points to remember:

  1. Your Web page title tag is the most important aspect of Web page design in regards to ranking well on all search engines. The title tag tells the search engines what your page is about.
  2. Your title tag is what Google and most other search engines use as your Web page’s link within the search results. It confirms to your potential site visitor that your page has what they searched for.

Let’s say, for example, that you own a Bed & Breakfast called Kiluhana Inn, located in Hanalei Bay on the Hawaiian Island of Kauai. You should not use Kiluhana Inn as your title tag. If you do, your business will be handicapped in a search for anything related to Hawaii, or bed and breakfast, or Kauai, or Hanalei Bay, because none of those relevant keywords appear in your title. You’ll more than likely be buried in the rankings by your more knowledgeable competition.

A better title tag would be: <title>Bed & Breakfast Kauai – Hanalei Bay & Beach – Hawaii</title>

There are three reasons why this is a better title tag:

  1. Hawaii, Beach, Bed, Breakfast, Hanalei Bay, and Kauai are all keywords in your <title> that people are likely to enter when searching for this type of service.
  2. The keywords Hawaii, Beach, Hanalei, and Kauai are all terms that are entered when people are doing research related to your location. For instance, if someone does a keyword search for hanalei kauai your B&B has a good chance of showing up near the top of the search results.
  3. The name of your business, in this case Kiluhana Inn, is almost always very easy to rank well in the search results because business names tend to be somewhat unique which makes them less competitive as keywords.

Therefore, it is usually more than sufficient to place your business name within the normal body content (text) of your Web page. This alone will rank your Web pages at the top of the search results when searching for your business name. Stated another way, it is usually considered a waste of title tag space to place your business name within your Web page’s title tag unless your business name also happens to be the primary keyword that your customers are using to find your goods or services.

Take note that you should limit your title tag to 65 characters or less—usually about 7 to 10 words. Anything longer and you risk getting part of your title chopped off by some search
engines. In our example above, we might consider placing Kiluhana Inn at the end of our title tag only if it fits within the 65 character limit and there aren’t any better keywords to use in its place.
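A quick way to police that limit is simply to count characters before publishing. Here is a minimal check using the example title from the text (the 65-character limit is the rule of thumb stated above):

```python
# Check that a proposed <title> fits within the ~65-character limit.
LIMIT = 65
title = "Bed & Breakfast Kauai - Hanalei Bay & Beach - Hawaii"
print(len(title), len(title) <= LIMIT)            # 52 True  (fits)
with_name = title + " - Kiluhana Inn"
print(len(with_name), len(with_name) <= LIMIT)    # 67 False (would get truncated)
```

As the output shows, appending the business name pushes this particular title past the limit, which is exactly the trade-off discussed above.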


By the way, here’s a shortcut to help you find all of the other Web pages that are using your keywords in their titles. Go to Google and enter intitle:”put your title keywords here” into the search field. This will help you get a handle on how many other pages are competing for the same keywords.

While inserting your keywords in your title tag is very important, it’s also quite important to craft a title that makes people want to click the link. This means your title needs to appear high quality, be relevant to the search query, and ideally invoke a bit of curiosity. The goal here is to not only rank highly, but to actually get users to click your link. Search engines track your pages’ click-through rates (CTR), and how well your listing performs can also affect your search ranking. Understandably, search engines do not want to rank a page highly if nobody clicks on it; that tends to mean it’s a low-quality result.



Header Tags

After your title, your Web page’s header (aka, headline) tags are the next most important place for your keywords. Header tags are specified with the following HTML source code:

<h1>, <h2>, <h3>, <h4>, <h5>, and <h6> tags.
Generally speaking, an <h1> tag (because it is typically rendered in the largest font) is considered more important than an <h2> tag, which in turn is larger and considered more important than an <h3> tag, and so on.

Since your header tags will appear as headlines on your Web page, it’s important that they look natural and appeal to customers who visit your site.

Good examples of keyword-rich header tags would look something like:

<h1>Your San Diego Real Estate Resource</h1>
<h2>For buying San Diego real estate and selling real estate in San Diego, we’re your one-stop source.</h2>

Body Text

Next on the chain of importance comes your page’s <body> text. This is the source code tag that contains the visible text on your page. Think of this as your Web page’s general content that site visitors will be viewing. While it’s very important to place your keywords in page titles and headers, it’s also beneficial to feature your keywords throughout the rest of your page within the <body> content. Generally, Web pages should have about 200 to 300 words of text with special emphasis on two or three carefully chosen keywords. Within this keyword-rich <body> text, search engines respond favorably to keywords placed within boldface and italic fonts as well as bullet points. These style tags look like <b>, <strong>, <i>, <em>, and <li> within the source code of your Web page.

Here’s an example of some keyword-rich body copy for a site that sells San Diego Real Estate:

<p>The <b>San Diego Real Estate MLS</b> is your source of information and services for anyone buying or selling <b>real estate</b> in <b>San Diego</b>. We specialize in <b>San Diego real estate</b> and are committed to providing the expertise, professionalism and superior customer service today's market demands.</p>
<ul>
<li>Buying San Diego real estate?</li>
<li>Selling San Diego real estate?</li>
</ul>
<p><i>Put us to work for you!</i></p>

Once you get the hang of it, it’s actually very simple.

The above paragraph would display in your browser, on your Web page, as something like this:

The San Diego Real Estate MLS is your source of information and services for anyone buying or
selling real estate in San Diego. We specialize in San Diego real estate and are committed to
providing the expertise, professionalism and superior customer service today’s market demands.

  • Buying San Diego real estate?
  • Selling San Diego real estate?

Put us to work for you!

<p><i>Put us to work for you!</i></p>
In case that looks like Greek to you, here’s a translation:
The <p> tag begins a paragraph; the </p> tag ends it.
The <b> tag begins bold typeface; the </b> ends it.
The <ul> tag begins a bulleted list; the </ul> ends it.
The <li> tag begins a bullet point; the </li> ends it.
The <i> tag begins italic typeface; the </i> ends it.

As you can see, the tags are invisible when the text is displayed on the Web page. Cool, eh?

Link Anchor Text

When another site links to you, the text they use in their link is called the anchor text. This is an extremely important concept to grasp because Google and the other search engines look for keywords located within the anchor text when ranking Web pages in the search results.

Getting your keywords placed within the anchor text of links that point to your pages will be a strategy that we will be discussing frequently within this book. It is arguably the MOST important ranking factor of all!

Here is an example of a typical looking link: Homeschool Learning Style Quiz

This link shows Homeschool Learning Style Quiz as the anchor text. The actual HTML source code for the link itself looks like this: <a href="http://www.homeschoolviews.com/quiz/quiz.html">Homeschool Learning Style Quiz</a>

This link’s anchor text tells Google that the page located at: http://www.homeschoolviews.com/quiz/quiz.html is “about” Homeschool Learning Style Quiz.

And, if there happen to be a lot of Web pages on the Internet that link to this page using Homeschool Learning Style Quiz as the anchor text, then that page will rank well in the search results for any search query that uses homeschool learning style quiz.

In fact, this specific keyword strategy is one of the primary tactics for ranking at the top of the search results.

There are some caveats to this – because of heavy manipulation by marketers, Google regards high percentages of keyword anchor texts links to be spam.

Natural anchor text links tend to contain the domain name, company name, brand names, and specific URLs.

When a site has a large percentage of inbound links containing a specific keyword phrase, it can trigger a penalty. Yes, you want keyword anchor text links, but as a general rule of thumb they should not make up more than about 25% of your inbound links. For example, if your company name is Good Times Realty and your offices are in San Diego, and a high percentage of your inbound links say “San Diego Realtors”, that may very likely cause you problems with Google. If the majority instead say “Good Times Realty” or goodtimesrealty.com, that is a much more natural link profile.

The Higher Up on the Page, the Better

It’s very important that you place some of your best keyword-rich text as high up within the visible content of your Web page as possible; this area that you can see without scrolling is often referred to as “above the fold”. Search engines regard above-the-fold content as the most important on the page. They often rank pages that have a lot of ads above the fold lower than those that put their content in the primary visible area.

This means placing your keywords within your first headline (aka, header) tag (<h1>, <h2>, etc.) and in the first paragraph of your page.

Styling HTML Tags

Whenever the layout of the page allows, you should place a sentence or two of text containing the primary keywords near the top of the page in an <h1> tag. Important tip: at the risk of sounding complicated, you can use Cascading Style Sheets (CSS) to alter the standard appearance of any tag. In such cases, <h1> tags, which are normally very large, don’t actually have to be large. Bold <b> tags don’t necessarily have to make text look bold, and links can even be made to not look like links. It all depends on whatever style you’ve assigned to the tags within your Web page’s associated stylesheet.css file.

The logic for using an <h1> Headline tag is to lead Google and the other engines to believe that the keywords located within the tag are very important. However, you may find, and we agree, that the <h1> tag makes ugly headlines because they are far too big. That’s where CSS comes to the rescue by making the <h1> tag look like a reasonably-sized people-pleasing font – but without sacrificing the ranking advantage you would otherwise have if you had used the default big <h1> headline tag.
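As a minimal sketch of the idea, a stylesheet rule like the following tames the default <h1> size while leaving the tag itself (and its ranking signal) in place. The specific sizes and margins here are illustrative choices, not a required recipe:

```css
/* Restyle <h1> so it keeps its heading markup but displays
   at a friendlier size. All values are illustrative. */
h1 {
  font-size: 1.3em;      /* instead of the much larger browser default */
  font-weight: normal;
  margin: 0.5em 0;
}
```

Because the markup is unchanged, search engines still see an <h1> headline; only the visual presentation differs.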

Obviously, these are tricks of the trade that require a bit of understanding of HTML and CSS. If you are fluent in this so-called ‘markup’ language, that’s great. If not, then pass along this info to your technical Web people. Let them perform these worthwhile tricks. And, if you want to learn how to do it yourself, here are a couple of (separate from THIS course) tutorials that can bring you quickly up to speed.


CSS is a powerful design tool for formatting Web pages that are pleasing to the eye of your site visitors while maintaining your competitive edge in the search engine rankings game.

PLEASE NOTE: It isn’t so important that YOU actually know how to “do” CSS. It’s only important to know that you should consider working with, or hiring someone who knows CSS and that you show them this section so they can see how to use CSS for SEO when developing your Web site.

Use a Small Number of Keywords on Each Page

In most cases, each Web page should be focused on no more than two or three keywords and these keywords should be related to each other. There are a couple of reasons to limit the
number of keywords per Web page:

  1. Your most important keywords should be placed into your Web page’s title tag. Since a title tag should be limited to no more than 65 characters, this functionally limits the number of keywords that can realistically be placed within it.
  2. If you optimize a page for too many keywords, you’ll end up diluting the focus of that page in respect to those keywords. Each page on your site should be tightly focused to rank very highly for a specific set of terms. If you want to rank for a greater number of keywords, then you should increase the number of Web pages on your site.

This doesn’t mean that your page won’t rank for other related terms. Oftentimes keywords overlap. Ranking highly for one keyword can also help your page rank highly for a whole host of related keywords. For instance, if your page ranks highly for the keyword direct marketing, then it’s likely to also rank highly for professional direct marketing or direct marketing services, assuming those keyphrases appear somewhere in your <body> text, the viewable content of your Web page. Piggybacking related terms onto your primary keywords like this is a good way to boost your Web page rankings for a broader range of searches without diluting the focus of your pages.

The Image alt Attribute – Use it Wisely and Quickly Turn Images into Assets

Your company logo may show what you are, who you are, and even state a benefit—but the engines can’t index your image (not for keyword purposes, anyway). The search engine’s indexing bot is oblivious to everything but text. The only indexable keyword aspect of an image is the text content you place within its <img alt="put text here"> attribute. Regardless, you can turn all images into keyword assets by placing keyword text within their alt attributes. Here’s an example.

<img src="logo.jpg" alt="Beachfront Hawaii Vacation Rentals – Big Island"> Notice how we’ve placed the keywords Beachfront Hawaii Vacation Rentals – Big Island into the alt="…" portion of the image tag. Although the search engine cannot “see” or read the image, it most certainly can read the alt portion of the image tag. This enables us to tell Google or any other search engine what that image is about and get just a little bit of keyword-relevance help from an image that the engine could not otherwise read.

Use of the alt attribute is also very effective in helping your images rank well in the search engines’ image search, so it should not be ignored.

Bear in mind that you shouldn’t expect a big ranking boost from this tactic; in fact you may get none at all. Including image alt text is, however, an optimization technique that even Bing suggests you use for better ranking.

Three more reasons for using the alt tag are:

  1. Web browsers designed for the blind make use of the alt text information to help describe what the content of the image is.
  2. When you make an image a link, the alt text functions as anchor text and can therefore influence the ranking of the target page, similar to how text-based anchor text works. Typically text links are regarded as better for this purpose, but if you have to use an image for your link, make sure to include similarly keyword-optimized text in the alt text.
  3. The latest HTML specs require that images have an alt attribute; failure to include this information will cause validation errors.

In essence, using the alt attribute can sometimes help, and will never hurt your ranking and web design efforts. Therefore, you should use it whenever doing so holds any chance of making an image keyword-meaningful and thereby stacking the advantages in your favor.

Image File Names

One of the most important signals for ranking within image search is the image’s file name. Not only do image file names help describe the image, they do double duty as anchor text when someone hotlinks directly to the image. Ideally your image’s file name should describe the image to a certain extent, but don’t get carried away. We’re sure there’s some part of the algorithm that says: as the number of dashes or the overall length of the file name increases, so does the likelihood that it’s spam, just as with a Web page URL. So short and sweet is the best approach here.

Image ALT Attribute

ALT text is an HTML img tag attribute, also at times incorrectly referred to as the Alt Tag, that stands for Alternative Text. It was originally used to provide alternative text to the reader in place of the image, such as when someone is using a text only browser, or a visually impaired person using a screen reader.

Search engines found ALT text was a great way to figure out what an image was about, so early on all of them indexed the text within the alt attribute, similar to visible text on the page and the practice continues to this day. When image search came out, engines continued to use the alt attribute for image ranking purposes as it’s one of the most frequently used ways to describe an image.

Here’s an example of the Alt Attribute in use:

<img src="http://www.domain.com/images/droid-razr.jpg" alt="Droid Razr" />

The alt attribute is one of the primary ranking signals for image search and is used by all image search engines. Do not ignore it!

Keyword Density: An Enduring SEO Myth

Worth mentioning is the often-misunderstood concept of keyword density. In its pure form, keyword density refers to the number of times a keyword appears in relation to all of the other words on the same Web page. For instance, if a page only contained the single word Chicago, the keyword density for the keyword Chicago would be 100%. On the other hand, if the only text on the page was Eat at Chicago’s finest seafood restaurant, then the keyword density of the keyword Chicago would be 20%, since each counted word on the page represents one-fifth of the entire text. By the way, search engines ignore common stop words such as the, at, of, etc., so the word at would not be included in our keyword density calculation.
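The two calculations above can be sketched in code. This is purely illustrative: the stop-word set is a small hypothetical sample, and real engines use far more elaborate tokenization than this helper does.

```python
import re

# Hypothetical stop-word list; real engines maintain their own.
STOP_WORDS = {"the", "at", "of", "a", "an", "and"}

def keyword_density(text, keyword):
    """Fraction of non-stop-words on the page equal to the keyword."""
    # Lowercase, drop possessive 's, then keep only alphabetic words.
    words = re.findall(r"[a-z]+", text.lower().replace("'s", ""))
    counted = [w for w in words if w not in STOP_WORDS]
    return counted.count(keyword.lower()) / len(counted) if counted else 0.0

print(keyword_density("Chicago", "Chicago"))                                     # 1.0 (100%)
print(keyword_density("Eat at Chicago's finest seafood restaurant", "Chicago"))  # 0.2 (20%)
```

Note how the stop word at is excluded before dividing, which is what makes the second example come out to one-fifth.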

Optimal keyword density is one of the tactics that some search engine optimizers (SEOs) place way too much emphasis on. They’re usually under the mistaken impression that there is some magic formula for calculating the optimal keyword density that will appeal to each search engine. While this was true in the past, it has effectively ceased to be a factor. At best, keyword density is only a bit player in the big algorithmic formula for top-ranking pages and no longer worth the effort to factor into your strategies. Regardless, you may still hear stories that Google prefers pages with a 5% keyword density or that Yahoo likes pages with an 11% keyword density. There are, however, a number of reasons why chasing density is not an effective strategy for optimizing your Web pages.

First of all, the concept of keyword density doesn’t take into account the location of the keywords on the page. As you learned in the previous lesson, keyword placement is an important element of optimizing for search engine ranking. To say that a page has a 10% keyword density says nothing about whether those keywords are featured in your title tags, header tags, link anchor text, or any other of the important places to feature your keywords.

Secondly, keyword density also ignores the distance between keywords on a page, a concept known as keyword proximity. In general, the closer your keywords are to each other, the better. For instance, the phrase:

Your premier resource for San Diego real estate information

…is better optimized for the keyword San Diego real estate than the following phrase:

Your premier real estate information resource for the San Diego area

And finally, our analytical research of top-ranking pages in any search engine shows an enormous variation in the keyword density of those pages. Some top-ranking pages have a 50% keyword density. Others have as low as 0% keyword density. Indeed, we’ve found a few pages that rank highly for a keyword in spite of the fact the keyword doesn’t even appear on the page!

In such cases, it’s the keywords in the anchor text of external site links that point to the page that’s causing it to rank at, or near, the top. This alone illustrates just how important it is to get your keywords into the anchor text of offsite links pointing to your Web pages!

As you might imagine, such a large degree of variation makes it all-but-impossible for anyone to determine just exactly what the “ideal” keyword density actually is. Restated simply, you should insert your keywords into the natural flow of descriptive text without wasting time stressing over the exact number of times a keyword should appear on a Web page.

Final Review

  1. You’ve gained a detailed understanding of the importance of keywords: what they are, how to find them and where to place them.
  2. You’ve learned there are three kinds of ‘searchers’ and how certain keywords (actually keyphrases) appeal more to a specific type of searcher who is ready to buy.
  3. You know there is a difference between general keywords and money keywords that trigger sales.
  4. You’ve been given 10 steps for finding all of the right keywords.
  5. You’ve learned exactly where to place your keywords.
  6. And you’ve been given an explanation of Keyword Density and its associated overall lack of importance.

Lesson 2

Google, Bing and Yahoo: 95% of all searches take place on these three major engines.

Not only is Google the dominant player in its own right, it also provides search results for Internet service providers (ISPs) like AOL and Time-Warner, as well as many lesser ISPs. So, throughout this course, when we use the phrase ‘search engines’ we are speaking of Google, Bing and Yahoo, in that order. And the heavy emphasis will always be on Google. If something is unique to either Yahoo or Bing, we will clearly state it as such. Otherwise, think Google!

Relevancy: The Critical Ingredient of High Ranking Web Pages
It’s important to understand that search engines make their money by showing ads. Basically, that’s their entire profit model. This means they need to show ads to as many people as possible. And the way they get the largest number of people to use their search engine is by giving them the most relevant search results possible.

Think about it: if a search engine gave you pages about a completely different topic than the one you just searched for, you’d probably decide to use a different search engine. That would be bad business for the search engine company.

Making Pages Relevant

As an SEO, your job is to make YOUR Web pages the most relevant pages available for your business-related keywords. There are a number of ways to establish your Web page’s relevancy in the eyes of the search engines.

On-page relevance
On-page optimization involves placing keywords in strategic locations throughout your Web pages so that search engines know to associate those keywords with your Web page.

Off-page relevance
These off-page strategies relate to pages that link to you from other sites.

Off-page strategies include:

  • Anchor text: the actual keywords you click in a link that points to your Web site,
  • Keyword text within the paragraphs surrounding that anchor text,
  • Keywords within titles of the pages that link to you,
  • Keywords within the body content of the pages that link to you,
  • Directory categories your site is listed in,
  • Directory categories of the sites that link to your page,
  • The authoritative strength of the sites that link to you,
  • The authoritative strength of the sites that link to the sites that link to you.

These along with many lesser elements all add up to a successfully optimized Web site. Of all the off-page strategic factors listed above, the inbound link anchor text is currently the most important, but they all play a collective role as factors that add relevancy to your Web pages.

Getting Your Web Pages Listed

Your first steps to getting listed in a search engine are actually very straightforward. After you’ve finished reading this book, compiled your keywords, analysed your competition, and built your Web pages, your next step must be to obtain a link to your site from another site that is already listed in Google!

“Why not just submit my site to Google?”
Here’s why: Google prefers to find pages on its own by following links from other sites. Google places more trust in pages that it finds naturally through links than it does in pages that are submitted to it.

Being listed in the index of any of the major search engines legitimizes your site in the eyes of the others. And, once you have pages indexed in any of these engines, you’ll have your very own avenue of entry for new pages and sites. By simply placing links to these new pages from your own Google-known pages, search engines will find these strategically placed links the next time they visit your site to update their index.

Avoid the “submit your site to thousands of search engines” services like the plague.

What about Paying for Instant Traffic?

If your budget can afford it, and you’re looking for immediate traffic to your Web site, then pay-per-click marketing is one way to help build your company’s immediate web presence.

A Pay-Per-Click (PPC) program (Google’s AdWords is a PPC program) can have an ad for your site listed on the front page of the search results (in the Sponsored Listings section) and sending traffic to you within a matter of minutes for a price.

PPC is not something to be entered into lightly. Since you’re paying for every visitor a search engine sends your way, it’s possible to unwittingly rack up expensive click-charges if you’re not careful.

PPC strategy is something we can chat about, but it lies beyond the scope of this course.

Organic vs. Sponsored / PPC Search Results

Note : The Sponsored Listings that you see on the right hand side of the search results are what we call pay-per-click (PPC) ads. If driving traffic to your site quickly is your main goal, then there’s nothing faster than PPC.

Google’s AdWords informational page is here: https://adwords.google.com.

Getting listed in Google’s organic search results is free; but is not nearly as fast as getting listed in the paid Sponsored Listings via Google AdWords. Bear in mind, that ranking in Google depends greatly on how many inbound links a page has—and it typically takes some time to accrue links.

Google assigns relevancy to pages based on a proprietary and continuously evolving algorithm that takes into account the following weighting factors:

  • Incoming links
  • Keywords in URL
  • Page Content
  • Link Content
  • Page Meta Title
  • Page Rank Score

Some of these factors are on-page, some of them are off-page and some of them are performance-based. For instance, if another Web site links to your Web page with the anchor text budget widgets, then Google tends to believe that your site is about budget widgets regardless of what content your page actually has on it. This factor is known as page reputation, and Google places a lot of weight on it. Google also likely uses various methods to monitor how popular your site is and the quality of your content. They use your pages’ bounce rates, time on page, time on site, and UGC (User-Generated Content) such as comments to judge the level of activity your content has.

To rank well on Google involves optimizing as many of these on-page and off-page factors as possible. If this all sounds a bit complex, relax. It’s easier than it sounds and you’ll have a big advantage over your competition once you’ve absorbed the information in this course.

What about Search Directories?

There are two main search directories and several niche directories that can provide you with a valuable listing. Unlike the three major search engines, directories are more like catalog listings and they require that you actually submit your site (and sometimes pay a fee) if you want to be listed.

Submitting to directories is a great way to begin acquiring links and driving traffic to any site, new or old, but you need to be selective: many low-quality directories exist that do not offer any great value to your website.

When selecting a directory, simply make sure it ticks a few quality-criteria check boxes before you waste your time on one of the millions of spammed-out directories:

Does the directory site rank?

Is the directory relevant to my site?

Also look for what I call hyper-focused regional or product-specific directories: directories that are highly relevant to particular locations, products, and services (in our case, South Africa).

When submitting to these relevant directories, and to generic top directories like Yahoo and DMOZ.org, make certain that you submit to the most relevant internal category, or you will stand little chance of getting accepted. (Be specific.)

Getting a listing in DMOZ is still as important as ever in terms of gaining a valuable, high-PageRank link that carries trust. Be warned: submission to DMOZ can be frustrating. As with Yahoo, we also recommend finding a category (if possible) that is in the index of Google and Bing by using the “site:” search command (e.g., site:www.dmoz.org/category) to see if the page is indexed.

Final Review

  • We’ve identified the top three, and only important, search engines as Google, Bing and Yahoo (Powered by Bing); and in that order.
  • We’ve revealed how Web pages actually get listed in the search engines by being found, not by being submitted.
  • We’ve explained the difference between free, organic (aka, natural, regular) search results and Sponsored Listings which are paid search results also known as pay-per-click or PPC.
  • You now have a basic list of ranking elements that Google uses. They are: incoming links, page content, Web page title tag, keywords in URL, anchor text (i.e., link content), and PageRank score, among others, and not necessarily in that order.
  • And you’ve been introduced to the two main search directories (Yahoo & DMOZ). Now you know there’s a difference between search engines and search directories.

Lesson 1

Before we embark on each of the course lessons, it’s worth mentioning that SEO (Search Engine Optimisation) is something governed by Google, its employees, and its shareholders. Diving into SEO training without reading and understanding the Googlers’ official stance is somewhat silly, so we have devoted Lesson 1 to “Google’s Perspective”.

We copied and pasted the five most important pages from Google on the topic of webmastery and SEO. You will need to read and understand each of these sections:

  1. Google Basics
  2. Search Engine Optimization (SEO)
  3. Paid Links
  4. Webmaster Tools
  5. Link Schemes

The Rest of Lesson 1

This course leverages many years of commercial SEO experience from the industry’s top experts: tried-and-tested techniques and the most up-to-date strategies in the search industry today. By cutting out the hype and hearsay, we have constructed a simple, factual guide that will take anyone from novice to expert level in the shortest time frame possible. (So think of this course as the ultimate cheat sheet, earned by years of hard work.)

In any IT certification course, learning the glossary and acronyms page can earn you about 20 to 30% of the test marks. Knowing the acronyms (even if they are initially confusing) instantly empowers you to hold your own in wizardry conversations with other “SEO experts” (perhaps you even want to baffle a few clients). So learn, read, and study this uber-cool glossary of terms and acronyms, because you’re going to need it floating around your head sooner or later.

Let’s Kick it Off!

Even if you’re an Internet beginner, the information contained in this course will start making sense to you once you begin to familiarize yourself more and more with the Internet, web design, and source code.

In fact, you’ll likely even start to gain a direct advantage over many of the so-called pros simply due to the dynamic nature of the industry and the collaborative nature of this course outline.

This course is divided into eight core lessons.

The intention is to teach these lessons in eight full-day workshops; however, you can do just as well by reading the course and following the lesson plans on your own.

It is highly recommended that you complete the course prior to starting any “SEO magic” on real, live client websites. Please read through the official Google doctrine on what Google considers acceptable rules of engagement before embarking on SEO strategies (a.k.a. the counter-attack).

This course is intended to boil away the bullshit and distill over 14 years of practical experience from countless experts into a guide that takes the reader from noob to pro. Sure, we copy and paste from some key Google documents, borrow videos from YouTube, and quote key industry figures to get the job done.


SEO Glossary of Terms

Terms, abbreviations and acronyms used in the Internet marketing, search engine optimization (SEO), and search engine marketing (SEM) industries.

Use your browser’s Find command (try control-F) to quickly locate terms you’re interested in.


0-9 | A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z

.htaccess file
In several web servers (most commonly Apache), .htaccess (hypertext access) is the default name of directory-level configuration files that allow for decentralized management of configuration when placed inside the web tree. .htaccess files may contain any number of allowed configuration directives and follow the same syntax as the main configuration files. Directives placed in .htaccess files apply to the directory where you place the file, and all sub-directories, unless disabled in the main configuration. The file name starts with a dot because dot-files are by convention hidden files on Unix-like operating systems. A subset of Apache’s .htaccess syntax is also supported by other web servers, such as Sun Java System Web Server and Zeus Web Server. [source: http://en.wikipedia.org/wiki/.htaccess]

301 redirect
A server header that tells web browsers and search engines that a page or site has been permanently redirected to a new location or URL.
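A permanent redirect of this kind is commonly configured in an Apache .htaccess file (see the .htaccess entry above). The directives below are an illustrative sketch only; the file paths and domain names are placeholders:

```apache
# Permanently redirect a single moved page
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Permanently redirect an entire old domain (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```

Because the 301 status tells search engines the move is permanent, link equity from the old URL is generally passed on to the new one.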

A/B Testing
A strategy that involves providing alternative (usually two) choices to a user, and analyzing the difference in perception and behavior as a result of the differences between each item. This is also known as split testing.

Above the Fold
The area of email or web content visible to the viewer without them having to use the scroll bar.

Absolute Link
A link which shows the full URL of the page, including the transfer protocol, domain name and often a file name, as opposed to a relative link, which shows only part of that information. Absolute links are preferred because they preserve the integrity of the website and its content, and are less prone to link errors. Absolute links still function even when the document is not located at its typical location, such as when a user saves a page to their local computer.
Example of absolute link:
<a href=”http://somesite.com/somefolder/somefilename.html”>Absolute Link</a>
Example of relative link:
<a href=”../somefolder/somefilename.html”>Relative Link</a>

Accessibility
The practice of making a website’s contents available to disabled people, in most cases visually impaired people, via the use of alternative technologies or features to present the information. Implementing accessibility can also improve search engine ranking, because accessible pages carry richer information content.

Acquisition Strategy
A plan or process used to identify and attract potential customers who are looking to purchase a product or service.

Ads
Used to refer to advertisements a searcher sees as part of the result of a query.

Ad Copy
The text occupying the second and third lines of a displayed Ad, which provides a brief description beneath the Ad Title.

Ad Group
An ad group is a set of ads (one or more) targeting a set of keywords, placements, or both. Bid and price settings can be applied to the group as a whole or to individual items.

Ad Title
The headline of an Ad, displayed on the first line of clickable or context-served ad.

adCenter
Microsoft’s version of the cost-per-click ad network, designed to rival Google AdWords. Still relatively new in comparison, but it contains many useful features and has greater growth potential due to its smaller current market share.

AdSense
Google’s implementation of contextual advertising, which allows publishers to display relevant Google ads on their website content in a profit-sharing partnership. This has the effect of encouraging people who access the website to click on the advertising links.

AdSense Site
Websites that are especially designed to display Google AdSense ads (or ads from other ad networks). In general, the term describes a site whose sole purpose is to generate PPC advertising revenue from the traffic that hits it, with all other purposes secondary. Content quality on a site built for this purpose is typically low, to encourage people to click the ads.

AdWords
Google’s advertisement and link auction network. Built around Google’s keyword-targeted advertisements, it sells advertising links at auction on a cost-per-click basis (among other factors) to drive traffic to advertisers’ websites.

AdWords Site
Websites that are especially designed for use as Google Adwords traffic landing pages. Typically this is done for either branding and/or to help segment traffic for higher accuracy in tracking measurement.

Affiliate
An affiliate is a website marketing or advertising products and services of other websites or businesses for a set fee or commission.

Affiliate Marketing
A strategy that allows merchants to expand their market share by paying independent agents an agreed fee or commission for advertising or marketing their products and services on the agent’s website.

Age
Refers to the ‘freshness’ of the web page or its content. Used by social networks and search systems to determine the value or relevance of the information offered. Usually this includes the site age, page age, user account age, and other historical data.

Agent Name
Used to identify the name of the crawler/spider or other automated program that is currently visiting a page. The data is generated during a GET request to a server using the HTTP_USER_AGENT field. This information is available to the server and to programs running on it, and is often logged in the server logs for each URL request.

AJAX
(aka) Asynchronous JavaScript And XML – A technique that allows programmers to provide a more responsive and interactive web application by letting JavaScript scripts interact with the server without having to reload the webpage.

Alexa
An Amazon.com-owned search service dedicated to providing metrics that measure website traffic.

Algorithm
A set of rules used by search engines to determine the order and relevance of contents to present to the user based on their search query.

AlltheWeb
Search engine bought and currently used by Yahoo as a test bed for new search technologies and features.

Alt Attribute
A text-equivalent representation of an image on a web page, used to help search engines or blind people understand the context and information of an image. Used as the alt attribute of the img tag, it provides search engines an important way of understanding what the image is about. Example: <img src="widget.jpg" alt="blue budget widget">

Alt tags
Text associated with a web page graphic that is displayed during a mouse-over event, designed to improve the accessibility of the website content. “Alt tags” is the common (if imprecise) name for the img tag’s alt attribute, and they are often used to help with search engine ranking.

Alt text
A description associated with graphical content that is displayed when the graphic is inaccessible. Used by software and programs that rely on this information to manipulate, interpret or distinguish between different graphical contents.

AltaVista
Once a popular search engine, currently owned by Yahoo. Technical problems and brand mismanagement resulted in a significant loss of market share.

Amazon.com
The most recognized and popular internet retailing website. Known for its rich, consumer-generated media and its ownership of many other popular websites, such as IMDb and Alexa.

Analytics
Programs that facilitate the collection and analysis of data related to website traffic and user behavior/profile.

Anchor Text
The textual component of a hyperlink. If the link is an image, the image’s alt attribute serves the same role. Search engines place emphasis on the relevance of the anchor text in relation to the link in their algorithms, so it is very important to optimize your anchor text, both on-site and off-site, for improved search engine ranking. Example: <a href=http://www.domain.com>ANCHOR TEXT</a>

Animated Ad
An advertisement that incorporates movement or animation, usually implemented as an interactive Java applet, or Shockwave or animated GIF file.

To add a link to a website already indexed by search engines to improve the visibility and traffic of the existing website.

AOL
(aka) America Online – Popular web portal which merged with Time Warner.

API
(aka) Application Program Interface – The set of protocols and standards used to access software functions.

Arbitrage
The process of buying and selling a commodity for profit by identifying and exploiting inefficiencies or flaws in the market.

Ask.com
A search engine originally known as Ask Jeeves, which changed its name in early 2006; currently owned by InterActive Corp (Nasdaq: IACI).

ASP
(aka) Active Server Pages – A Microsoft proprietary programming language designed to allow programmers to create more dynamic web sites.

Astroturfing
The act of pushing a commercial or political agenda under the guise of an impartial grassroots participant in a social group or network. A common practice is participating in a user forum while secretly carrying out other activities such as branding, customer recruitment, or public relations.

Auction Model Bidding
Also known as market-driven or competition-driven bidding, the model most popular in PPC bidding. The rules for bidding are based on the amount of competition for the bid. If there are no competitors, the advertiser pays their bid amount or less for every click. If there is competition for the keyword, then the winner is determined by the highest bid price, but the winner only pays an amount that is one step up from their nearest competitor.
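The pricing rule described above can be sketched in a few lines of Python. This is a simplified model for illustration only; real PPC auctions also weigh factors such as quality scores and minimum bids, and the function name here is invented for the example:

```python
def auction_price(bids, increment=0.01):
    """Price the highest bidder actually pays under competition-driven bidding."""
    if not bids:
        return None
    ordered = sorted(bids, reverse=True)
    if len(ordered) == 1:
        # No competition: the advertiser pays at most their own bid.
        return ordered[0]
    # With competition: pay one step above the nearest competitor,
    # but never more than your own maximum bid.
    return min(ordered[0], round(ordered[1] + increment, 2))

print(auction_price([1.50, 1.20, 0.90]))  # 1.21
print(auction_price([2.00]))              # 2.0
```

Note that the top bidder’s own maximum bid caps the price; the second-highest bid plus one increment is what actually sets it.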

Authority
The characteristic of a page or domain to rank well in search engines by virtue of its link equity, site age, traffic trends, site history, and content quality. These are the main factors that are incorporated into search engine algorithms to determine the result of a search query. Also used to refer to websites that are trusted and well-cited as a result of having high authority.

Automated Bid Management Software
Software that optimizes the management and control of your ad spending by analyzing various metrics and data to help you work out the best options for advertising spend.

Automated Submitting
The use of automated software to submit your web pages to the search engines in order to improve web page ranking and statistics. This practice is now circumvented by search engines requiring users to submit one-time codes in the form of graphics (CAPTCHA Program) on the submission page.

Automatic Optimization
The process where search engines preferentially display advertisements with the highest click through rate as determined by an algorithm over a period of time.

B2B
(aka) Business to Business – Refers to one business communicating with or selling to another business.

B2C
(aka) Business to Consumer – Refers to a business communicating with or selling to an individual customer rather than a company or another business.

Backlinks
Incoming links to a website or web page. The number of backlinks is an indication of the popularity or importance of that website or page. In basic link terminology, a backlink is any link received by a web node (web page, directory, website, or top level domain) from another web node. Backlinks are also known as incoming links, inbound links, inlinks, and inward links.

Bait and Switch
A sales tactic in which a bargain-priced item is used to attract customers who are then encouraged to purchase a more expensive similar item. It’s also an SEO technique that serves one page to a search engine or directory and a different page to other user agents at the same URL. Sometimes a webmaster creates an optimized page, submits it to search engines or a directory, then replaces it with the regular page as soon as the optimized page has been indexed. This method was popular in the 1990s, when search engines required manual resubmission before they would re-spider a web page. The technique is not effective with today’s search engines and has the potential to get your website banned.

Banned
Also called Delisting. An action taken to remove website and/or web page listings from the index of a search engine. Being banned is the worst penalty imposed by a search engine in response to spam or violation of specific guidelines.

Banner Ad
Internet advertising that invites the viewer to click through to the advertiser’s web site by clicking on the banner. Banner ads may contain animation and sound. In addition to accessing another web site, banner ads may also collect information from consumers, make sales, or offer activities such as games. Banner ads are usually placed in a thin, rectangular box (468 × 60 pixels is standard) at the top or side of the home page. Response to banner ads is generally measured by click-through rates. Currently banner ads are the Internet equivalent of a direct-mail envelope, enticing the reader to seek more information about the contents of the envelope or web site.

Banner Blindness
A phenomenon in web usability where visitors on a website ignore banner-like information.

Baseline Metrics
Calculations and benchmarks performed to provide a basis for comparing past performance to current performance.

Beacon
A line of code placed in an ad or on a web page that helps track the visitor’s actions, such as registrations or purchases. A web beacon is often invisible because it is only 1 x 1 pixel in size and has no color. Also known as a web bug, 1 by 1 GIF, invisible GIF or tracker GIF.

Behavioral Targeting
The practice of targeted advertising to groups of people who exhibit similarities in their habits and behavior on the Internet.

Bid (pay-per-click)
The maximum amount of money offered by an advertiser for each time a searcher clicks on an ad.

Bid Boosting
An automated bid management system that adjusts your bid amount depending on the profile of the user that the ad is displayed to.

Bid Management Software
(aka) Bid Management Tool – Software or an ASP service that automatically manages PPC campaigns by monitoring and adjusting parameters to optimize the bidding process on pay-per-click search engines such as Yahoo Search Marketing and Google AdWords.

Bing
MSN’s search engine, which went live in June 2009. The name is a play on the phrase “bada-bing,” suggesting that something is easy or simple to accomplish.

Black Box Algorithms
An algorithm that can only be seen in terms of its inputs and outputs, not its internal details.

Black Hat SEO
Optimization tactics that can cause a site to rank more highly than its content would otherwise justify. They are made specifically for search engine ranking and do not improve the user’s experience of the site (the opposite of White Hat SEO). Black Hat SEO covers optimizations that are specifically against search engine guidelines, or frowned on by search engines entirely. If you step too far over the mark, your site may be penalized or even removed from the index completely.

Blacklist
A list, compiled either by search engines or by independent users, of search engine spammers or websites that practice fraudulent operations or insidious activities that negatively affect users of the Internet. These lists can be used to ban those spammers from search engines or to boycott them.

Block Level Analysis
A method which breaks a page down into smaller blocks, treating each block as a separate point on the web graph.

Blog
Short for ‘web log.’ A website that displays in chronological order the postings by one or more individuals and usually has links to comments on specific postings. It’s usually displayed in a journal-like way and typically reflects the personality of the author or web site.

Blogger
A free blog platform owned by Google. It allows you to publish sites on a subdomain of Blogspot.com, or to FTP content to your own domain.

Body Copy
The main textual content of a web page visible to users and does not include information hidden in the HTML source code or navigation.

Bookmark
A stored location for quick retrieval at a later date. Web browsers provide bookmarks that contain the addresses (URLs) of favorite sites. Most electronic references, large text databases and help systems provide bookmarks that mark a location users want to revisit in the future.

Bot
(aka) robot, spider or crawler – A software program that imitates the behavior of a human, as by querying search engines or participating in chatroom or internet relay chat (IRC) discussions. Search engines use bots to find and add web pages to their search indexes.

Bounce Rate
The Bounce Rate for a single page is the number of visits that enter the site at that page and leave within the specified timeout period without viewing another page, divided by the total number of visits that entered the site at that page. In contrast, the Bounce Rate for a website is the number of visitors who view only a single page per session, divided by the total number of website visits.
Bounce rates can be used to help determine the effectiveness or performance of an entry page. An entry page with a low bounce rate effectively leads visitors to view more pages and continue deeper into the website.
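Since the definition above is just a ratio, the page-level calculation reduces to one line of Python (the function and variable names here are illustrative):

```python
def bounce_rate(bounced_entry_visits, total_entry_visits):
    """Visits that entered at a page and left without viewing another
    page, divided by all visits that entered the site at that page."""
    if total_entry_visits == 0:
        return 0.0
    return bounced_entry_visits / total_entry_visits

# Example: 40 of 200 entry visits left without a second page view.
print(f"{bounce_rate(40, 200):.0%}")  # 20%
```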

Branded Keywords
Keywords associated with a brand. Typically branded keywords occur late in the buying cycle, and are some of the highest value and highest converting keywords. Some affiliate marketing programs prevent affiliates from bidding on the core brand related keywords, while others actively encourage it. Either way can work depending on your business model and marketing savvy, but it is important to ensure there is synergy between internal marketing and affiliate marketing programs.

Branding
Applying a trade name to a product or service. It also refers to developing awareness of the name. Branding is always important, but in the early days of the Internet, it was a major hot topic and tactic. Companies spent a fortune attempting to gain market awareness, no matter how much money they lost.

Breadcrumb Navigation
Breadcrumbs or breadcrumb trails are a navigation technique used in user interfaces. Its purpose is to give users a way to keep track of their location within programs or documents. The term is taken from the trail of breadcrumbs left by Hansel and Gretel in the popular fairytale.

Brin, Sergey
Sergey Brin – The Co-founder of the search engine Google.

Broad Match
Broad Match is a form of “keyword matching” and refers to the matching of a search listing or advertisement to selected keywords in any order.

Broken Link
A hyperlink that does not lead to the desired location.

Browser
The user interface on a computer that allows the user to navigate objects. The most commonly used type is the web browser, which is used to access the World Wide Web. Examples of web browsers are Mozilla Firefox (FF) and Internet Explorer (IE).

Business.com
A well-trusted directory of business websites and information.

Buying Cycle
Before making large purchases consumers typically research what brands and products fit their needs and wants. Keyword based search marketing allows you to reach consumers at any point in the buying cycle. In many markets branded keywords tend to have high search volumes and high conversion rates.
The buying cycle may consist of the following stages:
Problem Discovery: the prospect discovers a need or want.
Search: after discovering a problem, they look for ways to solve the need or want. These searches may contain words which revolve around the core problem the prospect is trying to solve, or words associated with their identity.
Evaluate: they may do comparison searches to compare different models, and also search for negative information like “product sucks,” etc.
Decide: they look for information which reinforces their view of the product or service they decided upon.
Purchase: they may search for shipping-related information or other price-related terms. Purchases may also occur offline.
Reevaluate: some people leave feedback on their purchases. If a person is enthusiastic about your brand, they may cut your marketing costs by providing free, highly trusted word-of-mouth marketing.

Buying Funnel
(aka)Buying Cycle, Buyer Decision Cycle and Sales Cycle –
A multi-step process or decision tree outlining the path leading to the customer’s purchase of a product.

Buzz Monitoring Services
Services that notify a client of their company’s presence and status on the web, usually due to the reference to their name, personnel, products or services.

Buzz Opportunities
Media-savvy and popular topics which give a company or brand access to a targeted audience that can further increase the exposure of the company or brand.

Cache
A copy of a web page stored by a search engine. When you search the web, you are not actually searching the live web; you are searching files in the search engine’s index.

Calacanis, Jason
Jason Calacanis – Founder of Weblogs, Inc. who pushed AOL to turn Netscape into a Digg clone.

Campaign Integration
Planning and executing a paid search campaign along with other marketing online or offline initiatives.

Canonical Tag
A tag placed in a page’s head section that allows site owners to indicate the original (canonical) version of a page to the search engines on a page-by-page basis. Example: <link rel="canonical" href="http://www.example.com/page.html">. The tag is supported by Yahoo, Google and MSN’s Bing search engine.

Canonical URL
The canonical version of any URL is the version that is considered by the search engines to be the single most authoritative version of a page.

Cascading Style Sheets (CSS)
A standard for formatting the look and feel of a web page, including information on paragraph layout, font sizes, colors, etc.

Catch All Listing
A listing used by pay-per-click search engines to monetize long-tail terms that are not yet targeted by marketers.

CGI
(aka) Common Gateway Interface – Software used to interface between a web server and other machines or software running on that server.

Child Sitemap
Generally a child sitemap refers to a themed or group-specific sitemap. Sitemaps can only hold a limited number of page listings, so you have to break them into sections if you have a really large site like Amazon. There is a main or “parent” sitemap that lists all the sections, followed by the “child” sitemaps that list all the contents in each section (one child for each section).

Citation
Where the business name and address are mentioned (in indexable text) on another website. Business Name, Address and Phone (NAP) together are considered a full citation; however, business name and address, or business name and phone, are also considered citations.

Click Bot
A program used to artificially click on paid listings to inflate click amounts.

Click Fraud
Clicks on a Pay-Per-Click advertisement that are motivated by something other than a search for the advertised product or service. Click fraud may be the result of malicious or negative competitor/affiliate actions motivated by the desire to increase costs for a competing advertiser or to garner click-through costs for the collaborating affiliate. Also affects search engine results by diluting the quality of clicks.

Client
A program, process or computer which requests information from another computer, process, or program.

Cloaking
Displaying different content to search engines and searchers. Depending on the intent of the display discrepancy and the strength of the brand of the person or company doing the cloaking, it may be considered reasonable or it may get a site banned from a search engine.
Cloaking has some legitimate uses which are within search guidelines. For example, changing the user experience based on location is common on many popular websites.

Cluetrain Manifesto, The
A book about how the web is a marketplace and its differences from traditional offline business.

Clustering
Listings from any individual site are limited to a certain number and grouped together, to make the results appear organized and to ensure diversity amongst the top-ranked results.

CMS
(aka) Content Management System – A tool used to make it easy to update and manage the content on a website.

Co-citation
In popular authority-based search algorithms, links which appear near one another on a page may be deemed to be related to one another.

COA (Cost of Acquisition)
How much it costs to acquire a desired action.

code swapping
Changing the content after high ranking is achieved.

Comments Tag
Comments are placed by web developers in the source code of their work to help make it easy for people to understand the code.

Compacted Information
Information which is generally and widely associated with a product. For example, most published books have an ISBN.

Competitive Analysis
The assessment and analysis of strengths and weaknesses of competing web sites, including identifying keyword selection, traffic patterns and major traffic sources.

Concept Search
A search which attempts to match results with the query’s concept, not with the words.

Conceptual Links
Links which search engines attempt to understand beyond just the words in them.

Consumer Generated Media
(aka) CGM – Posts made by consumers to support or oppose products, web sites or companies.

content (text, copy)
The part of a web page that has value and is of interest to the user. Advertising, navigation, branding and boilerplate are not usually considered to be content.

Content Management Systems (CMS)
Software that is used to create and manage the content for a web site. It provides for the storage, maintenance and retrieval of HTML and XML documents and all related image, audio and video files.

Content Network (aka Contextual Networks)
Including Google and Yahoo! these networks serve paid search ads triggered by keywords related to the page content a user is viewing.

Content Targeting
An ad serving process in Google and Yahoo! which displays keyword triggered ads related to the content of the web site a user is viewing.

Contextual Advertising
Advertising that is served or placed automatically on a web page based on the page’s content, keywords and phrases.

Contextual Distribution
The marketing decision to display search ads on certain publisher sites across the web instead of, or in addition to, placing PPC ads on search engine results pages.
Contextual Link Inventory
Contextual or content inventory is generated when listings are displayed on pages of web sites (usually not search engines), where the written content on the page indicates to the ad-server that the page is a good match to specific keywords and phrases.

Contextual Network
Including Google and Yahoo! these networks serve paid search ads triggered by keywords related to the page content a user is viewing.

Contextual Search
A search offered by Google and Yahoo! which analyzes the page being viewed by a user and gives a list of related search results.

Contextual Search Campaigns
A paid placement search campaign that takes a search ad listing beyond search engine results pages and onto the sites of matched content web partners.

Conversion
A conversion is reached when a desired goal is completed.

Conversion Action
The desired action you want a visitor to take on your site.

Conversion Rate
The number of visitors who convert (take a desired action at your site) after clicking through on your ad, divided by the total number of click-throughs to your site for that ad. (Expressed as: total click-throughs that convert / total clickthroughs for that ad = conversion rate.) For example, if an ad brings in 150 clickthroughs and 6 of the 150 clicks result in a desired conversion, then the conversion rate is 4% (6 / 150 = 0.04). Higher conversion rates generally translate into more successful PPC campaigns with a better ROI. Typically, micro-conversions (for instance, reading different pages on your site) lead to your main conversion step (making a purchase, or signing up for a service).
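The worked example in this definition can be checked with a short Python sketch (the names are illustrative):

```python
def conversion_rate(conversions, clickthroughs):
    """Total click-throughs that convert / total click-throughs."""
    return conversions / clickthroughs

# The example above: 6 of 150 click-throughs convert.
print(f"{conversion_rate(6, 150):.0%}")  # 4%
```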

Cookie
Small data file written to a user’s machine to track the user and to customize their experience. Cookies also help affiliate program managers track conversions.

Copyright
The legal right granted to an author, composer, playwright, publisher, or distributor to exclusive publication, production, sale, or distribution of a literary, musical, dramatic, or artistic work.

CPA
(aka) Cost Per Acquisition or Cost Per Action – The total cost of an ad campaign divided by the number of conversions.

CPA Network
CPA networks are often so-called “super affiliates” who are themselves affiliates of merchants via the traditional affiliate networks and recruit other affiliates to promote the merchant through them instead of directly via the merchant’s program at the traditional network. CPA networks take advantage of the ability to get higher commission rates due to their high volume, which they pass in part down to their affiliates. Average affiliates usually get paid a lower commission if they promote the merchant directly, because they are rarely able to generate the required volume to reach the higher payout tiers.

CPC
(aka) Cost Per Click – The amount search engines charge advertisers for every click that sends a searcher to the advertiser’s web site.

CPM
(aka) Cost Per Thousand Impressions – A unit of measure typically assigned to the cost of displaying an ad.

CPO (Cost Per Order)
The dollar amount of advertising or marketing spent to make a sale. Calculated by dividing marketing expenses by the number of orders.
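Expressed as a quick sketch (names are illustrative):

```python
def cost_per_order(marketing_spend, orders):
    """CPO = marketing expenses / number of orders."""
    return marketing_spend / orders

# $500 of ad spend producing 25 orders costs $20 per order.
print(cost_per_order(500.0, 25))  # 20.0
```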

Crawl Depth
How deeply a website is crawled and indexed.

Crawl Frequency
How frequently a website is crawled by a search engine spider or bot.

Crawler
A program that searches for information on the Web. Crawlers are widely used by Web search engines to index all the pages on a site by following the links from page to page. The search engine then summarizes the content and adds the links to its index. Crawlers are also used to locate Web pages that sell a particular product or to find blogs that have opinions about a product.

Creative
Unique words, design and display of a paid-space advertisement. In paid search advertising, creative refers to the ad’s title (headline), description (text offer) and display URL (clickable link to the advertiser’s web site landing page).

CTR (Click-Through Rate)
The number of clicks that an ad gets, divided by the total number of times that ad is displayed or served. CTR also factors into your advertiser Quality Score on the search engines and, therefore, your minimum keyword bids on Tier I engines.
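The calculation can be sketched as follows (names are illustrative):

```python
def click_through_rate(clicks, impressions):
    """CTR = clicks / total times the ad was displayed (impressions)."""
    return clicks / impressions

# 30 clicks from 1,000 impressions is a 3% CTR.
print(f"{click_through_rate(30, 1000):.1%}")  # 3.0%
```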

Custom Feed
Feeds created separately for each of the shopping engines that allow you to submit XML feeds. Each of the engines has different product categories and feed requirements.

Cutts, Matt
Head of Search Quality at Google.

Cybersquatting
Registering an Internet domain name for the purpose of reselling it for a profit.

Dayparting
Turning ad campaigns on or off, or changing ad bid prices or budget constraints, by time of day: bidding more when your target audience is available and less when they are less likely to be available.

De-Indexed
Temporarily or permanently becoming de-indexed from a directory or search engine.
De-indexing may be due to any of the following:
Pages on new websites (or sites with limited link authority relative to their size) may be temporarily de-indexed until the search engine does a deep spidering and re-cache of the web.
During some updates search engines readjust crawl priorities.
You need a significant number of high quality links to get a large website well indexed and keep it well indexed.
Duplicate content filters, inbound and outbound link quality, or other information quality related issues may also relate to re-adjusted crawl priorities.
Pages which have changed location and are not properly redirected, or pages which are down when a search engine tries to crawl them may be temporarily de-indexed.
Search Spam:
If a website tripped an automatic spam filter it may return to the search index anywhere from a few days to a few months after the problem has been fixed.
If a website is editorially removed by a human you may need to contact the search engine directly to request reinclusion.

Dead Link
A link which is no longer functional.

Dedicated Server
Server which is limited to serving one website or a small number of websites owned by a single person.

Deep Link
A link which points to an internal page within a website.

Deep Link Ratio
The ratio of links pointing to internal pages to overall links pointing at a website. A high deep link ratio is typically a sign of a legitimate natural link profile.
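The ratio is a simple division (names are illustrative):

```python
def deep_link_ratio(deep_links, total_links):
    """Share of a site's inbound links that point at internal pages
    rather than the home page."""
    return deep_links / total_links

# 400 of 500 inbound links pointing at internal pages gives an 80% ratio.
print(f"{deep_link_ratio(400, 500):.0%}")  # 80%
```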

Deep Linking
Linking that guides, directs and links a click-through searcher (or a search engine crawler) to a very specific and relevant product or category web page from search terms and PPC ads.

Del.icio.us
A popular social bookmarking website.

Demographics
Statistical data or characteristics which define segments of a population.

Denton, Nick
Publisher of Gawker, a popular ring of topical weblogs, which are typically focused on controversy.

Description Tag
Refers to the information contained in the description meta tag.

DHTML
(aka) Dynamic HTML or Dynamic Hypertext Markup Language – A collection of technologies used together to create interactive and animated web sites by using a combination of a static markup language (such as HTML), a client-side scripting language (such as JavaScript), a presentation definition language (such as CSS), and the Document Object Model.

Digg
A social news site where users vote on which stories get the most exposure and become the most popular.

Directory
A site devoted to categorized listings of other websites. The Yahoo! Directory is an example.

Display URL
The web page URL that one actually sees in a PPC text ad. The Display URL usually appears as the last line in the ad; it may be a simplified path for the longer actual URL, which is not visible.

Distribution Network
A network of web sites or search engines and their partner sites on which paid ads can be distributed.

DKI (Dynamic Keyword Insertion)
Insertion of the EXACT keywords a searcher included in his or her search request in the returned ad title or description.

DMCA
“The Digital Millennium Copyright Act (DMCA) is a United States copyright law which criminalizes production and dissemination of technology, devices, or services that are used to circumvent measures that control access to copyrighted works (commonly known as DRM), and criminalizes the act of circumventing an access control, even when there is no infringement of copyright itself. [Circumvention of controlled access includes unscrambling, copying, sharing, commercial recording or reverse engineering copyrighted entertainment or software.] It also heightens the penalties for copyright infringement on the Internet.” (Source: Wikipedia)

DMOZ
The Open Directory Project is the largest human-edited directory of websites. DMOZ is owned by AOL, and is primarily run and maintained by volunteer editors.

DNS
(aka) Domain Name Server or Domain Name System – A naming scheme mechanism used to resolve a domain name / host name to a specific TCP/IP address.

dofollow (see nofollow)
Refers to standard webpage links that search engine bots typically follow when looking for pages to add or keep in their index. The opposite of nofollow.

Domain
Refers to a specific web site address.

Doorway Pages
The idea is that these pages are designed to rank for highly targeted search queries and are a “doorway” into your site. Modern doorway pages are also now known as ‘information pages’ or ‘landing pages’. However, this concept has been abused, and as a result it has taken on a negative meaning in the eyes of the search engines. Google defines “doorway pages” as follows:
“Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination. Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users, and are in violation of our webmaster guidelines.”
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=66355

Dreamweaver
Popular web development and editing software offering a ‘what you see is what you get’ (WYSIWYG) interface.

Duplicate Content
Content which is identical or nearly identical to content found elsewhere, either on the same site or on other sites.

Dynamic Content
Content which changes over time or uses a dynamic language such as PHP to help render the page.

Dynamic Landing Pages
Dynamic landing pages are web pages to which clickthrough searchers are sent that generate changeable (not static) pages with content specifically relevant to the keyword search.

Dynamic Languages
Programming languages such as ASP or PHP which build web pages on the fly upon request.

Dynamic Text Insertion
(aka) Dynamic Keyword Insertion – This is text, a keyword or ad copy that customizes search ads returned to a searcher by using parameters to insert the desired text somewhere in the title or ad. When the search query (for example, ‘running shoes’) matches the defined parameter (for example, all brands of running and athletic shoes), then the associated term (running) is plugged into the ad. Dynamic insertion makes the ad mirror exact terms used in the search query, creating very relevant ads.
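The mechanism can be sketched roughly as follows; the {KeyWord} placeholder, the 25-character limit, and the fallback text are illustrative, and real ad platforms apply their own syntax and limits:

```python
def insert_keyword(ad_template, query, fallback):
    """Substitute the searcher's query into the ad template, falling back
    to default text when the query is too long to fit the ad."""
    keyword = query if len(query) <= 25 else fallback
    return ad_template.replace("{KeyWord}", keyword.title())

ad = insert_keyword("Buy {KeyWord} Online", "running shoes", "Athletic Shoes")
print(ad)  # Buy Running Shoes Online
```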

Dynamic URLs
A URL that is generated either by searching a database-driven website or by a website that is running a script. The opposite would be static URLs, where the contents of the webpage do not change unless changes are made to the actual HTML source code. Since dynamic URLs are generated from specific queries to a site’s database, the webpage itself is merely a template designed to display the results of the query. That means that, instead of changing information in the HTML source code, data is changed within the database itself. Dynamic URLs often contain the following characters: ?, &, %, +, =, $, cgi-bin, .cgi.

Earnings Per Click
Estimate of potential earnings based on how much is made from each click.

E-Commerce
Commerce that is transacted electronically, as over the Internet.

eCPM (Effective Cost Per Thousand)
A hybrid Cost-Per-Click (CPC) auction calculated by multiplying the CPC times the click-through rate (CTR), and multiplying that by one thousand. (Represented by: (CPC x CTR) x 1000 = eCPM.) This monetization model is used by Google to rank site-targeted CPM ads (in the Google content network) against keyword-targeted CPC ads (Google AdWords PPC) in their hybrid auction.
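The formula translates directly into code (names are illustrative):

```python
def ecpm(cpc, ctr):
    """eCPM = (CPC x CTR) x 1000, per the formula above."""
    return cpc * ctr * 1000

# A $0.50 CPC ad with a 2% CTR monetizes like a $10 CPM ad.
print(ecpm(0.50, 0.02))  # 10.0
```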

Editorial Link
Links that are earned. Search engines count links as votes of quality. They primarily want to count editorial links over links that were bought or bartered.

Editorial Review Process
A review process for potential advertiser listings conducted by search engines, which check to ensure relevancy and compliance with the engine’s editorial policy.

Entry Page
Refers to any page within a web site through which a visitor enters that site.

Ethical SEO
Search engines like to paint SEO services which manipulate their relevancy algorithms as unethical. Likewise, some search marketers lacking in creativity tend to describe services sold by others as unethical while calling their own ethical. In truth, a particular technique is generally not associated with ethics; it is either effective or ineffective.
The only ethics issues genuinely associated with SEO are business ethics issues. Two of the bigger frauds are:
Not disclosing risks: Some SEOs may use high-risk techniques when they are not needed. Some make that situation even worse by not disclosing potential risks to clients.
Taking money & doing nothing: Since selling SEO services has almost no start-up costs, many of the people selling services may not actually know how to competently provide them. Some shady people claim to be SEOs and bilk money out of unsuspecting small businesses.
As long as the client is aware of the potential risks, there is nothing unethical about being aggressive.

Everflux
Major search indexes are constantly updating. Google refers to this continuous refresh as everflux.

Expert Document
A quality page which links to many non-affiliated topical resources.

External Link
A link which references another domain.

Eye Tracking Studies
In order to understand reading and click-through patterns, studies are conducted by Google, Marketing Sherpa and the Poynter Institute using Eyetools technology to track the eye movements of web page readers.

Fair Use
The stated exceptions of allowed usage of work under copyright without requiring permission of the original copyright holder. Fair use is explained in section 107 of the Copyright code.

FAQ (Frequently Asked Questions)

Favicon (Favorite Icons)
A small icon which appears next to URLs in a web browser.

Feed
Content management systems such as blogs allow readers to subscribe to content update notifications via RSS or XML feeds. Feeds can also refer to pay-per-click syndicated feeds, or merchant product feeds. Merchant product feeds have become less effective as a means of content generation due to improving duplicate content filters.

Feed Reader
Software or a website used to subscribe to feed update notifications.

A web document that is a shortened or updated (revised content only) version of a web page created for syndication.

FFA (Free For All)
Free-for-all pages are pages that allow anyone to add a link to them. These links do not pull much weight in search relevancy algorithms because automated programs fill these pages with links pointing at low quality websites.

Filter
Certain activities or signatures which make a site or page appear unnatural may make search engines inclined to filter it out of the search results, or even remove it entirely.

Firefox
Popular extensible open source web browser.

Flash
A vector graphics-based technology that has become a popular method for adding animation and interactivity to web pages. (Source: Wikipedia)

Follow Fail
A Twitter term for when your attempt to follow a user is denied by that user for some undisclosed reason. Follow fails can result from a lack of shared interests, an incomplete Twitter profile on your part, or a host of other reasons.

Frames
A style of website architecture used to display (‘frame’) multiple webpages within a single displayed webpage. One advantage is that it allows for consistent site navigation; however, it can be problematic for search engines when indexing a website’s content. Therefore, if a webpage relies on search engine indexability, the use of frames is not recommended.

Frequent Crawling
Frequently updated websites are more likely to be crawled frequently.

Fresh Content
Content which is dynamic in nature and gives people a reason to keep paying attention to your website.
Many SEOs talk up fresh content, but fresh content does not generally mean re-editing old content. It more often refers to creating new content. The primary advantages to fresh content are:
Maintain and grow mindshare: If you keep giving people a reason to pay attention to you more and more people will pay attention to you, and link to your site.
Faster idea spreading: If many people pay attention to your site, when you come out with good ideas they will spread quickly.
Growing archives: If you are a content producer then owning more content means you have more chances to rank. If you keep building additional fresh content eventually that gives you a large catalog of relevant content.

FTP (File Transfer Protocol)
A protocol for transferring data between computers.
Many content management systems (such as blogging platforms) include FTP capabilities. Web development software such as Dreamweaver also comes with FTP capabilities. There are also a number of free or cheap FTP programs such as Cute FTP, Core FTP, and Leech FTP.

Fuzzy Search
A search which will find matching terms when terms are misspelled (or fuzzy). Fuzzy search corrects the misspellings at the user’s end.
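A rough sketch of the idea using the standard library’s difflib; the vocabulary and cutoff are illustrative, and production search engines use far more sophisticated spelling models:

```python
import difflib

def fuzzy_match(query, vocabulary):
    """Return the closest-spelled vocabulary term for a possibly
    misspelled query, using difflib's similarity ratio."""
    return difflib.get_close_matches(query, vocabulary, n=1, cutoff=0.6)

terms = ["accommodation", "communication", "recommendation"]
print(fuzzy_match("acommodation", terms))  # ['accommodation']
```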

GAP (Google Advertising Professional)
A program which qualifies marketers as being proficient AdWords marketers.

Gateway page
A web page that is created to attract traffic from a search engine and then redirect it to another site or page.

Geo-Targeting
Geo-targeting allows you to specify where your ads will or won’t be shown based on the searcher’s location, enabling more localized and personalized results.

Gladwell, Malcolm
Popular author who wrote the book titled The Tipping Point.

Godin, Seth
Popular blogger, author, viral marketer and business consultant.

Google
The world’s leading search engine in terms of reach. Google pioneered search by analyzing linkage data via PageRank. Google was created by Stanford students Larry Page and Sergey Brin.

Google AdSense (see AdSense)

Google AdWords (see AdWords)

Google Base
Free database of semantically structured information created by Google.

Google bomb
The combined effort of multiple webmasters to influence the Google search results, usually for humorous effect. The search results for ‘miserable failure’ (George Bush) and ‘greatest living American’ (Stephen Colbert) are the more famous examples of a Google bomb.

Google Bombing
Making a prank rank well for a specific search query by pointing hundreds or thousands of links at it with the keywords in the anchor text.

Google Bowling
Knocking a competitor out of the search results by pointing hundreds or thousands of low trust low quality links at their website. Typically it is easier to bowl new sites out of the results. Older established sites are much harder to knock out of the search results.

Google Checkout
Payment service provided by Google which helps Google better understand merchant conversion rates and the value of different keywords and markets.

Google Dance
In the past Google updated their index roughly once a month. Those updates were named Google Dances, but since Google shifted to a constantly updating index, Google no longer does what was traditionally called a Google Dance. Major search indexes are constantly updating. Google refers to this continuous refresh as everflux. The second meaning of Google Dance is a yearly party at Google’s corporate headquarters which Google holds for search engine marketers. This party coincides with the San Jose Search Engine Strategies conference.

Google Keyword Tool
Keyword research tool provided by Google’s Adwords Service which estimates the competition for a keyword, recommends related keywords, and will tell you what keywords Google thinks are relevant to your site or a page on your site. Also known as the Google Adwords Sandbox.

Google OneBox
Portion of the search results page above the organic search results which Google sometimes uses to display vertical search results from Google News, Google Base, and other Google owned vertical search services.

Google Sitelinks
On some search results where Google thinks one result is far more relevant than other results (like navigational or brand related searches) they may display additional deep links to that site at the top of the search results.

Google Sitemaps
A program which webmasters can use to help Google index their contents using XML Sitemaps.
Please note that the best way to submit your site to search engines and to keep it in their search indexes is to build high quality editorial links.

Google Supplemental Index
Index where pages with lower trust scores are stored. Pages may be placed in Google’s Supplemental Index if they consist largely of duplicate content, if the URLs are excessively complex in nature, or the site which hosts them lacks significant trust. Google no longer displays if a URL is listed in the Supplemental index.

Google Traffic Estimator
Tool which estimates bid prices and how many Google searchers will click on an ad for a particular keyword.

Google Trends
Tool which allows you to see how Google search volumes for a particular keyword change over time.

Google Website Optimizer
Free multi variable testing platform used to help AdWords advertisers improve their conversion rates.

Googlebot
Google’s search engine spider. Google has a shared crawl cache between their various spiders, including vertical search spiders and spiders associated with ad targeting.

Graphical Search Inventory
Banners, pop-ups, browser toolbars, rich media and other types of advertising that can be synchronized to search keywords.

Guestbook Spam
A type of low quality automated link which search engines do not want to place much trust on. Spammers use automated tools that automatically fill in guestbooks, and blog comments.

GUI (Graphical User Interface)
A way for the average user to interface with a database or program. A visual representation of the functional code.

GYM
Google – Yahoo – Microsoft, the big three of search.

Hallway page
A page that serves as an index to a group of pages that you’d like search engine spiders to find. Once a search engine spider indexes the hallway page, it will also follow all the links on that page and in turn index those pages as well.

Head Terms
Search terms that are straightforward, short and popular.

Heading tag
An HTML tag that is often used to denote a page or section heading on a web page. Search engines pay special attention to text that is marked with a heading tag, as such text is set off from the rest of the page content as being more important and is used to help the search engines identify what a page is about. Your desired keywords should be in heading tags if you want them to be noticed by the search engines.

Headings
The heading element briefly describes the subject of the section it introduces. Heading elements go from H1 to H6, with the lower-numbered headings being most important. You should only use a single H1 element on each page, and may want to use multiple other heading elements to structure a document. An H1 element in the source would look like:
<h1>Your Topic</h1>
Heading elements may be styled using CSS. Many content management systems place the same content in the main page heading and the page title, although in many cases it may be preferential to mix them up if possible.

Headline
The title of an article or story.

Hidden Keywords
Keywords that are placed in the HTML source in such a way that they are not visible to users looking at the rendered web page.

Hidden Text
Text that is visible to the search engines but hidden from users, used for the purpose of including extra keywords in the page without distorting its aesthetics. A common way to hide keywords is to use white text against a white background. While some sites may get away with it for a while, generally the risk-to-reward ratio is inadequate for most legitimate sites to consider using hidden text.

Hijacking
Hijacking of websites is a practice that makes search engines believe that a specific website resides at another URL. Webpage hijacking is a spam tactic which is typically accomplished by using techniques such as a 302 redirect or a meta refresh.

Hilltop
An algorithm which ranks results largely based on unaffiliated expert citations.

Hit
The request or retrieval of any item located within a web page. For example, if a user enters a web page with 5 pictures on it, it would be counted as 6 ‘hits’: one hit for the web page itself, and another 5 hits for the pictures.

HITS
A link-based algorithm which ranks relevancy scores based on citations from topical authorities.

Home Page
One’s personal billboard on the internet. The term ‘home page’ is perhaps a bit misleading, because home directories and physical homes in real life are private, while home pages are designed to be very public.
The Home Page is largely responsible for helping develop your brand and setting up the navigational schemes that will be used to help users and search engines navigate your website.

Hotlinking
(aka) hot link, inline linking, leeching, direct linking, and offsite image grabs – Refers to the practice of using a linked object (usually an image, but it could also be a document) that resides on one website for display on another, external and unrelated website. This practice is typically frowned upon and considered to be a theft of bandwidth.

HTML (Hyper Text Markup Language)
A markup language used to structure text and multimedia documents and to set up hypertext links between documents, used extensively on the internet. HTML is the mother tongue of the search engines, and should generally be strictly and exclusively adhered to on web pages.

HTML sitemap
An on-site Web page that links to all the other pages on your Web site. It ensures that any spider crawling your site can easily and quickly find and index all of your site’s Web pages. This type of sitemap is for spiders first and foremost but it can also be useful for Web site visitors.

HTML Source
The underlying markup code of a web page. It can be accessed in Internet Explorer by going to the “View” menu and then selecting “Source”.

HTTP
HyperText Transfer Protocol is used to request and transmit files, especially webpages and webpage components, over the Internet or other computer networks.

HTTP 301 – Status Code Definition
The 301 status code means the URL requested has ‘Moved Permanently’ and has been assigned a new URL.

HTTP 302 – Status Code Definition
The 302 status code means that the document requested is ‘Found’ however temporarily resides under a different URL.

HTTP 400 – Status Code Definition
The 400 status code means a ‘Bad Request’ stating that the server is not able to understand the document request due to a malformed syntax. The user has to modify their request.

HTTP 401 – Status Code Definition
The 401 status code means ‘Unauthorized’. This server requests user authentication prior to fulfilling the document request.

HTTP 403 – Status Code Definition
The 403 status code means ‘Forbidden’. The server understood the request, however is refusing to fulfill it. The webmaster may wish to alert the user why their request has been denied. If the organization does not wish to provide this reason then a 404 (Not Found) status code can be displayed instead.

HTTP 404 – Status Code Definition
The response error message ‘404’ – represents a document ‘Not Found’. This means that the user was able to communicate with the server, however could not find the requested document. Alternatively, the server could be configured to not fulfill the request and not provide a reason why.

HTTP 410 – Status Code Definition
Similar to a 404 Not Found error message, the 410 status code states that the requested document is ‘intentionally gone’. This basically means that it’s no longer available and there is no forwarding address.

HTTP 500 – Status Code Definition
The 500 status code error message states that there was an internal server error which has prevented the request from being fulfilled.

HTTP 501 – Status Code Definition
The 501 status code message is displayed when the server does not recognize the document request method. The server is not capable of fulfilling this request and states the request was ‘Not Implemented’.
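The status codes above can be collected into a simple lookup table (the reason phrases follow the HTTP specification; the function name is illustrative):

```python
# Standard reason phrases for the status codes defined above.
STATUS_PHRASES = {
    301: "Moved Permanently",
    302: "Found",
    400: "Bad Request",
    401: "Unauthorized",
    403: "Forbidden",
    404: "Not Found",
    410: "Gone",
    500: "Internal Server Error",
    501: "Not Implemented",
}

def describe(code):
    """Look up the standard reason phrase for an HTTP status code."""
    return STATUS_PHRASES.get(code, "Unknown")

print(describe(301))  # Moved Permanently
print(describe(410))  # Gone
```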

HTTP Referrer Data
A program that analyzes and reports the source of traffic to the user’s web site. The HTTP referrer allows webmasters, site owners and PPC advertisers to uncover new audiences or sites to target or to calculate conversions and ROI for future ad campaigns.

HTTPS (Hypertext Transfer Protocol Secure)
A secure version of HTTP in which the data transferred between browser and server is encrypted.

Hub (Expert Page)
A trusted page with high quality content that links out to related pages. Hubs link to well-trusted sites within their topical community; a hub is a page which references many authorities.

IDF (Inverse Document Frequency)
Inverse Document Frequency is a term weighting used to help determine the position of a term in a vector space model:
IDF = log( total documents in database / documents containing the term )
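The formula can be sketched in code; the logarithm base is a convention that varies between implementations, so base 10 here is an assumption:

```python
import math

def idf(total_documents, documents_with_term):
    """Inverse Document Frequency per the formula above (base-10 log assumed)."""
    return math.log10(total_documents / documents_with_term)

# A term found in 10 of 1,000 documents is rarer, so it scores higher
# than one found in 500 of 1,000 documents.
print(idf(1000, 10))   # 2.0
print(idf(1000, 500))  # ~0.30
```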

Impression
One view or display of an ad. Ad reports list total impressions per ad, which tells you the number of times your ad was served by the search engine when searchers entered your keywords (or viewed a content page containing your keywords).

Inbound Link
Link pointing to one website from another website.

Index
The collection of information a search engine has that searchers can query against. With crawler-based search engines, the index is typically copies of all the web pages they have found from crawling the web. With human-powered directories, the index contains the summaries of all web sites that have been categorized.

Indexability (crawlability and spiderability)
Indexability is the potential of a web site or its contents to be crawled or ‘indexed’ by a search engine.

Information Architecture
Designing, categorizing, organizing, and structuring content in a useful and meaningful way. Good information architecture considers both how humans and search spiders access a website. Information architecture suggestions:
focus each page on a specific topic
use descriptive page titles and meta descriptions which describe the content of the page
use clean (few or no variables) descriptive file names and folder names
use headings to help break up text and semantically structure a document
use breadcrumb navigation to show page relationships
use descriptive link anchor text
link to related information from within the content area of your web pages
improve conversion rates by making it easy for people to take desired actions
avoid feeding search engines duplicate or near-duplicate content

Information Retrieval
The technique and process of searching, recovering, and interpreting information from large amounts of stored data.

Inktomi
A search engine which pioneered the paid inclusion business model. Inktomi was bought by Yahoo! at the end of 2002.

Internal Link
A hyperlink on a Web page that points to a page on the same Web site.

Internet
An interconnected system of networks that connects computers around the world via the TCP/IP protocol.

Internet Explorer
Microsoft’s web browser. Internet Explorer is the most widely used Web browser on the market. It has also been the browser engine in AOL’s Internet access software.

Invisible Web
Content on the Web that is not found in most search engine results, because it is stored in a database rather than on HTML pages. Viewing such content is accomplished by going to the web site’s search page and typing in specific queries.

IP Address
The address of a device attached to an IP network (TCP/IP network). Every client, server and network device must have a unique IP address for each network connection (network interface). Every IP packet contains a source IP address and a destination IP address.

ISP (Internet Service Providers)
ISPs sell Internet access to the mass market. While the big nationwide commercial BBSs with Internet access (like America Online, CompuServe, GEnie, Netcom, etc.) are technically ISPs, the term is usually reserved for local or regional small providers (often run by hackers turned entrepreneurs) who resell Internet access cheaply without themselves being information providers or selling advertising.

Java applets
Small programs written in the Java programming language that can be embedded into web pages.

JavaScript
A scripting language that is added to standard HTML to create interactive documents.

Jump Page Ad
A microsite reached by clicking a button or banner. The jump page itself can list several topics, which can link to your site.

junk pages
Meaningless documents that serve no purpose other than to spam the search engines with keyword-stuffed pages, in hopes a visitor might click on an AdSense ad.

Key Performance Indicators (KPI)
Metrics used to quantify objectives that reflect the strategic performance of your online marketing campaigns. They provide business and marketing intelligence to assess a measurable objective and the direction in which that objective is headed.

Key phrase (or keyword phrase)
A search phrase made up of keywords.

Keyword
A word that a search engine user uses to find relevant web page(s). If a keyword doesn’t appear anywhere in the text of your web page, it’s highly unlikely your page will appear in the search results (unless of course you have bid on that keyword in a pay-per-click search engine).

Keyword cannibalization
The excessive reuse of the same keyword on too many web pages within the same site. This practice makes it very difficult for the users and the search engines to determine which page is most relevant for the keyword.

Keyword Density
The percentage of words on a web page that match a specified set of keywords. In the context of search engine optimization keyword density can be used as a factor in determining whether a web page is relevant to a specified keyword or keyword phrase. In general, the higher the number of times a keyword appears in a page, the higher its density.
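As an illustration of the percentage calculation described above, here is a minimal sketch; the tokenizer and the function name are our own assumptions, not any search engine’s actual implementation:

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
    # Split on anything that isn't a letter, digit, or apostrophe.
    words = re.findall(r"[A-Za-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Example: 2 of the 4 words match, so the density is 50%.
print(keyword_density("seo seo tips tips", "SEO"))  # prints 50.0
```

A real engine would handle multi-word phrases, stems, and markup; this sketch only counts exact single-word matches.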

Keyword Funnel
The relationship between various related keywords. Some searches are particularly well aligned with others due to spelling errors, poor search relevancy, and automated or manual query refinement.

Keyword Matching
Keyword matching is the process of selecting and providing advertising or information that matches the user’s search query.

Keyword Prominence
The placement of a given keyword in the HTML source code of a web page. The higher up in the page a particular word is, the more prominent it is and the more weight that word is assigned by the search engines. It’s best to have your first paragraph include your important keywords rather than using superfluous marketing-speak. This concept also applies to the location of important keywords within individual HTML tags, such as heading tags, title tags, or hyperlink text. Start off your HTML title tags with your chosen keywords instead of “Welcome to.”

Keyword Research
The process of discovering relevant keywords and keyword phrases to focus your SEO and PPC marketing campaigns on.

Keyword Research Tools
Tools which help you discover potential keywords based on past search trends, search volumes, bid prices, and page content from related websites.

Keyword Stemming
To return to the root or stem of a word and build additional words by adding a prefix or suffix, or using pluralization. The word can expand in either direction and even add words, increasing the number of variable options.

Keyword stuffing
Placing excessive amounts of keywords into the page copy and the HTML in such a way that it detracts from the readability and usability of a given page for the purpose of boosting the page’s rankings in the search engines.

Keyword Tag
Refers to the meta keywords tag within a web page. This tag is meant to hold approximately eight to ten keywords or keyword phrases, separated by commas. These phrases should be either misspellings of the main page topic, or terms that directly reflect the content on the page on which they appear. Keyword tags are sometimes used for internal search results as well as viewed by search engines.

Keyword Targeting
Displaying Pay Per Click search ads on publisher sites across the Internet that contain the keywords in a context advertiser’s Ad Group.

Kleinberg, Jon
Scientist largely responsible for much of the research that went into hubs and authorities based search relevancy algorithms.

Landing Page
The landing page is the web page people arrive at (i.e., “land on”) after clicking an online advertisement or a link in the search results.

Landing Page Quality Scores
A measure used by Google to help filter noisy ads out of their AdWords program.

Latent Semantic Indexing (LSI)
LSI uses word associations to help search engines know more accurately what a page is about.

Lead Generation
This market concept describes the derivation of income from ads, fees for delivering leads to suppliers, or sales commissions. Functioning in a primarily seller-driven market, lead generation markets may also produce RFP’s (Requests For Proposals), and RFQ (Requests For Quotes) for buyers. In the process, the information needs of users are evaluated while content, information, and transactions for buyers and sellers are integrated/aggregated. The majority of lead generation markets strive to become transaction-oriented catalog aggregation models.

Link
An address that points to a Web page or other file (image, video, PDF, etc.) on a Web server. Links reside on Web pages, in e-mail messages and word processing documents, as well as any other document type that supports hypertext and URL addressing.

Link Baiting
The art of targeting, creating, and formatting information that provokes the target audience to point high quality links at your site.

Link Building
The process of building high quality linkage data that search engines will evaluate to trust your website is authoritative, relevant, and trustworthy.
A few general link building tips:
build conceptually unique linkworthy high quality content
create viral marketing ideas that want to spread and make people talk about you
mix your anchor text
get deep links
try to build at least a few quality links before actively obtaining any low quality links
register your site in relevant high quality directories such as DMOZ, the Yahoo! Directory, and Business.com
when possible try to focus your efforts mainly on getting high quality editorial links
create link bait
try to get bloggers to mention you on their blogs
It takes a while to catch up with the competition, but if you work at it long enough and hard enough eventually you can enjoy a self-reinforcing market position

Link Bursts
A rapid increase in the quantity of links pointing at a website. Link development frequency, or the rate a site develops links is likely used to detect spam or sites that are attempting to game a search engine.
When links occur naturally they generally develop over time. In some cases it may make sense that popular viral articles receive many links quickly, but in those cases there are typically other signs of quality as well, such as:
increased usage data
increase in brand related search queries
traffic from the link sources to the site being linked at
many of the new links coming from new pages on trusted domains

Link Churn
The rate at which a site loses links.

Link Equity
A measure of how strong a site is based on its inbound link popularity and the authority of the sites providing those links.

Link exchange
A reciprocal linking scheme often facilitated by a site devoted to directory pages. Link exchanges usually allow links to sites of low or no quality, and add no value themselves. Quality directories are usually human edited for quality assurance.

Link Farm
Website or group of websites which exercises little to no editorial control when linking to other sites. FFA pages, for example, are link farms. A link farm is a form of spamdexing, i.e., spamming the index of a search engine.

Link Hoarding
A method of trying to keep all your link popularity by not linking out to other sites, use of the nofollow tag, or linking out using JavaScript or through cheesy redirects.
Generally link hoarding is a bad idea for the following reasons:
many authority sites were at one point hub sites that freely linked out to other relevant resources
if you are unwilling to link out to other sites people are going to be less likely to link to your site
outbound links to relevant resources may improve your credibility and boost your overall relevancy scores

Link partner (link exchange, reciprocal linking)
Two sites which link to each other.

Link popularity
The number of links pointed at a website. When other web sites link to your site, your site will rank better in certain search engines. The more web pages that link to you, the better your link popularity.

Link Rot
A measure of how many links are broken on a website.

Link Spam (Comment Spam)
Unwanted links such as those posted in user generated content like blog comments.

Link Text (Anchor text)
The user-visible text of a link. Search engines use anchor text to gauge how relevant the referring site and its link are to the content on the landing page. Ideally all three will share some keywords in common.

Linking Profile
A profile is a representation of the extent to which something exhibits various characteristics. A linking profile is the result of an analysis of where your links are coming from.

Listings
The information appearing on a results page in response to a search.

Live Search
New search platform provided by Microsoft.

Log file
A file which lists all actions that have occurred on a server. Usually logged data includes date and time, filename accessed, user’s IP address, referring web page, user’s browser software and version, and cookie data.
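To make the list of logged fields concrete, here is a sketch that parses one line in the widely used “combined” log format; the sample line, field names, and regular expression are illustrative assumptions, since actual log layouts vary by server configuration:

```python
import re

# Illustrative sample line in Apache's "combined" log format (an assumption;
# real servers may log different fields in a different order).
LINE = ('203.0.113.9 - - [09/May/2010:12:01:33 +0000] '
        '"GET /index.html HTTP/1.1" 200 5120 '
        '"http://example.com/referring-page" "Mozilla/5.0"')

COMBINED = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"')

def parse_log_line(line):
    """Return the logged fields as a dict, or None if the line doesn't match."""
    match = COMBINED.match(line)
    return match.groupdict() if match else None
```

Running `parse_log_line(LINE)` yields the visitor’s IP address, timestamp, request, status code, bytes sent, referring page, and browser string described above.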

Long Tail
The potential for online retailers to make more money than their bricks and mortar counterparts because there is virtually unlimited “shelf space” to offer products. Another key factor is that merchandise is offered via recommendations with links from one product to another so that people who purchase one item are encouraged to look at several others.

Long Tail Keywords
Keyword phrases with three to five words in them. These long tail keywords are usually highly specific and draw lower traffic than shorter, more competitive keyword phrases, which is why they are also cheaper. Oftentimes, long tail keywords, in aggregate, have good conversion ratios for the low number of click-throughs they generate.

LookSmart
Company originally launched as a directory service which later morphed into a paid search provider and vertical content play.

Machine-Generated Pages
Auto-generated doorway pages, which are usually devoid of meaningful content. Google, in particular, is working on ways to identify and exclude machine-generated doorway pages.

Mahaney, Stephen
Founder of Planet Ocean and editor of SearchEngineNews.com; Author of the original ‘UnFair Advantage Book on Winning The Search Engine Wars’

Malda, Rob
Founder of Slashdot.org, a popular editorially driven technology news forum.

Manual Review
A manual review process is performed alongside automated relevancy algorithms to help catch search spam and train relevancy algorithms. Abnormal usage data or link growth patterns may also flag sites for manual review. Google, for example, has a small army of employees who review the quality of search results and web sites.

Manual submitting
Submitting by hand to an individual search engine, rather than using an automated submission tool or service. Manual submitting is a more polite way to submit, and is less likely to cause trouble with the search engines. Better yet, do not submit at all — let the search engine spiders find you through links from other sites.

Marketing Sherpa
A research firm located in Warren, RI specializing in tracking what works in all aspects of marketing, including SEO and SMO. They provide yearly benchmark guides on a variety of topics for a fee through their website.

Mashup
A web page which consists primarily of single purpose software and other small programs (gizmos and gadgets) or possibly links to such programs. Mashups are quick and easy content to produce and are often popular with users, and can make good link bait. Tool collection pages are sometimes mashups.

Mechanical Turk
An Amazon.com program which allows you to hire humans to perform easy tasks that computers are bad at, interspersing human effort within an otherwise automated system.

Meme
In The Selfish Gene Richard Dawkins defines a meme as “a unit of cultural transmission, or a unit of imitation.” Many people use the word meme to refer to self spreading or viral ideas.

Meta Data
A description of the data in a source, distinct from the actual data; for example, the currency by which prices are measured in a data source for purchasing goods.

Meta Description
The meta description tag is typically a sentence or two of content which describes the content of the page.
A good meta description tag should:
be relevant and unique to the page;
reinforce the page title; and
focus on including offers and secondary keywords and phrases to help add context to the page title.
Relevant meta description tags may appear in search results as part of the page description below the page title.

Meta Description Tag
An HTML tag that identifies the contents of a Web page for the search engines. Meta tags are hidden on the page, but they, as well as all the HTML code on a page, can be viewed by selecting View/Source or View/Page Source from the browser menu. Meta tags contain a general description of the page, keywords and copyright information.

Meta Feeds
Ad networks which pull advertiser listings from other providers. They may or may not have their own distribution and advertiser networks.

Meta Keywords
The meta keywords tag is a tag which can be used to highlight keywords and keyword phrases which the page is targeting. The code for a meta keyword tag looks like this:
<meta name="Keywords" content="keyword phrase, another keyword, yep another, maybe one more">
Many people spammed meta keyword tags and searchers typically never see the tag, so most search engines do not place much (if any) weight on it. Many SEO professionals no longer use meta keywords tags.
See also:
Free meta tag generator – offers a free formatting tool and advice on creating meta description tags.

Meta Keywords Tag
Allows page authors to add text to a page to help with the search engine ranking process. Not all search engines use the tag.

Meta Refresh
An HTML command that switches you to a different Web page within a specified amount of time. It is used to briefly display an outdated page and send the visitor to the new page. Fast meta refreshes are used to quickly switch doorway pages to the page the user is supposed to see.

Meta Refresh Redirect
A client-side redirect.

Meta Robots Tag
Allows page authors to keep their web pages from being indexed by search engines, especially helpful for those who cannot create robots.txt files.

Meta Search
Top-ranked results pulled from multiple search engines and rearranged into a single SERP.

Meta Search Engine
A search engine that gets listings from two or more other search engines, rather than through its own efforts.

Meta tag stuffing
Repeating keywords in the meta tags and using meta keywords that are unrelated to the site’s content.

Meta Tags
Information placed in a web page not intended for users to see but instead which typically passes information to search engine crawlers, browser software and some other applications.

MFA (Made For Advertisements)
Websites that are designed from the ground up as a venue for advertisements.

Micro-blogging
A form of multimedia blogging that allows users to send brief text messages (usually 140 characters or less), pictures, audio files and even videos to a group of people who have subscribed to be sent these updates. These messages can be sent from a phone’s text messaging option, instant messaging, email, digital audio or straight from the micro-blogging service. An example of a popular micro-blogging service is http://www.twitter.com

Microsoft
Maker of the popular Windows operating system and Internet Explorer browser.

Mindshare
A measure of the number of people who think of you or your product when thinking of products in your category. Sites with strong mindshare, top rankings, or a strong memorable brand are far more likely to be linked at than sites which are less memorable and have less search exposure. The link quality of mindshare related links most likely exceeds the quality of the average link on the web. If you sell non-commodities, personal recommendations also typically carry far greater weight than search rankings alone.

Mirror Site
Site which mirrors (or duplicates) the contents of another website. Most often seen in situations where sites are located in different parts of the world to speed downloading, or to share processing. Download sites frequently are mirrored in different locations. Generally search engines prefer not to index duplicate content. The one exception to this is that if you are a hosting company it might make sense to offer free hosting or a free mirror site to a popular open source software site to build significant link equity.

Mod_Rewrite
A module or plugin for Apache web servers that can be used to rewrite a dynamic URL as a static URL on the fly. This is commonly done for SEO purposes to make the site’s navigation easier for spiders.

Monetize
To extract income from a site. AdSense ads are an easy way to monetize a website.

Mouseover
A JavaScript instruction that is used to test the current position of the mouse. For example, it is widely used to change the appearance of a button used as a hyperlink to another page. Two buttons are created: one in the normal state, the other altered, typically having a “depressed” look. Using “onmouseover” in a JavaScript statement that relates to the image enables the altered image to be displayed.

Movable Type
Commercial blogging software which allows you to host a blog on your website.

Mozilla Firefox
An open source and platform independent web browser that has steadily grown in popularity over the last few years. It’s currently used by about 15% of Web users worldwide.

MSN
Refers to Microsoft Network and their search engine.

MSN Search
Search engine built by Microsoft. MSN is the default search provider in Internet Explorer.

Multi Dimensional Scaling
The process of taking snapshots of documents in a database to discover topical clusters through the use of latent semantic indexing. Multi dimensional scaling is more efficient than singular value decomposition, since only a rough approximation of relevance is necessary when combined with other ranking criteria.

MySpace
One of the most popular social networking sites, largely revolving around connecting musicians to fans and having an easy to use blogging platform.

Naked Links
A posted and visible link in the text of a web page that directs to a web site.

Natural Language Processing
Algorithms which attempt to understand the true intent of a search query rather than just matching results to keywords.

Natural search results
The search engine results which are not sponsored, or paid for in any way.

Navigation
Scheme to help website users understand where they are, where they have been, and how that relates to the rest of your website. It is best to use regular HTML navigation rather than coding your navigation in JavaScript, Flash, or some other type of navigation which search engines may not be able to easily index.

Navigation bar (nav bar)
A web site’s navigation icons, usually arranged in a column down the left-hand side or in a row along the top, that play a crucial role in directing spiders to the site’s most important content and in getting site visitors to go deeper into the site.

Negative Keyword
Negative Keyword is a term referenced by Google AdWords and is a form of keyword matching. This means that an advertiser can specify search terms that they do not want their ad to be associated with.

Negative SEO
The act of demoting a page or site in the SERPs. Most often used against a competitor that is above your site in the SERPs, but can be used purely for fun.

Netscape
Originally a company that created a popular web browser by the same name, Netscape is now a social news site similar to Digg.com.

Niche
A topic or subject which a website is focused on.

NoFollow
A non-standard HTML link attribute used to prevent a link from passing link authority. Technically, ‘nofollow’ instructs a search engine bot *not* to follow the link and that the link should not influence the link target’s ranking in the search engine’s index. It is intended to reduce the effectiveness of certain types of spamdexing, thereby improving the quality of search engine results. Commonly used on sites with user generated content, like in blog comments. The code to use nofollow on a link looks like this: <a href="http://www.seoinsites.com" rel="nofollow">seoINsites</a>. Nofollow can also be used in a robots meta tag to prevent a search engine from counting any outbound links on a page. That code would look like this: <META NAME="ROBOTS" CONTENT="INDEX, NOFOLLOW">

NoFollow Tag
An attribute webmasters can place on links that tell search engines not to count the link as a vote or not to send any trust to that site. Search engines will follow the link, yet it will not influence search results. NoFollows can be added to any link with this code: rel=”nofollow”.

NoFrames Tag
A tag used to describe the content of a frame to a user or engine which has trouble displaying or reading frames. This tag is frequently misused, to the point that it’s often referred to as ‘the poor man’s cloaking’.

NoIndex
A command found in either the HEAD section of a web page or within individual link code, which instructs robots not to index the page or the specific link.

Non reciprocal link
If site A links to site B, but site B does not link back to site A, then the link is considered non reciprocal. Search engines tend to give more value to non-reciprocal links than to reciprocal ones because they are less likely to be the result of collusion between sites.

NoScript Tag
A tag used to define an alternate content (text) if a script is NOT executed. This tag is used for browsers that recognize the <script> tag, but do not support the script in the tag.

Refers to content specific to a particular topic.

Ontology
As it relates to search, it is the attempt to create an exhaustive and rigorous conceptual schema about a domain. An ontology is typically a hierarchical data structure containing all the relevant entities and their relationships and rules within that domain. Used in keyword research to find all related terms to a core set of keywords.

Open Source
Software which is distributed with its source code such that developers can modify it as they see fit. On the web open source is a great strategy for quickly building immense exposure and mindshare.

Opera
A fast standards-based web browser.

Organic link
Links published only because the webmaster considers them to add value for users.

Organic Results
Listings on search engine result pages (SERPs) that are not paid for and are not for sale. Sites appear in organic (also called ‘natural’) results because a search engine has applied formulas (algorithms) to its search crawler index, combined with editorial decisions and content weighting, that it deems important enough to include without payment.

Organic Search Rankings
Search engine ranking of web pages found in SERPs.

Outbound Links
Links on a particular web page leading to other web pages either within the same site, or other web sites.

Overture
The company which pioneered search marketing by selling targeted searches on a pay per click basis. Originally named GoTo, they were eventually bought out by Yahoo! and rebranded as Yahoo! Search Marketing.

Overture Keyword Selector Tool
Popular keyword research tool, based largely on Yahoo! search statistics. Heavily skewed toward commercially oriented searches, also combines singular and plural versions of a keyword into a single version.

Pay for Performance
See PFP (Pay for Performance).

Page, Larry
Co-founder of Google.

PageRank
A logarithmic scale based on link equity which estimates the importance of web documents.
Because PageRank is widely bartered, Google’s relevancy algorithms had to move away from relying on PageRank and place more emphasis on trusted links via algorithms such as TrustRank.
The PageRank formula is:
PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))
PR = PageRank
d = dampening factor (~0.85)
C(T) = the number of outbound links on page T
PR(T1)/C(T1) = the PageRank of page T1 divided by the total number of links on page T1 (the transferred PageRank)
In words: for any given page A, the PageRank PR(A) equals one minus the dampening factor, plus the dampening factor multiplied by the sum of the partial PageRank passed from each page pointing at A.

PageRank (PR)
PR is the Google technology developed at Stanford University for placing importance on pages and web sites. At one point, PageRank (PR) was a major factor in rankings. Today it is one of hundreds of factors in the algorithm that determines a page’s rankings.
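The formula above can be applied iteratively until the scores settle. This toy sketch is our own illustration of the non-normalized form given above, not Google’s actual implementation:

```python
def pagerank(links, d=0.85, iterations=50):
    """Iterate PR(A) = (1 - d) + d * sum(PR(T)/C(T)) over all pages T linking to A."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    pr = {page: 1.0 for page in pages}           # start every page at PR 1.0
    for _ in range(iterations):
        nxt = {page: 1.0 - d for page in pages}  # the (1 - d) term
        for source, targets in links.items():
            if targets:
                share = pr[source] / len(targets)  # PR(T)/C(T)
                for target in targets:
                    nxt[target] += d * share       # the d * sum(...) term
        pr = nxt
    return pr

# Two pages linking only to each other settle at the baseline PR of 1.0.
scores = pagerank({"a": ["b"], "b": ["a"]})
```

Note that this repeated substitution is how the circular definition (every page’s PR depends on other pages’ PR) gets resolved in practice.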

Paid Inclusion
Is the process of paying a fee to a search engine in order to be included in that search engine’s result pages or directory. Also known as ‘guaranteed inclusion’. Paid inclusion does not impact rankings of a web page; it merely guarantees that the web page itself will be included in the index. These programs were typically used by web sites that were not being fully crawled or were incapable of being crawled, due to dynamic URL structures, frames, etc.

Paid Listings
Listings that search engines sell to advertisers, usually through paid placement or paid inclusion programs. In contrast, organic listings are not sold.

Paid Placement
Advertising program where listings are guaranteed to appear in response to particular search terms, with higher ranking typically obtained by paying more than other advertisers. Paid placement listings can be purchased from a portal or a search network. Search networks are often set up in an auction environment where keywords and phrases are associated with a cost-per-click (CPC) fee. Overture and Google are the largest networks, but MSN and other portals sometimes sell paid placement listings directly as well. Portal sponsorships are also a type of paid placement.

Pay for Inclusion (PFI)
The practice of charging a fee to include a website in a search engine or directory. While quite common, what is technically being paid for is usually more rapid consideration, to avoid Google’s prohibition on paid links.

PDF (Portable Document Format)
A universal file format developed by Adobe Systems that allows files to be stored and viewed in their original, printer-friendly context.

Penalty
Search engines prevent some websites suspected of spamming from ranking highly in the results by banning or penalizing them. These penalties may be applied automatically by algorithm or manually.
If a site is penalized algorithmically the site may start ranking again after a certain period of time after the reason for being penalized is fixed. If a site is penalized manually the penalty may last an exceptionally long time or require contacting the search engine with a reinclusion request to remedy.
PERL (Practical Extraction and Report Language)
A general-purpose scripting language widely used for text processing and CGI web programming.

Personalization
Altering the search results based on a person’s location, search history, content they recently viewed, or other factors relevant to them on a personal level. (Appending &pws=0 to the end of a Google search URL returns non-personalized results.)

PFP (Pay for Performance)
Payment structure where affiliated sales workers are paid commission for getting consumers to perform certain actions.
Publishers publishing contextual ads are typically paid per ad click. Affiliate marketing programs pay affiliates for conversions – leads, downloads, or sales.

PHP
PHP (PHP Hypertext Preprocessor) is an open source server-side scripting language used to render web pages or add interactivity to them.

Podcast
“A media file that is distributed over the internet using syndication feeds, for playback on portable media players and personal computers. Like ‘radio,’ it can mean both the content and the method of syndication. The latter may also be termed podcasting. The host or author of a podcast is often called a podcaster.” (Source: Wikipedia)

Poison Word
Words which were traditionally associated with low quality content and caused search engines to demote the rankings of a page.

Portal
A web site offering common consumer services such as news, email, other content, and search, along with a wide array of features designed to entice users to make the portal their ‘home page’ on the web. Google, Yahoo, and MSN are portals.

Position
In PPC advertising, position is the placement on a search engine results page where your ad appears relative to other paid ads and to organic search results.

Position Preference
A feature in Google AdWords and in Microsoft adCenter enabling advertisers to specify in which positions they would like their ads to appear on the SERP. Not a position guarantee.

PPA (Pay Per Action)
Very similar to Pay Per Click, except publishers only get paid when click-throughs result in conversions.

PPC (Pay Per Click)
Pay per click is a pricing model through which most search ads and many contextual ad programs are sold; Google’s AdWords is a prime example. PPC ads only charge advertisers when a potential customer actually clicks an ad.

PPC Advertising
A model of online advertising in which advertisers pay only for each click on their ads that directs searchers to a specified landing page on the advertiser’s web site. PPC ads may get thousands of impressions (views or serves of the ad); but, unlike more traditional ad models billed on a CPM (Cost-Per-Thousand-Impressions) basis, PPC advertisers only pay when their ad is clicked on. Charges per ad click-through are based on advertiser bids in hybrid ad space auctions and are influenced by competitor bids, competition for keywords and search engines’ proprietary quality measures of advertiser ad and landing page content.

PPC Management
The monitoring and maintenance of a Pay-Per-Click campaign. This includes changing bid prices, expanding and refining keyword lists, editing ad copy, testing campaign components for cost effectiveness and successful conversions, and reviewing performance reports for management and clients, as well as results to feed into future PPC campaign operations.

PPCSE
Acronym for Pay-Per-Click Search Engine.

PR Web
Our recommendation for online press release submission. PR Web provides an optimized platform to submit social media releases and SEO-optimized releases to the widest possible audience at the lowest possible cost.

Precision
The ability of a search engine to list results that satisfy the query.

Profit Elasticity
A measure of the profit potential of different economic conditions based on adjusting price, supply, or other variables to create a different profit potential where the supply and demand curves cross.

Proximity
A measure of how close words are to one another.

Quality Link
Search engines count links as votes of trust. Quality links count more than low quality links.
There are a variety of ways to define what a quality link is, but the following are characteristics of a high quality link:
Trusted Source: If a link is from a page or website which seems like it is trustworthy then it is more likely to count more than a link from an obscure, rarely used, and rarely cited website. See TrustRank for one example of a way to find highly trusted websites.
Hard to Get: The harder a link is to acquire the more likely a search engine will be to want to trust it and the more work a competitor will need to do to try to gain that link.
Aged: Some search engines may trust links from older resources or links that have existed for a length of time more than they trust brand new links or links from newer resources.
Co-citation: Pages that link at competing sites which also link to your site make it easy for search engines to understand what community your website belongs to. See Hilltop for an example of an algorithm which looks for co-citation from expert sources.
Related: Links from related pages or related websites may count more than links from unrelated sites.
In Content: Links which are in the content area of a page are typically going to be more likely to be editorial links than links that are not included within the editorial portion of a page.

Quality Score
A number assigned by Google to paid ads in a hybrid auction that, together with maximum CPC, determines each ad’s rank and SERP position. Quality Scores reflect an ad’s historical CTR, keyword relevance, landing page relevance, and other factors proprietary to Google. Yahoo! refers to the Quality Score as a Quality Index. And both Google and Yahoo! display 3- or 5-step indicators of quality evaluations for individual advertisers.

Query
A word or phrase entered into a search engine or database.

Query Refinement
Query Refinement is an essential information retrieval tool that interactively recommends new terms related to a particular query.

Ranking
The particular order or position in which a web page or web site appears in search engine results. Rank and position affect your click-through rates and, therefore, the conversion rates for your landing pages.

Recall
The portion of relevant documents that were retrieved, compared to all relevant documents.

Reciprocal Links
A reciprocal link is a mutual link between two web sites in order to ensure mutual traffic. Search engines usually don’t count these as valuable links.

Reconsideration Request
A request to Google to manually review your site submitted through a verified Google Webmaster Tools account.

Redirect
Methods used to change the address of a landing page when a site is moved to a new domain or location.

Referrer
The source from which a web site visitor comes.

Reinclusion
If a site has been banned for spamming, the owner may fix the infraction and ask for reinclusion in the search index.

Relevance
As related to PPC advertising, relevance measures how closely the searcher’s expectations and the search query are tied to the ad title, description, and keywords.

Relevancy
A measure of how useful searchers find search results.

Reputation Management
Ensuring your brand related keywords display search results which reinforce your brand.

Resubmission
A program offered by businesses bilking naive consumers out of their money for a worthless service.

Reverse Index
This is a database index that uses the reversal of the key values rather than the values themselves.
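For illustration, a minimal sketch of the idea (the names and data here are hypothetical, not from any particular database engine): sequential keys like "1001" and "1002" are stored reversed so they scatter across index buckets instead of clustering.

```python
# Hypothetical sketch of a reverse index: the index stores the REVERSAL
# of each key value rather than the key itself.

def reverse_key(key: str) -> str:
    """Return the reversed key value used for indexing."""
    return key[::-1]

# Tiny in-memory table mapping keys to rows.
rows = {"1001": "Alice", "1002": "Bob", "2001": "Carol"}

# The index maps reversed keys back to the original keys.
reverse_index = {reverse_key(k): k for k in rows}

def lookup(key: str) -> str:
    """Look up a row by its original key via the reverse index."""
    return rows[reverse_index[reverse_key(key)]]

print(lookup("1002"))  # Bob
```

The point is that "1001" and "1002" become "1001" and "2001" in the index, so inserts of sequential keys no longer hit adjacent index pages.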

Revshare / Revenue Sharing
A method of allocating PPC revenue to a site publisher, and click-through charges to a search engine that distributes paid-ads to its context network partners, for every page viewer who clicks on the content site’s sponsored ads.

Rich Media
Text that includes formatting commands for page layout such as fonts, bold, underline, italic, etc. It may also refer to a multimedia document that can include graphics, audio and video.

Rich Snippets
On 12 May 2009, Google announced that they would be parsing the hCard, hReview and hProduct microformats, and using them to populate search result pages with what they called “Rich Snippets”.

Right Rail
The right-side column of a web page. On a Search Engine Results Page (SERP), the right rail is usually where sponsored listings appear.

ROAS (Return On Advertising Spending)
The profit generated by ad campaign conversions per dollar spent on advertising expenses.

Robots.txt
A text file placed in the root directory of a web site that prohibits crawlers/spiders from indexing all or specific pages of the site.
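A hypothetical example (the domain and paths are placeholders): this file blocks all crawlers from two directories and points them at the site's XML Sitemap.

```text
# robots.txt served from https://www.example.com/robots.txt
User-agent: *
Disallow: /private/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```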

ROI (Return On Investment)
A measure of how much return you receive from each marketing dollar.
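The ROI and ROAS figures defined above reduce to simple arithmetic; a sketch with hypothetical campaign numbers:

```python
def roi(revenue: float, cost: float) -> float:
    """Return on investment: profit generated per dollar spent."""
    return (revenue - cost) / cost

def roas(revenue: float, ad_spend: float) -> float:
    """Return on advertising spending: revenue per advertising dollar."""
    return revenue / ad_spend

# Hypothetical campaign: $500 spent, $2,000 in attributed revenue.
print(roi(2000, 500))   # 3.0 -> $3 of profit per $1 invested
print(roas(2000, 500))  # 4.0 -> $4 of revenue per $1 of ad spend
```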

RSS (Really Simple Syndication, Rich Site Summary, RDF Site Summary)
A syndication format designed for aggregating updates to blogs and news sites.

SaaS
Software as a Service.

Safari
A popular web browser for Apple computers.

Salton, Gerard
Computer scientist who was among the pioneers of the information retrieval field.

Saturation (Search Engine Saturation)
A term describing how much content from a particular website is included in a given search engine. A higher saturation gives a higher potential of traffic and page ranking.

A blog script written in PHP that uses MySQL for the back-end.

Scrape
Copying content from a site, often facilitated by automated bots.

Scraper sites
Designed to ‘scrape’ search-engine results pages or other sources of content (often without permission) to create content for a website. Scraper sites are generally full of advertising or redirect the user to other sites.

Search Directory
A listing of websites and URLs that depends on user submission to create the listing, which is organized in a logical system, instead of using spiders or crawlers to obtain information for its database.

Search Engine
A tool or device designed to search and retrieve information, comprising a spider, an index, relevancy algorithms, and search results.

Search History
User search history information stored by the search engine to help improve the quality and reliability of future search results.

Search Query
The word or phrase a searcher types into a search field, which initiates search engine results page listings and PPC ad serves. In PPC advertising, the goal is to bid on keywords that closely match the search queries of the advertiser’s targets. See also Query.

Search Submit Pro (SSP)
Search Submit Pro is Yahoo’s paid inclusion product that uses a ‘feed’ tactic. This product crawls your web site as well as an optimized XML feed that represents the content on your site. Yahoo then applies its algorithm to both the actual web site pages and the XML feed to determine which listing is most appropriate to appear in the organic search results when a user conducts a search for relevant terms. Yahoo charges a cost per click (CPC), determined by category, for each time a listing established through (SSP) is clicked.

Search Terms
The words (or word) a searcher enters into a search engine’s search box. Also used to refer to the terms a search engine marketer hopes a particular page will be found for. Also called keywords, query terms, or query.

Secondary Links
Links that are acquired indirectly, such as through a story in a major newspaper about a new product your company released.

SEM (Search engine marketing)
The marketing of websites in search engines, usually through methods such as SEO, buying pay per click ads, or trusted feeds.

Semantic Clustering
A technique for developing relevant keywords for PPC Ad Groups by focusing tightly on keywords and keyword phrases that are associative and closely related. Focused and closely-related keyword groups, which would appear in the advertiser’s ad text and in the content of the click-through landing page, are more likely to meet searchers’ expectations and, therefore, support more effective advertising and conversion rates.

SEO (Search engine optimization)
The act of publishing and marketing information in a way to improve search engine understanding of website content.

SEO Copywriting
A style of writing and formatting information to increase the exposure to search queries by enhancing the relevance of the document to the search engine criteria and algorithm.

SEPOV
Acronym for: Search Engine’s Point Of View.

SERP (Search Engine Results Page)
The page where the results for a search query returned by a search engine are presented. Sometimes pluralized as SERPs for multiple pages.

Server-side Tracking
The process of analyzing web server log files. Server-side analytics tools make sense of raw data to generate meaningful reports and trends.

Session IDs
Dynamic parameters, such as session IDs generated by cookies for each individual user. Session IDs cause search engines to see a different URL for each page each time they return to re-crawl a web site.

Share of Voice
“A brand’s (or group of brands’) advertising weight, expressed as a percentage of a defined total market or market segment in a given time period. Share of Voice (SOV) advertising weight is usually defined in terms of expenditure, ratings, pages, poster sites, etc.” (Source: Wikipedia)

Siloing (also known as Theming)
Siloing is a site architecture technique used to split the focus of a site into multiple themes. The goal behind siloing is to create a site that ranks well for both its common and more-targeted keywords. (Source: Bruce Clay Newsletter 09/06)

Siphoning
The practice of stealing traffic from another web site.

Site Map
A specific page on a website that provides search engines an alternative access to contents and information, and also helps webmasters and visitors to make sense of the information contained in the website.

Site-Targeted Ads
Site targeting lets advertisers display their ads on manually selected sites in the search engine’s content network for content or contextual ad serves. Site-targeted ads are billed more like traditional display ads, per 1,000 impressions (CPM), and not on a Pay-Per-Click basis.

Skyscraper
A tall, thin ad unit that runs down the side of a web page. A skyscraper can be 120 x 600 pixels or 160 x 600 pixels.

Sniffer script
A small program or script that detects which web browser software an Internet user is using and then serves up a browser-specific cascading style sheet to match. Sniffer scripts are also used to detect whether a user has the Macromedia Flash plug-in installed and, if so, a Flash version of the page is displayed. Also known as User-Agent Detection.

social bookmark
A form of Social Media where users’ bookmarks are aggregated for public access.

social media
Various online technologies used by people to share information and perspectives. Blogs, wikis, forums, social bookmarking, user reviews and rating sites (digg, reddit) are all examples of Social Media.

social media marketing (SMM)
Website or brand promotion through social media

social media poisoning (SMP)
A term coined by Rand Fishkin: any of several (possibly illegal) black hat techniques designed to implicate a competitor as a spammer, for example, blog comment spamming in the name/brand of a competitor.

Social Media Release (SMR)
A press release written with social media in mind that references social tags, anchor text keywords, RSS feeds, and other ancillary files to promote a conversation with its online audience.

sock puppet
An online identity used either to hide a person’s real identity or to establish multiple user profiles.

Spam
Unsolicited email messages that can have malicious or benign intent.

Spam Ad Page (SpamAd page)
A Made For Adsense/Advertisement page which uses scraped or machine generated text for content, and has no real value to users other than the slight value of the ads. Spammers sometimes create sites with hundreds of these pages.

Spamdexing
Spamdexing, or search engine spamming, is the practice of deceptively modifying web pages to increase the chance of them being placed close to the beginning of search engine results, or to influence the category to which the page is assigned in a dishonest manner. – Wikipedia

Spamming
The processes involved in creating and distributing spam.

Spider
Search engine robots that process content on the Internet to create an index of pages used to provide Internet users results for their search queries.

Spider trap
An infinite loop that a spider may get caught in if it explores a dynamic site where the URLs of pages keep changing. For example, a home page may have a different URL and the search engine may not be able to ascertain that it is the home page that it has already indexed but under another URL. If search engines were to completely index dynamic web sites, they would inevitably have large amounts of redundant content and download millions of pages.

Splash Page
A term which refers to web pages that are lavishly designed but offer very little value or content for search engines.

Splog
Spam blog; refers to either stolen or automatically generated blog content.

(aka) Spam Blog – An artificially created weblog site that’s used to promote affiliated websites or to increase the search engine rankings of associated sites. The purpose of a splog can be to increase the PageRank or backlink portfolio of affiliate websites, to artificially inflate paid ad impressions from visitors, or even to serve as a link outlet to get new sites indexed. Spam blogs are usually some type of scraper site, where content is often stolen from other websites. These blogs usually contain a high number of links to sites associated with the splog creator, which are often disreputable or otherwise useless websites. There is frequent confusion between the terms “splog” and “spam in blogs”. Splogs are blogs where the articles are fake and are created only for search engine spamming. To spam in blogs, conversely, is to include random comments on the blogs of innocent bystanders, in which spammers take advantage of a site’s ability to allow visitors to post comments that may include links.

Sponsored Listing
A term used as a title or column head on SERPs to identify paid advertisers and distinguish between paid and organic listings. Alternate names are Paid Listings or Paid Sponsors. Separating paid listings from organic results enables searchers to make their own purchase and site trust decisions and, in fact, resulted from an FTC complaint filed by Commercial Alert in 2001 alleging that the confusion caused in consumers who saw mixed paid and unpaid results constituted fraud in advertising.

Spyware
Software programs that are often used to collect data on consumer behavior and habits to provide targeted advertising.

Squidoo
A popular user-driven platform for publishing web content for social networking purposes.

SSI (Server Side Includes)
A convenient way to access parts of a web page from a different page, typically implemented through the creation of .shtml or .shtm files or by accessing or modifying the .htaccess file.

Static Content
Website content which does not change frequently, or lack interactive elements that can dynamically respond to or reflect user input, such as those enabled by dynamic programming languages.

Stemming
Using the stem of a word to help broaden the scope of a search and improve the quality of the search results.
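A crude illustrative sketch, not a production stemmer (real engines use algorithms such as Porter stemming): stripping a few common suffixes lets "runs" and "run" match the same stem.

```python
# Naive suffix-stripping sketch (illustration only).
SUFFIXES = ("ing", "ed", "es", "s")

def stem(word: str) -> str:
    """Strip one common suffix, keeping at least a 3-letter stem."""
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(stem("jumping"), stem("runs"), stem("cat"))  # jump run cat
```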

Stickiness
The amount of time a user spends on a website, or the ability of a website to engage visitors and lead them to other pages on the same website.

Stop character
Certain characters, such as the ampersand (&), equals sign (=), and question mark (?), that, when in a web page’s URL, tip off a search engine that the page in question is dynamic. Search engines are cautious about indexing dynamic pages for fear of spider traps, thus pages that contain stop characters in their URLs run the risk of not getting indexed and becoming part of the “Invisible Web.” Google won’t crawl more than one dynamic level deep, so a dynamic page with stop characters in its URL should still get indexed if a static page links to it. Eliminating stop characters from all URLs on your site will go a long way in ensuring that your entire site gets indexed by Google.

Stop Words
Words typed into a search query that are removed prior to finding and retrieving search results due to their very frequent appearance and thus irrelevance to the search. Search engines are now rethinking their position on the use of stop words to increase the quality of search results.

Streaming media
Audio-visual content that is played as it is being downloaded. Thus, an Internet user could begin watching a video clip as the footage downloads rather than having to wait for the clip to download in its entirety beforehand.

Submission
The process of creating or enhancing awareness of your website, either by submitting the web page to the search engine or by establishing links with websites that are in the search engine index.

Supplemental Results
Documents not listed in the main search index and considered to be trusted less. Listings in the Supplemental results rank lower than documents in the main search index due to limited link authority relative to the number of pages on the site, duplicate or near-duplicate content, or exceptionally complex URLs. Google at one time allowed users to see if a URL was in the supplemental index in their search results, but removed that indication.

Syndication
An option that allows you to extend your reach by distributing ads to additional partner sites.

Tagging, tags
Word descriptions associated with a web page or content

Tail Terms
Specific and long phrases that include one or more modifiers designed to use words that are more unique to reduce competition and improve conversion rates.

Target Audience
The group of people that are the primary consumers of the products or services provided by a particular business, that the advertisers want to attract.

Targeting
The act of narrowing or focusing the use of keywords in advertising to attract a specific target group. This can be based on gender, age, behavior, or specific interests.

Taxonomy
A system, incorporating a specific set of terms, used to organize content in a systematic or hierarchical manner.

Technorati
One of the most popular search engines for blogs.

Telnet
A program or service that uses the underlying TCP/IP protocol to provide remote computers with access to local computers and vice versa.

Teoma
An Internet search engine based on Kleinberg’s concept of hubs and authorities. It is used to power websites such as Ask.com.

Term Frequency
The number of occurrences of a keyword amongst a collection of text or documents.
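For illustration, a minimal sketch of counting term frequency over a piece of text (the tokenization rule here is a simplifying assumption):

```python
import re
from collections import Counter

def term_frequency(text: str) -> Counter:
    """Count occurrences of each term in a piece of text."""
    terms = re.findall(r"[a-z0-9]+", text.lower())
    return Counter(terms)

tf = term_frequency("SEO basics: links, keywords, and more keywords.")
print(tf["keywords"])  # 2
```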

Term Vector Database
A database that incorporates the use of relative vectors (or relevancy) in searching on keyword phrases and website content.

Text Link Ads
Advertisements formatted to resemble text links.

The Tragedy of the Commons
The conflict between the interests of individuals and the common good of the group.

Theme
The overall topic or content of a website. This is usually determined by the search engine based on a word usage and density analysis of the content on the page.

Thesaurus
Tools used to find words of similar meaning in different contexts, to assist in improving search results or to help advertisers find potential keywords.

Tier I Search Engines
Refers to the big three, GYM (Google, Yahoo!, and Microsoft Live Search), which handle the majority of search queries on the Internet.

Tier II Search Engines
Incorporates a range of smaller, vertical, and specialized engines, including general search engines and meta search engines. Even though Tier II search engines lack the market share or features of the Tier I engines, they can still be effective in targeting specific or niche markets at a more affordable price.

Tier III Search Engines
These operate mainly on contextual distribution networks, and are triggered by the result of user searches, where the keywords used in the query are linked to advertisers that are listed on the search engine to bring up the advertisement relevant to the search results.

Title Tag
The tag within the <head> section of a web page that contains the page title. This is also the information the search engine uses to show the bold blue underlined hyperlink in search query results. Historically, the title tag has been the most important section of a web page in regards to search engine ranking.
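A hypothetical example of where the title tag sits in a page (the title text and description are placeholders):

```html
<head>
  <!-- The <title> element: what search engines show as the clickable result -->
  <title>Blue Widgets | Example Store</title>
  <meta name="description" content="A short summary of the page.">
</head>
```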

TLP (Top Level Page)
A reference to pages structured in the top levels of a website because of the unique value they provide for the site.

Toolbar
An add-on to a web browser designed to provide additional functionality or easy access to existing functions.

Topic-Sensitive PageRank
An alternate method of computing PageRank that takes into account the theme and content of pages when computing a PageRank score.

Trackback
A function that enables the linking of posts to other blogs with similar content or topics so that links and references to their page or content can be easily accessed. The ‘Trackback URL’ of a post shows all other blogs that have links to it.

Tracking URL
A URL whose function is to store and report information from a user action. It is designed to show information such as the keyword used, the match type triggered, and the search engine used to reach the website, and it is commonly used to track the conversion rates of paid advertising services.

Trademark
Any symbols, pictures, or words that distinctively identify, or are associated with, a specific product or service.

Traffic
The number of visits or visitors to a website.

Traffic Analysis
The process of analyzing website traffic with the aim to understand user activity on the website as well as the source of traffic to the website.

Trusted Feed
A trusted feed is a paid service offered by some search engines to allow advertisements to appear in designated areas, without having to change the contents of the website. Interchangeable with the term Paid Inclusion

TrustRank
A weighted search relevancy algorithm designed with a bias towards trusted seed websites of major corporations, educational institutions, or governmental institutions.

TypePad
A hosted blogging platform that allows users to publish content on a simulated unique domain. Designed to give more credibility and individuality to the content, while sacrificing some of the user generated traffic and credit to the website.

Unethical SEO
Search engine optimization techniques that are considered to be unscrupulous by search engines which can lead to the banning of a website. Examples include Keyword stuffing, Hidden text and Doorway pages

Unique visitors
A tally or count of individual users (rather than user agents or bots) who have visited your web site (over a specified period of time), as opposed to user sessions, which can involve an individual user visiting the website multiple times. This is determined by the number of unique user identifiers registered on the website (usually the IP address).
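A minimal sketch of the distinction, using hypothetical log entries (the IP addresses and paths are placeholders):

```python
# Each hit is (ip_address, page). The same visitor can generate many hits.
hits = [
    ("203.0.113.5", "/home"),
    ("203.0.113.5", "/pricing"),
    ("198.51.100.7", "/home"),
    ("203.0.113.5", "/home"),   # same visitor returning
]

# Unique visitors: distinct user identifiers (here, IP addresses).
unique_visitors = len({ip for ip, _ in hits})
# Raw traffic: every page view counts.
page_views = len(hits)

print(unique_visitors, page_views)  # 2 4
```

Counting by IP is the simplification described above; in practice cookies or other identifiers are more reliable, since many users can share one IP.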

Universe
A term which encompasses the entire population of the market or target group.

URL (Uniform Resource Locator)
The unique address of a document or resource on the Internet. The term is often used interchangeably with web address.

URL Rewrite
A technique for creating more unique and informative URLs so that they can be indexed by major search engines more effectively. It also helps to make the URL more user-friendly. URL Rewriting is often done using the Apache MOD_REWRITE function to convert a dynamic URL to what appears to be a static URL
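A hypothetical mod_rewrite rule of the kind described (the paths and parameter names are placeholders): a static-looking URL is mapped back to the underlying dynamic script.

```text
# .htaccess: serve /products/123 from product.php?id=123
RewriteEngine On
RewriteRule ^products/([0-9]+)/?$ product.php?id=$1 [L]
```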

Usability
The ease with which a user can perform an action.

Usage Data
Data relating to traffic, such as visitor counts, page view statistics, click-through rates, or search queries, which may be seen by some search engines as a sign of quality.

A specialized search service aimed at providing information for particular fields, types of information, or information formats.

User agent
The name of the browser/spider that is currently visiting a page.

User session
A period of user interaction with the website, with an arbitrary length of inactivity by the user usually assigned to indicate the end of a session.

Value Propositions
When referring to a customer, it is the sum of all the benefits that a company promises to the customer in return for the agreed transaction. The promise is generally laid out in the marketing and sales information, with the customer receiving the benefits for engaging the company’s service and completing the payment.

Vector Space Model
(see Term Vector Database)

Vertical Creep
Describes a trend in which vertical listings rank highly in organic search engine results but fail to register equally in sponsored listings (as they apply to the SERP).

Vertical Portal / Vortal
Search engines that focus on a specific vertical, or a specific industry/sector. These typically contain more specific indexes that produce better search results compared to Tier I search engines.

Vertical Search
A type of search service that focuses on the field, type of information, or information format rather than specific keywords.

A term used to refer to a specific business group or category.

Viral Marketing
An online ‘word-of-mouth’ marketing tactic that relies on self-propagating or self-replicating methods to circulate within multiple networks, aiming to reach a large audience in a short amount of time. Viral marketing is usually spread using email, blogs, and other social networking or marketing channels.

Virtual Domain
A website that is hosted on a virtual server.

Virtual Server
A server designed to host multiple top level domains that can be run from a single computer.

Visibility
A general term describing the amount of exposure a website receives in search engine keyword searches.

Visit
A term used to indicate the direct or indirect navigation of a user to a website.

Wales, Jimmy
Co-founder of the popular collaborative online encyclopedia Wikipedia.

walled garden
A collection of web pages that are linked to each other only, without any external links or references. Walled gardens that are included in the sitemap can still be indexed by search engines, but will probably receive a poor page rank due to their lack of deep links.

Web browser
Software that is installed on the user’s computer for viewing the contents of websites. Among the list of popular web browsers are Microsoft Internet Explorer, Firefox, Safari and Opera.

Web Crawler, web robot, web spider
A program or automated script used to navigate and browse the contents of the World Wide Web in a systematic manner.

Web Forwarding
A facility that allows redirection to occur on a separate server by providing specific directives on the .htaccess file.

Web standards
Standards and guidelines such as CSS, XHTML that help ensure inter-platform operability

Web TV
Television set-top boxes designed for users who do not have computers or do not wish to use computers to browse the Internet.

Weblog
See Blog.

White Hat SEO
Techniques and practices that fall within the accepted practices and guidelines based on how search engines operate. Normally used to refer to SEO practices that are not deceptive or misleading in nature, as opposed to Black Hat SEO.

Widget
Refers to a small program or application written to perform a specific function on a website, such as a hit counter or an IP address display.

Wiki Software
The software which provides the facilities for collaborative editing and publishing, as used on the Wikipedia website.

Wikipedia
A multilingual, user-driven online encyclopedia based on public collaboration via the use of free wiki software.

WordNet
A lexical database of English words that is often used to improve search engine performance by providing information on word relationships.

WordPress
A very powerful and popular open source blogging platform, consisting of a downloadable blogging program combined with a hosted solution. Fully integrated and feature packed, it offers a very high degree of user control as a content management system for web publishing.

Wordtracker
A powerful and popular keyword research tool that collects data from popular meta search engines. Designed to help webmasters optimize keywords for their websites, the program suffers from its small list of databases.

Xenu, Xenu Link Sleuth
A popular and free program used to detect broken links on websites. Also allows user to create a sitemap of their website.

XHTML (Extensible HyperText Markup Language)
A set of specifications designed to allow the migration of HTML to conform with XML standards.

XML (Extensible Markup Language)
A scripting language that provides programmers with more flexibility and power to define the properties of the document, making it easier to syndicate or format information.

XML Feeds
An alternate method for websites to update content to search engines, by sending XML information about website content and changes rather than simply having search engine bots crawl through the website. XML Sitemaps, which list all the URLs on a site that you want listed, are accepted by most major engines.

XML Sitemap
An XML Sitemap (aka a Google Sitemap, although it’s used by Yahoo! and Microsoft as well) is a special file that provides search engines with specific directives about what pages to crawl and how often. Search engines are not required to strictly obey these directives, but they do tend to use them as guidelines. This type of Sitemap is especially useful for very large sites that want to get all their pages listed. A great example of a large site that NEEDS a good XML Sitemap is an ecommerce site that wants its entire list of product pages indexed and listed in the search results.
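A hypothetical single-entry sitemap (the URL and values are placeholders; the sitemaps.org schema is the real one engines accept):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/widget</loc>
    <lastmod>2009-05-12</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```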

Yahoo!
One of the top three ranked search engines, and among the oldest and most well-established Internet directories.

Yahoo! Answers
A free member-driven Q&A service provided by Yahoo! used to generate free content using the popularity of social networks.

Yahoo! Directory
Since its conception in 1994, it has maintained a reputation for being among the most innovative, popular, and authoritative Internet directories.

Yahoo! Search Marketing
Yahoo!’s version of its paid search platform that was based on Overture.

Yahoo! Site Explorer
A research tool offered to webmasters, containing information about the pages that Yahoo! indexes from websites as well as the links that are associated with those pages.

YouTube
Google-owned amateur video upload and syndication website, which saw phenomenal success and popularity due to its simplicity and innovation.

Zeal
A non-commercial directory first launched in 1999 and bought by Looksmart in 2000. Originally a collaborative effort between Looksmart editors and volunteers from all over the world, its declining traffic after the withdrawal of MSN as one of its partners led to its closure in 2006.

Google Basics

Learn how Google discovers, crawls, and serves web pages

When you sit down at your computer and do a Google search, you’re almost instantly presented with a list of results from all over the web. How does Google find web pages matching your query, and determine the order of search results?

In the simplest terms, you could think of searching the web as looking in a very large book with an impressive index telling you exactly where everything is located. When you perform a Google search, our programs check our index to determine the most relevant search results to be returned (“served”) to you.


Crawling

Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.

We use a huge set of computers to fetch (or “crawl”) billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.

Google’s crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
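The crawl loop described above can be sketched in miniature. This toy example (the pages and links are hypothetical, held in memory rather than fetched from the live web) starts from a seed URL, extracts links from each page it visits, and queues any it hasn't seen yet:

```python
import re
from collections import deque

# Hypothetical in-memory "web": each page maps to its HTML content.
pages = {
    "/home":        '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about":       '<a href="/home">Home</a>',
    "/blog":        '<a href="/blog/post-1">Post</a>',
    "/blog/post-1": "No links here.",
}

def crawl(seed: str) -> list[str]:
    """Visit pages breadth-first, queueing newly discovered links."""
    queue, seen = deque([seed]), {seed}
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)
        # Detect links on the page and add unseen ones to the crawl list.
        for link in re.findall(r'href="([^"]+)"', pages.get(url, "")):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

print(crawl("/home"))  # ['/home', '/about', '/blog', '/blog/post-1']
```

A real crawler also schedules revisit frequency, respects robots.txt, and notes dead links, as the paragraph above describes.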

Google doesn’t accept payment to crawl a site more frequently, and we keep the search side of our business separate from our revenue-generating AdWords service.


Indexing

Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their location on each page. In addition, we process information included in key content tags and attributes, such as Title tags and ALT attributes. Googlebot can process many, but not all, content types. For example, we cannot process the content of some rich media files or dynamic pages.

Serving results

When a user enters a query, our machines search the index for matching pages and return the results we believe are the most relevant to the user. Relevancy is determined by over 200 factors, one of which is the PageRank for a given page. PageRank is the measure of the importance of a page based on the incoming links from other pages. In simple terms, each link to a page on your site from another site adds to your site’s PageRank. Not all links are equal: Google works hard to improve the user experience by identifying spam links and other practices that negatively impact search results. The best types of links are those that are given based on the quality of your content.

In order for your site to rank well in search results pages, it’s important to make sure that Google can crawl and index your site correctly. Our Webmaster Guidelines outline some best practices that can help you avoid common pitfalls and improve your site’s ranking.

Google’s Did you mean and Google Autocomplete features are designed to help users save time by displaying related terms, common misspellings, and popular queries. Like our google.com search results, the keywords used by these features are automatically generated by our web crawlers and search algorithms. We display these predictions only when we think they might save the user time. If a site ranks well for a keyword, it’s because we’ve algorithmically determined that its content is more relevant to the user’s query.

Search Engine Optimization (SEO)

Doctrine According to Google

SEO is an acronym for “search engine optimization” or “search engine optimizer.” Deciding to hire an SEO is a big decision that can potentially improve your site and save time, but you can also risk damage to your site and reputation. Make sure to research the potential advantages as well as the damage that an irresponsible SEO can do to your site. Many SEOs and other agencies and consultants provide useful services for website owners, including:

  • Review of your site content or structure
  • Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript
  • Content development
  • Management of online business development campaigns
  • Keyword research
  • SEO training
  • Expertise in specific markets and geographies

Keep in mind that the Google search results page includes organic search results and often paid advertisements (denoted by the heading “Sponsored Links”) as well. Advertising with Google won’t have any effect on your site’s presence in our search results. Google never accepts money to include or rank sites in our search results, and it costs nothing to appear in our organic search results. Free resources such as Webmaster Tools, the official Webmaster Central blog, and our discussion forum can provide you with a great deal of information about how to optimize your site for organic search.

Before beginning your search for an SEO, it’s a great idea to become an educated consumer and get familiar with how search engines work. We recommend starting here:

If you’re thinking about hiring an SEO, the earlier the better. A great time to hire is when you’re considering a site redesign, or planning to launch a new site. That way, you and your SEO can ensure that your site is designed to be search engine-friendly from the bottom up. However, a good SEO can also help improve an existing site.

Some useful questions to ask an SEO include:

  • Can you show me examples of your previous work and share some success stories?
  • Do you follow the Google Webmaster Guidelines?
  • Do you offer any online marketing services or advice to complement your organic search business?
  • What kind of results do you expect to see, and in what timeframe? How do you measure your success?
  • What’s your experience in my industry?
  • What’s your experience in my country/city?
  • What’s your experience developing international sites?
  • What are your most important SEO techniques?
  • How long have you been in business?
  • How can I expect to communicate with you? Will you share with me all the changes you make to my site, and provide detailed information about your recommendations and the reasoning behind them?

While SEOs can provide clients with valuable services, some unethical SEOs have given the industry a black eye through their overly aggressive marketing efforts and their attempts to manipulate search engine results in unfair ways. Practices that violate our guidelines may result in a negative adjustment of your site’s presence in Google, or even the removal of your site from our index. Here are some things to consider:

    • Be wary of SEO firms and web consultants or agencies that send you email out of the blue.
      • Amazingly, we get these spam emails too:

        “Dear google.com,
        I visited your website and noticed that you are not listed in most of the major search engines and directories…”

        Reserve the same skepticism for unsolicited email about search engines as you do for “burn fat at night” diet pills or requests to help transfer funds from deposed dictators.

    • No one can guarantee a #1 ranking on Google.
      • Beware of SEOs that claim to guarantee rankings, allege a “special relationship” with Google, or advertise a “priority submit” to Google. There is no priority submit for Google. In fact, the only way to submit a site to Google directly is through our Add URL page or by submitting a Sitemap, and you can do this yourself at no cost whatsoever.
    • Be careful if a company is secretive or won’t clearly explain what they intend to do.
      • Ask for explanations if something is unclear. If an SEO creates deceptive or misleading content on your behalf, such as doorway pages or “throwaway” domains, your site could be removed entirely from Google’s index. Ultimately, you are responsible for the actions of any companies you hire, so it’s best to be sure you know exactly how they intend to “help” you. If an SEO has FTP access to your server, they should be willing to explain all the changes they are making to your site.
    • You should never have to link to an SEO.
      • Avoid SEOs that talk about the power of “free-for-all” links, link popularity schemes, or submitting your site to thousands of search engines. These are typically useless exercises that don’t affect your ranking in the results of the major search engines — at least, not in a way you would likely consider to be positive.
    • Choose wisely.
      • While you consider whether to go with an SEO, you may want to do some research on the industry. Google is one way to do that, of course. You might also seek out a few of the cautionary tales that have appeared in the press, including this article on one particularly aggressive SEO: http://seattletimes.nwsource.com/html/businesstechnology/2002002970_nwbizbriefs12.html. While Google doesn’t comment on specific companies, we’ve encountered firms calling themselves SEOs who follow practices that are clearly beyond the pale of accepted business behavior. Be careful.
    • Be sure to understand where the money goes.
      • While Google never sells better ranking in our search results, several other search engines combine pay-per-click or pay-for-inclusion results with their regular web search results. Some SEOs will promise to rank you highly in search engines, but place you in the advertising section rather than in the search results. A few SEOs will even change their bid prices in real time to create the illusion that they “control” other search engines and can place themselves in the slot of their choice. This scam doesn’t work with Google because our advertising is clearly labeled and separated from our search results, but be sure to ask any SEO you’re considering which fees go toward permanent inclusion and which apply toward temporary advertising.
    • What are the most common abuses a website owner is likely to encounter?

One common scam is the creation of “shadow” domains that funnel users to a site by using deceptive redirects. These shadow domains often will be owned by the SEO who claims to be working on a client’s behalf. However, if the relationship sours, the SEO may point the domain to a different site, or even to a competitor’s domain. If that happens, the client has paid to develop a competing site owned entirely by the SEO.

Another illicit practice is to place “doorway” pages loaded with keywords on the client’s site somewhere. The SEO promises this will make the page more relevant for more queries. This is inherently false since individual pages are rarely relevant for a wide range of keywords. More insidious, however, is that these doorway pages often contain hidden links to the SEO’s other clients as well. Such doorway pages drain away the link popularity of a site and route it to the SEO and its other clients, which may include sites with unsavory or illegal content.

    • What are some other things to look out for?

There are a few warning signs that you may be dealing with a rogue SEO. It’s far from a comprehensive list, so if you have any doubts, you should trust your instincts. By all means, feel free to walk away if the SEO:

      • owns shadow domains
      • puts links to their other clients on doorway pages
      • offers to sell keywords in the address bar
      • doesn’t distinguish between actual search results and ads that appear on search results pages
      • guarantees ranking, but only on obscure, long keyword phrases you would get anyway
      • operates with multiple aliases or falsified WHOIS info
      • gets traffic from “fake” search engines, spyware, or scumware
      • has had domains removed from Google’s index or is not itself listed in Google

If you feel that you were deceived by an SEO in some way, you may want to report it.

In the United States, the Federal Trade Commission (FTC) handles complaints about deceptive or unfair business practices. To file a complaint, visit: http://www.ftc.gov/ and click on “File a Complaint Online,” call 1-877-FTC-HELP, or write to:

Federal Trade Commission
Washington, D.C. 20580

If your complaint is against a company in a country other than the United States, please file it at http://www.econsumer.gov/.

More Background SEO According to Google

Paid Links

Doctrine According to Google

Help us maintain the quality of Google search results.

We work hard to return the most relevant results for every search we conduct. To that end, we encourage site managers to make their content straightforward and easily understood by users and search engines alike. Unfortunately, not all websites have users’ best interests at heart. Some site owners attempt to “buy PageRank” in the form of paid links to their sites.

Google uses a number of methods to detect paid links, including algorithmic techniques. We also welcome information from our users. If you know of a site that buys or sells links, please tell us by filling out the fields below. We’ll investigate your submissions, and we’ll use your data to improve our algorithmic detection of paid links.


More Background SEO According to Google


Webmaster Tools

Doctrine According to Google

Improve your site’s visibility in Google search results. It’s free.

Google Webmaster Tools provides you with detailed reports about your pages’ visibility on Google. To get started, simply add and verify your site and you’ll start to see information right away. Learn more »

Get Google’s view of your site and diagnose problems
See how Google crawls and indexes your site and learn about specific problems we’re having accessing it.

Discover your link and query traffic
View, classify, and download comprehensive data about internal and external links to your site with new link reporting tools. Find out which Google search queries drive traffic to your site, and see exactly how users arrive there.

Share information about your site
Tell us about your pages with Sitemaps: which ones are the most important to you and how often they change. You can also let us know how you would like the URLs we index to appear.


More Background SEO According to Google


Link Schemes

Doctrine According to Google

Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links influences your ranking. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity. Any links intended to manipulate a site’s ranking in Google search results may be considered part of a link scheme. This includes any behavior that manipulates links to your site, or outgoing links from your site. Manipulating these links may affect the quality of our search results, and as such is a violation of Google’s Webmaster Guidelines.

The following are examples of link schemes which can negatively impact a site’s ranking in search results:

  • Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link
  • Excessive link exchanging (“Link to me and I’ll link to you”)
  • Linking to web spammers or unrelated sites with the intent to manipulate PageRank
  • Building partner pages exclusively for the sake of cross-linking
  • Using automated programs or services to create links to your site

Here are a few common examples of unnatural links that violate our guidelines:

    • Text advertisements that pass PageRank
    • Links that are inserted into articles with little coherence, for example:
      most people sleep at night. you can buy cheap blankets at shops. a blanket keeps you warm at night. you can also buy a wholesale heater. It produces more warmth and you can just turn it off in summer when you are going on france vacation.
    • Low-quality directory or bookmark site links
    • Links embedded in widgets that are distributed across various sites, for example:
      Visitors to this page: 1,472
      car insurance
    • Widely distributed links in the footers of various sites

Note that PPC (pay-per-click) advertising links that don’t pass PageRank to the buyer of the ad do not violate our guidelines. You can prevent PageRank from passing in several ways, such as:

  • Adding a rel="nofollow" attribute to the <a> tag
  • Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
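Since a `rel="nofollow"` attribute is what stops an ad link from passing PageRank, it is worth auditing your own pages for paid links that are missing it. Here is a minimal sketch using only Python's standard-library HTML parser; the page snippet and URLs are invented for illustration:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect outgoing links, split by whether they carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

# Hypothetical page fragment: one editorial link, one sponsored link.
html = '''
<a href="/about">About us</a>
<a href="http://sponsor.example.com" rel="nofollow">Sponsor</a>
'''
auditor = LinkAuditor()
auditor.feed(html)
print(auditor.followed)    # → ['/about']  (passes PageRank)
print(auditor.nofollowed)  # → ['http://sponsor.example.com']  (correctly marked)
```

Any paid link that turns up in the `followed` list is the kind of link the guidelines above warn about.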

The best way to get other sites to create relevant links to yours is to create unique, relevant content that can quickly gain popularity in the Internet community. The more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it. Before making any single decision, you should ask yourself: Is this going to be beneficial for my page’s visitors?

It is not only the number of links you have pointing to your site that matters, but also the quality and relevance of those links. Creating good content pays off: Links are usually editorial votes given by choice, and the buzzing blogger community can be an excellent place to generate interest.

If you see a site that is participating in link schemes intended to manipulate PageRank, let us know. We’ll use your information to improve our algorithmic detection of such links.

More Background SEO According to Google


Google Webmaster Guidelines

Doctrine According to Google

Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the “Quality Guidelines,” which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google’s partner sites.

When your site is ready:

Design and content guidelines

    • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
    • Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.
    • Keep the links on a given page to a reasonable number.
    • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
    • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
    • Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images. If you must use images for textual content, consider using the “ALT” attribute to include a few words of descriptive text.
    • Make sure that your <title> elements and ALT attributes are descriptive and accurate.
    • Check for broken links and correct HTML.
    • If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
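Several of the checks in the list above — descriptive ALT attributes in particular — can be automated. The following sketch, using Python's standard-library HTML parser, flags `<img>` tags with missing or empty ALT text; the file names are made up for illustration:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Flag <img> tags that lack descriptive ALT text."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not (attrs.get("alt") or "").strip():
                self.missing_alt.append(attrs.get("src", "(no src)"))

checker = AltChecker()
checker.feed('<img src="logo.png">'
             '<img src="chart.png" alt="Monthly traffic chart">')
print(checker.missing_alt)
# → ['logo.png']
```

A similar parser could count links per page or verify that important text is not locked inside images.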

Technical guidelines

    • Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
    • Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
    • Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
    • Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it’s current for your site so that you don’t accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you’re using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.
    • Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google’s AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
    • If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
    • Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.
    • Monitor your site’s performance and optimize load times. Google’s goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve. Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let’s Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.
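You can test the robots.txt advice above without waiting for a crawler: Python's standard library ships a parser that answers "may this user-agent fetch this URL?" exactly as a well-behaved bot would. The robots.txt content and URLs below are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for www.example.com:
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /private/

User-agent: Googlebot
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches its own group, so only /search is off-limits to it:
print(rp.can_fetch("Googlebot", "http://www.example.com/products"))     # → True
print(rp.can_fetch("Googlebot", "http://www.example.com/search?q=x"))   # → False
```

Running a check like this before deploying a new robots.txt is a cheap way to avoid accidentally blocking the Googlebot crawler, as the guideline warns.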

Quality guidelines

These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here. It’s not safe to assume that just because a specific deceptive technique isn’t included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.

If you believe that another site is abusing Google’s quality guidelines, please let us know by filing a spam report. Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. While we may not take manual action in response to every report, spam reports are prioritized based on user impact, and in some cases may lead to complete removal of a spammy site from Google’s search results. Not all manual actions result in removal, however. Even in cases where we take action on a reported site, the effects of these actions may not be obvious.

Quality guidelines – basic principles

    • Make pages primarily for users, not for search engines.
    • Don’t deceive your users.
    • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you, or to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
    • Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.

Quality guidelines – specific guidelines

Avoid the following techniques:

    • Automatically generated content
    • Participating in link schemes
    • Cloaking
    • Sneaky redirects
    • Hidden text or links
    • Doorway pages
    • Scraped content
    • Participating in affiliate programs without adding sufficient value
    • Loading pages with irrelevant keywords
    • Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware
    • Abusing rich snippets markup
    • Sending automated queries to Google

Engage in good practices like the following:

    • Monitoring your site for hacking and removing hacked content as soon as it appears

If you determine that your site doesn’t meet these guidelines, you can modify your site so that it does and then submit your site for reconsideration.

More Background SEO According to Google


SEO Experts

Some of the SEO “experts” we encounter exhibit less than tasteful attributes. Having been inspired by The Oatmeal, I’ve taken the liberty of illustrating some of the bad SEO professionals.

I understand that I might be slightly hypocritical in this post. I’ve found myself displaying some of the characteristics below.

Big Words/Acronyms SEO

You may have caught yourself using jargon that goes over your clients’ heads. These SEO Experts may know the trade, but they lack the ability to communicate effectively with clients. If your client gives you a blank look, ask whether something you’re saying is unclear.

Your CMS is really bad for SEO. In fact I don't see a robots.txt or an XML sitemap. I recommend a mod_rewrite for your non-www version and use of the canonical tag. Your keyword density looks spammed and you have no ALT tag descriptions. No wonder you aren’t at the top of the SERPs.

Sparkling, White Hat SEO

Don’t get me wrong, white hat is the way to go in SEO. However, some incompetent SEOs may claim that their lack of linkbuilding is due to high moral principles. They put on a facade of “White Hat.” These SEOs either don’t get linkbuilding or they don’t want to put forth the effort to do it. Of course there are plenty of gray/black hat ways of building links. But other linkbuilding methods such as submitting a press release, creating quality linkbait, etc. are legitimate and great for SEO.

Just give me a few years and I'll get your website to the top. I'm completely white, sparkly hat. My SEO campaigns are based on moral principles. Link building is just not for me and my posse. If you have to work for links, then you're crossing the integrity line. Most of what I do is on-page... actually all I do is on-page optimization. But hey, I'm really gooood at it.

Conversion Expert Turned SEO

The Conversion Expert SEO is less common. These SEOs mistakenly believe that SEO is nothing more than user experience. They carry the label of “Internet Marketer,” but they only talk about conversion and design. The idea of increasing traffic rarely crosses their minds.

Search marketing is a lot about user experience. So you've got 10 visitors a month? With my SEO experience, I'll boost your conversion rate by 10%. I'm sure we can convert at least one of them.

Self-Proclaimed Social Media Expert

The Social Media Expert may have been an SEO who was drawn into the hype of Social Media. He tells the world that he is a Social Media Expert because he knows how to use Facebook and Twitter. He frequently talks about “going viral,” although he has no idea how to go viral. And sadly, he measures his effectiveness based on the number of followers and friends he acquires.

We HAVE to get an account with every social network... I've already signed you up for SocialBunny, Diggggg.com, and Wuphf.com. If you get lots of accounts you'll get lots of links. Oh yeah, you're going to want a lot of followers and friends... it will definitely help your SEO.

IT Guy

SEO is an IT thing, right? Wrong. Unfortunately, the in-house IT guy thinks that SEO belongs to the IT department since it involves algorithms and coding. While SEO may require someone with a bit of HTML skill, it belongs in the marketing department, and more specifically with a “Search Team.”

Well, I just googled SEO and I found out that keyword density is REALLY important. I’m just going to be adding our keyword, 'Organic Kitty Chow,' all over our website and in the keywords meta tag. We don’t need an SEO firm -- SEO is an IT thing. I’ll take care of it.

Bad Grammar Offshore SEO

We get plenty of emails and comments filled with garbled, machine-translated grammar. Do you really want someone who’s using a translation tool to do your keyword research? These SEOs may be cheap, but they are rarely effective.

I has like blogging, and I always comment on other blogs to get links. I says things like 'Your write good posts, I am bookmarking this blog for return.' These kinds comments always get approval. SEO is really a lot as surgery. I has got thousands of comment links with my softwares. Please make business with me.

Black Hat SEO

The Black Hat SEO should be avoided like the plague. He may get you fast results, but those results will quickly decline after Google notices the spammy tactics around your website. Black Hat SEOs will always take the easy way out and often do more damage than good.

Give me your business and I'll get you to the top tomorrow. I've got plenty of doorway pages and I spin just about every article out there. Who needs to actually write unique content anymore? Don't worry about my techniques, they always work. I've only had a few hundred websites banned.
