SEO Friendly WordPress in 12 Steps

WordPress is fantastic, free, and open source. It can be installed and up and running in a matter of minutes. However, out of the box, WordPress is not as SEO friendly as you might think.
This guide will show you the necessary steps to optimize a WordPress website, focusing on modifications that apply to SEO.

Note: These steps and screenshots were created using WordPress version 3.5.


Step 1: Privacy Settings During Development

One of the last configuration settings in the infamous 5 minute install is the Privacy setting. It is best to uncheck this box during development. This blocks your site from being indexed by the major search engines.

The last thing you want is Google to show your Sample page in the search results. When you are ready to officially launch, come back and check this box.

Search engine visibility WordPress setting

If you accidentally installed WordPress with this setting checked, don’t worry. You can adjust this setting at anytime under Settings >> Reading.

DON’T FORGET TO REMOVE THIS SETTING ON GOLIVE DAY!

Step 2: Permalink Settings – Search Engine Friendly URLs

You can configure search engine friendly URLs (SEF URLs) by selecting the Custom Structure option and adding the following parameter in the text box.

Custom permalinks
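The exact parameter appeared in the screenshot; as a sketch, the custom structure most commonly used for SEF URLs is the post name on its own, which produces URLs like the example below (example.com is a placeholder):

```text
Custom Structure:  /%postname%/

Resulting URL:     http://example.com/seo-friendly-wordpress/
```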

Step 3: Rename the Uncategorized Category

I cannot tell you how many blogs I read and notice that the blog is carelessly categorized under Uncategorized. Let’s fix that.

Visit your Categories setting page accessible from the Posts menu. Hover over the Category and select Quick Edit. Rename this to some sort of catch-all name that will suit any blog topic if you, your clients, or guest bloggers accidentally forget to select a category for their post. By renaming the Uncategorized category, you don’t have to adjust your default category under the Writing panel.

[Screenshot: renaming the Uncategorized category]

Step 4: Customized Page Titles & Meta Tags With the All in One SEO Pack Plugin

By default, WordPress will use your page or post title as the HTML page title tag value and doesn’t bother to include the meta description tag or meta keywords tag. There are a few other plugins that can help you customize these tags, like the Yoast plugin, however, I like the simplicity of the All in One SEO Pack plugin.

All in One SEO plugin

After you get the All in One SEO Pack plugin installed, head to the admin page. They conveniently remind you if you haven’t configured your settings yet.

[Screenshot: All in One SEO Pack admin notice]

The admin page will allow you to configure global settings for the plugin as well as define the page title, meta description, and meta keywords tag settings for your homepage specifically.

  • Enable the plugin.
  • Enter your irresistible page title. Create an enticing page title using your targeted key phrases and keep it under 50 characters. Google truncates anything past 50 characters.
  • Enter an encouraging meta description. This is your chance to get searchers to click your search result over the others in the SERP. The optimal length is over 50 characters and under 146 characters.
  • Enter your targeted key phrases. Google publicly stated that they ignore this tag, but Yahoo still uses it as well as other search engines.

Custom page title and meta tags

Delete “ | %blog_title%” from the following fields. If you leave the default setting, your page and post titles will end up too long and cause an error. They should look like this:

[Screenshot: post and page title format settings]
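The formats appeared in the screenshot; after deleting the suffix, the fields would read something like this (these are All in One SEO Pack's standard format placeholders, which may differ slightly by plugin version):

```text
Page Title Format:  %page_title%
Post Title Format:  %post_title%
```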

This is a cool feature added in a recent plugin update. You can enter your Google+ profile URL and the plugin will add the rel="author" link on all of your pages and posts for Google Authorship. This is how you get your profile picture to show up in the SERPs. (Note: You must add your site to the Contributor To section of your Google+ profile in order to complete the two-way authentication.)

[Screenshot: Google+ profile URL setting]

The first two checkboxes below will allow you to have control over the meta keywords used for your blog posts (not pages). You will use the Tags field when creating a post to define your keywords.

Allow the search engines to index your Category pages, but not your Archives and Tag Archive pages. If you index them all, you end up with a bunch of low-quality pages in the index.

Finally, I like to write my own meta descriptions, so I uncheck that last box. However, if you are passing this blog off to your client, or have guest bloggers posting on the site, you might want to leave the Autogenerate Descriptions checkbox checked, as they most likely will not write descriptions on a regular basis.

[Screenshot: noindex settings for archives and tag pages]

Now that you have the main settings customized, let me show you what the panel looks like when you are working on a Page. (Note: Ignore the character suggestions below and take a look at my pro tips referenced earlier. Not all search engines will acknowledge that many characters.)

[Screenshot: custom meta tags panel on a Page]

Step 5: Reduce 404 Errors With the Category Pagination Fix Plugin

There is a bug in WordPress where you will end up with 404 errors when trying to access any Category page beyond the first page. If you include the Category Pagination Fix plugin as part of your install process, you can prevent Google and Bing webmaster tools from ever reporting those 404 errors. There is no additional configuration; simply activate and move on.

Step 6: Remove Unnecessary Links to Images With the Remove Link URL Plugin

[Screenshot: Remove Link URL plugin]

By default, WordPress automatically links each image you upload and insert to itself. This is unnecessary and results in a dead end if a user clicks on an image: the image opens on its own, leaving the user with no navigation and forcing them to click the browser’s back button.

Linking your images to themselves might be necessary if you have some sort of Lightbox pop-up feature. Let’s kill the link for now. Install and activate the Remove Link URL plugin. No additional configuration required.

Step 7: Autogenerated HTML & XML Sitemaps

[Screenshot: Google XML Sitemaps plugin]

You can have your web pages and blog posts included in the search index much faster if you provide the crawlers or bots access to HTML and XML Sitemaps. I recommend the Simple Sitemap plugin to generate the HTML Sitemap and the Google XML Sitemaps plugin to generate the XML Sitemap.

Just use the basic configurations on these two plugins, nothing fancy required unless you are an advanced XML Sitemap expert. After you have these two Sitemap plugins installed and you have officially launched your site, you will want to submit your XML Sitemap to Google & Bing. I have created and posted an extensive step-by-step XML Sitemap submission guide on my blog that will help you with that process.

Step 8: Google Analytics for WordPress

[Screenshot: Google Analytics for WordPress plugin]

There are a ton of ways to add Google Analytics website tracking code to your website; I personally like the Google Analytics for WordPress plugin because it has the option to ignore Administrators.

Configuring this option allows Administrators to work on the site without skewing your reports. (Note: If you are developing your site on a testing server, you will want to wait until you launch the site before activating this plugin.)

Either use the button or manually enter your UA code to authenticate with Google

[Screenshot: Google Analytics authentication]

Click the checkbox to show advanced settings

[Screenshot: show advanced settings checkbox]

Use the drop down list to select Administrators

[Screenshot: ignore Administrators drop-down]

Step 9: Responsive Theme

When you are selecting a premium theme from ThemeForest.net, ElegantThemes.com, TemplateMonster.com, or any of the other hundreds of theme directories, consider selecting a theme with a “responsive” design. A responsive theme will automatically adjust the layout, menu items, and images to render nicely on desktops, laptops, tablets, and mobile devices.

Google suggests a responsive design to encourage a better online experience (and they’ll give you a boost in the SERPs if you do so). Google has also been testing out an addition to the SERPs. When users are searching on mobile devices, Google will show a mobile icon indicating to the searcher that those sites are mobile friendly. Make sure your website is as accessible as possible.

[Screenshot: responsive WordPress theme]

Step 10: Review & Optimize Your Theme Code

After you have selected your WordPress theme, you will want to analyze the theme’s code for technical optimization errors. I like to use the SEO Doctor Firefox add-on to get a quick look at my pages.

With this Firefox tool, you can quickly get a review of website optimization opportunities. We cannot expect awesome web designers who create amazing themes to be SEO experts. A common flaw in WordPress themes is a logo wrapped in an H1 tag, which causes multiple H1 tags to be present on every page.
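As an illustration of that flaw, here is simplified markup (not taken from any particular theme):

```html
<!-- Common theme flaw: the logo is wrapped in an <h1> on every page -->
<h1 class="site-logo"><a href="/">Example Site</a></h1>

<!-- Better: reserve the <h1> for each page's main heading -->
<div class="site-logo"><a href="/">Example Site</a></div>
<h1>SEO Friendly WordPress in 12 Steps</h1>
```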

Here is an example of what the SEO Doctor add-on will show you:

[Screenshot: SEO Doctor results]

Step 11: Optimizing Your .htaccess file

The .htaccess file is the first set of code the server reads before it loads any web pages, and it is where you can define a set of rules. This file is very powerful and can also crash your site in a split second if it contains any errors, so be careful: back up and test.

Canonicalization: Force the www or non-www Version of Your URL

Make sure that your website is not able to load both the www and the non-www version of your URL. Google will count each web page, even though the two versions are technically the same, as two different pages and give your site a duplicate content penalty. You can force one version over the other by adding a few lines of code to your .htaccess file. The redirect must be a 301 permanent redirect; a simple meta tag or JavaScript redirect will not work.

I personally like to use the non-www version because it’s shorter. The example code below can be added to your .htaccess file and will force the non-www version. If you want the www version, here is a link to example code.

[Screenshot: .htaccess 301 non-www redirect code]
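The original example was shown as an image; a typical mod_rewrite rule for forcing the non-www version looks like this (example.com is a placeholder for your own domain):

```apache
# 301 redirect www.example.com to example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```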

Increase the Speed of Your Site by Caching Your Website Files

[Screenshot: mod_pagespeed]

mod_pagespeed speeds up your site and reduces page load time. This open-source Apache HTTP server module automatically applies web performance best practices to pages, and associated assets (CSS, JavaScript, images) without requiring that you modify your existing content or workflow.

Check with your hosting provider to ensure mod_pagespeed is supported and if it is, activate it. Then add the following code to your .htaccess file.

[Screenshot: mod_pagespeed .htaccess code]
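The original snippet was shown as an image; a minimal sketch that turns the module on only when your host has it loaded looks like this:

```apache
# Enable mod_pagespeed if the module is available
<IfModule pagespeed_module>
    ModPagespeed on
</IfModule>
```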

Force Trailing Slash on URLs

Another trick to avoiding a duplicate content penalty is to force a trailing slash on all of your URLs. You can do so by adding the following code to your .htaccess file.

[Screenshot: force trailing slash .htaccess code]
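The original code was shown as an image; a commonly used version of the rule, which skips real files so images and scripts are not redirected (example.com is a placeholder):

```apache
# 301 redirect any URL without a trailing slash to the slashed version
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.+)/$
RewriteRule ^(.*)$ http://example.com/$1/ [R=301,L]
```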

301 Redirect URLs

Oftentimes, WordPress replaces a static HTML website as an upgrade. You will need to properly 301 redirect all of the old URLs to the new ones so you don’t end up with a bunch of 404 page-not-found errors. A quick way to get a list of URLs already indexed by Google is to search “site:yourdomain.com”. And sometimes you just need to redirect a broken link.

Here is sample code to properly redirect URLs using a 301. (Note: Notice the first part of the code does not include the full URL, but the second part does.)

[Screenshot: .htaccess 301 redirect code]
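The sample was shown as an image; a sketch matching the note above, with a site-relative old path and a full new URL (both paths and the domain are placeholders):

```apache
# Old URL: site-relative path only; new URL: full address
Redirect 301 /old-page.html http://example.com/new-page/
Redirect 301 /services.html http://example.com/services/
```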

Step 12: Optimizing your robots.txt file

Crawlers or bots will scan web pages on your site for inclusion in the search index, but they will check your robots.txt file first for any instructions. The User-agent: * specifies that all crawlers are welcome.

You can help the crawlers learn about all of the pages on your site by telling them where your Sitemap is located. You can also tell the crawlers which pages and directories you do not want included in the search index.

I like to Disallow a few directories that are standard in every WordPress install. There is no need for any content within those directories to ever show up in the search engine results pages (SERPs). Block them using the example code below.

Optimize robots txt file
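The original file was shown as an image; a sketch of a typical robots.txt along those lines (the Disallow entries are the standard WordPress directories of that era, and the Sitemap URL is a placeholder):

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /cgi-bin/

Sitemap: http://example.com/sitemap.xml
```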

Final Thoughts

These WordPress website optimization tips only scratch the surface. There are hundreds of amazing plugins to further optimize your site. Additionally, each website is unique and might require different SEO strategies.

I’d be curious to see what SEO tips you might have to offer to optimize WordPress. Use the comments section below to share your suggestions. If you have any WordPress or SEO questions enter them below and I’ll respond as soon as possible.

Viral video basics

A long time ago in a galaxy far, far away, the gold standard of marketing was to have an ad during the Super Bowl that people would talk about around the water cooler when they got to work the next day.

That last sentence is sort of like that old kid’s game “How many things can you find wrong in this picture?” Let’s see:

  • People don’t really talk around a water cooler that much anymore.
  • People don’t wait a day to talk about a hot subject, either. Instead, they discuss it as it happens on Facebook chat, or text, or email, or …
  • More than a few people don’t go to work from 9 to 5 anymore, instead relying on mobile technology to allow them to work at home, on the road, inside a Starbucks, or …
  • People also don’t wait until the actual game to view the commercials, as many such ads are “leaked” online well before the game to create buzz.

So, yes, the rules of marketing have changed, and radically. Why? It’s the same thing that has changed everything from how books are sold to how health care is delivered: the Internet.

As a result, the gold standard of marketing today isn’t the funny Super Bowl ad, but rather the viral video. Create one of those and you’ve struck Internet oil.

All of which, of course, raises the question: Can you make a viral video on purpose, or is it something that just happens naturally — sort of like taking a video of your sons and posting it online with the title “Charlie Bit My Finger.” Can you capture that magic and disseminate it on purpose?

Yes, it turns out you can.

Or maybe a more accurate analysis is that you can create the context for a video to go viral, give it your best shot, and then cross your fingers and hope your video is sprinkled with fairy dust.

Here’s how:

Create the right sort of video

Why don’t my small-business videos go viral? Because I didn’t create them that way, nor did I intend for them to be viral videos. They are simple business how-tos on my site TheSelfEmployed. They serve their purpose.

Viral videos are different. Viral videos tend to fall into a few different categories. So if you want to create a video that might get spread around the Internet like the Ebola virus, the first thing to do is make sure yours fits one of these categories:

Funny: A video begins to go viral when people not only watch it, but tell their pals about it, too — sharing it via email, Facebook, YouTube, text, Twitter, whatever. And what causes people to want to share a three-minute clip? It’s because they think their friends will like it/laugh at it/be amazed by it too, or they think they will get some street cred by sharing it or discovering it first.
The first way to make that happen is by posting a video that cracks people up. How do you do that? That’s your job, amigo — I’m just sharing the process.

Weird/quirky: When I first saw “PSY – Gangnam Style,” I, like many, couldn’t take my eyes off the screen. It was just so … different. Weird works. Quirky kills. Posting an odd video ups the chances of yours going viral because odd is memorable — and especially in this e-world, you have to do something different to get noticed.
By the same token, larger companies have been able to get their videos to go viral because they have a budget to create a special effect that amazes at first sight, like an outfielder scaling a wall to catch a ball.

Cute: Kitten videos are all the rage because kittens are just so damn cute.
Outrageous: That a stupid movie about the Prophet Muhammad could cause rioting half a world away shows the power of posting something outrageous online. Now, if you are in the small-business world, outrageous is probably not what you want. But it does need to be noted that outrageous is a viral player.
Poignant/insightful: A professor’s last lecture. Steve Jobs speaking at graduation. Sincere insight is another tried-and-true way to position a video for virality.
Post it right
Once you have the video, the trick is to choose a thumbnail that’s captivating and a title that’s intriguing.

Distribution
As indicated, having the right video is only a small part of the viral video game. There are tons of great videos out there that no one sees. So how do you get people to see yours? There are strategies large and small:

Begin with your network: Once posted, share your video with those in your extended network and ask them to share it with theirs. Tweet it. Post it. Blog it. Email it.
 
Share it with influencers: There are bloggers out there with huge audiences. Writers with many readers. Sites with millions of visitors. Your job is to get your video in front of them and get them to share it with their followers and fans.
Finally, remember that viral videos tend to be short (three minutes or so) and make people feel something — they laugh, they cry, they get mad, they worry. 

And then they share.

What is online authority?

Google is quickly learning how to mirror offline authority.

In the offline world how often we talk is less important than what we have to say. Offline authority is based on your ability to influence, to change people’s mind-set, to inspire people to take action, and to change the world.

The AuthorRank algorithm will likely help Google to reflect offline authority better in the future because an online article’s status will also be influenced by its author’s standing.

What can make you influential? A few examples:

  • Discovering the God particle or a cure for malaria;
  • Sharing details of your dysfunctional life that speak strongly to your audience;
  • Being insanely helpful and inspiring your audience to change their minds or to take action.

The last example is — of course — a content marketer’s territory. An influential content marketer is both incredibly helpful and persuasive. He’s passionate about his field and generously shares his knowledge. But he might be divisive, too. He might speak strongly to a certain tribe of people, while others may not like his writing. His writing has personality and a recognizable voice.

How can you become an influential content marketer?

Steal ideas from different sources

All creative work builds on what came before.
~ Austin Kleon

The web feels like a giant echo chamber. Everything has been said, right?

It may feel impossible to create original content, but you can certainly do it. You can stop regurgitating the same information; and create unique content.

How?

Create your own mix of sources. Don’t steal all your ideas from one or two bloggers. Steal from sources inside and outside your niche. Steal from writers, philosophers, scientists, musicians, and everyone else who inspires you.

Smart content marketers have a swipe file with inspirational quotes, fascinating ideas, and other stuff they like. Don’t use your swipe file to imitate. Don’t outright copy. Give credit to your sources, and let yourself be inspired by a multitude of ideas.

Develop your own voice

When I first started out in cartooning, I used to copy Hagar the Horrible. And my work was a replica of Hagar the Horrible. But then I added other cartoons, like Dennis the Menace, Calvin and Hobbes etc. And my work became my own.
~ Sean D’Souza

Who are your heroes? Which writers or bloggers speak directly to your soul?

To create your own voice, analyze the writing styles and techniques of your favorite writers:

  • Why do their headlines grab your attention?
  • How do they draw you into their blog post?
  • How do they structure their posts?
  • How do they inspire you with their closing paragraphs?
  • How do they use copywriting techniques like analogies, trigger words, and cliffhangers?
  • How do stories make their articles memorable?
  • How do they bond with their readers?
  • Why are they fascinating?

You craft your own voice by studying your heroes and stealing their techniques. Don’t slavishly copy but learn from them. Pick the techniques you like from each of your heroes to create your own personal style; and let your passion shine through.

Become a better writer

The easiest way to create new content is recycling existing information.

Change a how-to article into a how-not-to article. Turn a how-to into a list post. Turn your list post into an infographic. Combine two (or three) posts into a new post.

Recycling provides a quick way to create loads of content, but it won’t make you a better writer; and it probably limits your potential to be influential. You need to push yourself to get better each time you write. You need to try new writing techniques, tackle more challenging topics, and link different ideas together.

To become a better writer you need deliberate practice. You need to step up the quality of your content; and fully engage your brain when writing. Introduce new analogies. Tell different stories. Try another angle or structure. Take your time to experiment. Quality content requires time.

You need to get out of your comfort zone to become a better writer.

Write crappy first drafts

Almost all good writing begins with terrible first efforts.
~ Anne Lamott

Combining new ideas and writing original content is hard; it requires writing, rewriting, and rethinking.

Of course, exceptions exist, but for most of us, writing original content is difficult. You might be beating yourself up because you’re not writing fast enough; and because you’re not producing enough content. But this idea that you need to speed up your writing could be killing your creativity.

Writing crappy first drafts is normal — especially if you’re pushing yourself outside your comfort zone. But how do you deal with crappy first drafts? Look for these issues:

  • Your best idea is often buried. Try and find it, and move it to the beginning of your article.
  • Your draft is heading in myriad directions. There’s too much going on. Go back to your one big idea: How do you want to inspire people? Cut everything that’s irrelevant.
  • Bad grammar, spelling mistakes, and redundant sentences. Don’t worry about this until you’ve sorted out the flow of your content.

Writing a crappy first draft may feel pointless. Why write something you’re going to change or delete? Don’t worry. Crappy first drafts can lead to something magical.

Create inspirational conclusions

How can you influence people?

Stop thinking you’re merely sharing information. You’re creating content to inspire people.

Too often, online articles fizzle out because the author gets tired and lacks inspiration. Rather than write your conclusion last, why not write it first?

Writing inspirational paragraphs that teach is one of the most important skills a persuasive content marketer can develop. Study how your heroes inspire you, and steal their techniques.

The truth about building authority

Writing influential content is tough. It requires hard work, creative thinking, and original writing. And most of all it requires passion for your subject.

Don’t just echo what others are saying. Have your own voice. Dare to be different. Research like crazy. Speak up. Share your opinion. Back up your opinion with studies and statistics. Let your personality shine through.

Remember: one epic post can have more impact than one hundred recycled posts. Stop acting like a blogging machine. Quit guest posting like a mediocre maniac. Instead, take your time to rethink and rewrite your posts.

Write less. Read more.
Talk less. Listen more.

Quick SEO Wins – What Can You Do With an Hour?

In many ways, 2012 was the year search engine optimization (SEO) really grew up. Google left us with little choice. But we evolved as an industry, striving to build high-quality content and focus on adding as much value as possible for our users, readers, and customers.

There was a lot of pain along the way throughout this growth process, and many are still cleaning up from the aftermath. For example, the gaps in Google’s algorithm have been closed to the point that any quick SEO tactic is likely to be just that — a short-term fix at best, a potential cause of long-term damage at worst.

If you want to win in 2013, you must commit to a solid long-term strategy. However, that’s not to say you can’t build small wins into your long-term strategy to assist in developing brand strength.

The only real way to beat the well-known online brands is to become one yourself. But just because the more traditional SEO tactics from previous years are now areas to avoid in 2013, doesn’t mean there aren’t any quick wins out there.

Research and Analysis Wins


  1. Make a cup of tea and read through Google’s patents to figure out where they’re going next!
  2. Not a fan of tea? Take your client for a beer instead. Find out what else they’re working on outside of your campaign. There’s a good chance you’ll find information that can be of great assistance.
  3. Spend time with your dev team. Get to know them and figure out their difficulties and bottlenecks.
  4. Build a checklist scorecard of how well you know outreach targets, and map out stages of how you can get to know them better.
  5. Find the top bloggers in your local city or region. Figure out where they hang out so that you can meet them (at events; don’t stalk them on the way home).
  6. Even better, arrange your own meetup and invite key bloggers and journalists.
  7. Run an industry survey to collect data, which can then be used for content production.

Google+ and Authorship Wins


  1. Set up Google Authorship for your site, or if you have already, verify that it’s working.
  2. Arrange a seminar to demonstrate how your team can link up their Google+ author profiles.
  3. Encourage your team to build up their Google+ profiles and share other content.
  4. Have a blog? Find orphaned posts from ‘guest bloggers’ and move them under the relevant author profiles where you can match them up; do the same for ex-staff members.

Events and Sponsorship Wins


  1. Find local sports teams and events you can sponsor.
  2. Submit a speaker pitch for an industry event.
  3. Create a content plan around liveblogging an industry event.

 

Content Strategy and Planning Wins


  1. Be creative – brainstorm new product ideas to make them link-worthy.
  2. Acquire a company!
  3. Get sued! (Ok, maybe don’t.)
  4. Open new content angles by relating your niche to a different topic.
  5. Dig into your data – look over old company reports, white papers, etc. for key statistics.
  6. Schedule a team brainstorm to develop new content ideas.
  7. Share a Google Calendar mapping out your website content plan with auto-send reminders to the people involved with different areas of content creation.
  8. Aim to split opinions with your content and promote it to both audiences for a reaction. It doesn’t have to be controversial, as long as there is no right or wrong answer.

 

Relationship-Building Wins


  1. Follow up! Make the effort to stay in touch with key contacts via social, email, phone, and face-to-face.
  2. Become friends with key influencers and meet them in person – invite them to lunch or arrange a meetup.
  3. Reuse great outreach connections and build on-going relationships.
  4. Provide customer testimonials for your software, product, and service providers.
  5. Ask influencers to add you to their partner pages.
  6. Get your users involved. Build relationships and brand interaction by rewarding loyal fans.
  7. Ask your friends and family to link to you if they have personal websites.

 

Content Production Wins


  1. Take photos and make them available under a Creative Commons license.
  2. Write topical content about a trend from that day.
  3. Interview experts within your niche (ego bait).
  4. Craft an exclusive content pitch for a leading authority website.
  5. Focus on one piece of great content, instead of four average ones.
  6. Develop a content ideation plan for an infographic using data from within your company.
  7. Answer common questions asked within your industry. This can be a great method for brainstorming content ideas.
  8. Set up Google Alerts for these questions so that when they are asked on blogs or forums you can reply with your opinion (and link to your content).
  9. Create and syndicate video content to target new audiences.
  10. Offer a discount promotion to get people talking, and so they are picked up on promo-code websites.
  11. Create a great 404 error page. Your users will love it, and it might get you some links for creativity!

 

Comment Marketing Wins


  1. Make the effort to reply to comments on all of your content (posts on your blog, guest posts, news coverage, etc.) to help build relationships.
  2. Find an active forum within your niche to start contributing and building profiles.
  3. Find two targeted blogs; comment where you can add value, and subscribe or follow them and their key writers on Twitter.

 

Local SEO Wins


  1. Take photos of your office or store, and upload a minimum of six high-quality photos to help your local listing stand out.
  2. Ensure all of your company addresses are registered and up to date on Google+ Local.
  3. Design business cards that encourage customers to review your brand on Google, TripAdvisor, Yelp!, etc. Offer them a next-time discount incentive.
  4. Create landing pages with local intent. Submit them to Google Local if you have multiple addresses.
  5. Take a visit to your local library and dig deep into historical information about your local town or city. You might find a gem that hasn’t been written about online, which is great for picking up local citations and links. Michael Dorausch has some great tips in a local review of Tampa.
  6. Create a list of power reviewers within your sector and look to get on their radar.
  7. Make sure you’re listed on the key local providers, such as Yell, ThomsonLocal, ReachLocal, etc.

 

Blogger Outreach Wins


  1. Find two of the most authoritative bloggers in your niche and figure out the best way to connect. In all likelihood, they’ve added that to their “About” page.
  2. Pick up the phone and speak to top influencers.
  3. Hire writers within your niche and leverage their contacts for outreach.
  4. Spend your hour carefully crafting a great content pitch and make it personal and original.
  5. Give bloggers and journalists a product they can use or test in exchange for a review (use with caution, and make this well-targeted and selective).
  6. Write for authority sites within your niche. Build strong relationships and strengthen your reputation by leveraging the audiences of well-respected industry blogs.
  7. Get your client to create a company email address for you with their domain so it’s clearer when you’re sending emails out on their behalf.

 

Penguin and Panda Penalty Review Wins


  1. Perform a content performance ratio analysis to figure out how much content you need to clean up.
  2. Build a list of top sites you want to remove links from.
  3. Clean up your own internal anchor text from over-optimisation.
  4. Reduce cross-linking from other sites you own. Link from partner pages, rather than sitewide.
  5. Clean up any links from your social profiles and author bios.

 

On-Site Optimisation Wins

  1. Prioritise a list of key actions, and assign responsibilities and deadlines.
  2. Install SEO WordPress plugins to optimise your blog.
  3. Optimise page title tags. It’s the oldest quick SEO win in the book, but it still works!
  4. Test pay-per-click (PPC) ad copy as title/meta description to lift organic click-through rate.
  5. Review navigation structure such as breadcrumbs and internal linking.

 

Productivity Wins

  1. Use tools. I could fill another 96 points here, but I suggest discovering which two or three make you more efficient. Then spend the time to really get to know how to use them fully.
  2. Start to document your delivery process. This will save you a lot of time in the future when training your team.
  3. Run a knowledge share session with your internal team and/or client. Try to make sure they know what you do, then you don’t have to do it all yourself!
  4. Learn how to get things done. Watching this video takes 45 minutes — you’ve still got 15 minutes left. So you’re instantly more productive!

 

Link Removal Wins

  1. Analyse your backlinks to find off-topic, poor anchor text links.
  2. Analyse the market to find the percentage of exact-match anchor text for key competitors.
  3. Clean up any obvious paid or over-optimised links.

 

Link Reclamation Wins

  1. Run the Google query “Brand Name” “Key Person’s Name” to find mentions, then ask to be credited with a link within the article if one isn’t already given.
  2. Google Image search for your infographics or photos. Ask for link credits where these are not provided.
  3. Find broken links pointing out of your site with this tool. This is another easy fix to clean up, and it makes your site look good.

 

Competitive Analysis Wins

  1. Research competitors’ top pages in OpenSiteExplorer to get content ideas.
  2. Review brand traffic history in analytics to measure the impact of offline brand signals.
  3. Think of creative ways you can get more people searching for your brand by joining up with offline advertising.

 

Public Relations Wins

  1. Bring your PR and social teams together to understand what they are doing and how they can help each other by working more closely together.
  2. Sign up for PR service HARO (Help a Reporter Out) to become a link source for news articles.
  3. Build a list of key journalists and media contacts you want to influence.
  4. Enter relevant industry and local awards competitions.
  5. Brainstorm ideas to connect offline with online tactics. How can you get more people searching for you (sending brand signals to Google)?

 

Technical Wins

  1. Create an XML Sitemap, submit it in Google Webmaster Tools, and keep it up to date.
  2. Check how your website displays on mobile and tablet devices, and plan to create platform-specific sites if you don’t already have them.
  3. Fix homepage duplicate content: www vs. non-www.
  4. Check for external duplicate content.
  5. Make sure the geo-targeting for your international site is set up correctly in Google Webmaster Tools.
  6. Optimise your site for mobile. Do it in less than an hour with a plugin.
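For win #3 above (www vs. non-www), one common fix is a sitewide 301 redirect so only one version of every URL gets indexed. Here is a minimal sketch, assuming an Apache server with mod_rewrite enabled and `yoursite.com` as a placeholder domain:

```apache
# .htaccess sketch: 301-redirect all non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

The permanent (301) redirect tells the engines that www.yoursite.com is the canonical version, consolidating link equity onto one set of URLs.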

 

Analytics and Measurement Wins

  1. Start developing a strategy around your top 20 PPC spend/converting keywords.
  2. Find your top converting landing pages. Optimise them and update the content to attract new links.
  3. Ensure all your analytics goals are in place to measure both micro and macro conversions so you can report on revenue, leads, and ROI to your boss and/or client.
  4. Talk to your clients and learn about their main business.
  5. Finally…spend your hour writing a report making a business case for the value of SEO to get extra resources allocated. Because really — an hour’s just not enough!

How to Rank #1 on Google and Impress Nerdy Chicks!

Lesson 7

Becoming an SEO EXPERT

seo-expert

Once you’ve finished this book, which is the first portion of this SEO course, you should consider yourself an Intermediate SEO.

So far, this book has taught you a great deal. You now know all about keywords: what they are, how to find them, and where to place them. You’ve learned the basics and the importance of incoming links, and you now know *more* about the nuances of ‘Link Popularity’ than probably any of your competitors. In fact, just knowing that some links are better than others, and how to tell the difference between natural and artificial link structures, puts you way ahead of some of the so-called “Pros” in the SEO business – believe it! You also know how to evaluate the quality of an incoming link, and that there are certain types of Web pages you should avoid linking to because they can hurt your rankings. You even know what the ultimate, very best link looks like: ideally, the only link on the homepage of an authoritative, high-PageRank site. And although such a perfect “10” link isn’t often realistically achievable, this standard gives you the litmus test against which you can evaluate all other links. You know about buying links: that purchasing some links can get you penalized by Google while others will help you rank better, and that where those links appear on the linking page matters!

You are now armed with at least 18 actionable linking sources and strategies:

  1. Links from mainstream directories like Yahoo and DMOZ.
  2. Specialty directory links within your niche market.
  3. Professional services links from suppliers and merchants with whom you do business.
  4. Links from Associations where you are an alumnus or a member.
  5. Links placed within your published ‘expert articles’ and syndicated to other sites.
  6. Press Releases that get indexed and build links.
  7. Links coming from testimonials you place on other people’s sites.
  8. Links from your forum comments you place on other sites.
  9. Links negotiated from learning who’s linking to your competitors.
  10. Reciprocal links, when they are on topic, make sense and look natural.
  11. Blogging to attract links.
  12. Buying abandoned Web sites with pre-existing links.
  13. Buying ads in Ezines and newsletters to build long-lasting links.
  14. Special ways to procure those coveted links from .edu and .gov domains.
  15. Providing specialized tools and resources to attract links.
  16. Using your affiliate program to gain links.
  17. Social networking to build links.
  18. Linkbaiting via social media – viral marketing.

You know the importance of hiring (or being) a good writer who can ‘write’ your Web pages to the top of the search listings, and how to smartly syndicate those articles without running into duplicate content issues.

You’ve learned:

  • That the number of links on a page matters.
  • All about run-of-site links: what they are and how they can hurt your ranking efforts.
  • The importance of maintaining consistency in your link structure.
  • How anchor text dictates which keywords Google and the others *think* your pages are about.
  • Why getting links to your ‘deep’ pages is better than having links only to your homepage.
  • What types of links to avoid.
  • That PageRank measures a site’s importance as reported by the Google Toolbar.
  • That there’s a rough 80/20 ‘link balance’ factored into a page’s importance ratings.
  • That it takes MORE than a great site with great content to dominate the top rankings; it takes calculated strategy and implemented tactics to build links to that great site with great content. Only then can you realistically expect to dominate the top rankings!

In Lesson 4 you learned about the indispensable SSI tool that evaluates the strength of a site in seconds. You learned how to spot the weak sites you can beat in the rankings, as well as how to identify sites that are too strong to compete with for very specific keywords. You also learned that there are safe and unsafe SEO practices, and that some so-called “brand name” sites are whitelisted, working under a different set of rules.

Lesson 5 gave you everything you need to know about choosing the right domain name. You learned that the right name can not only help your rankings but also get you more clicks from people who recognize the keywords they’re searching for within your URL; these are trusted domains from the SEPOV (search engine point of view). You also learned which types of domain names to avoid (.info, .biz, too long, too many dashes, double dashes) because they look spammy and the engines tend not to trust them.

In Lesson 6 you learned the details of how to build a search-friendly Web site while avoiding the mistakes that will handicap your rankings. You learned about a specialized file (robots.txt) and the importance of managing the spiders that crawl your site looking for content to index. You learned that it is VERY bad to change URLs or move Web pages, so planning your site architecture is crucial. And you learned that if you DO have to move a site, you’d better study the corresponding Advanced SEO Tutorial that teaches you EXACTLY how to do it so you don’t lose all of your hard SEO work to date. Assuming you’ve studied your lessons well, you have all of this knowledge, and much more, at your fingertips to apply to your Web site, your SEO business, or both!

All of this knowledge does, indeed, elevate your level to *at least* Intermediate SEO! Now it’s time for you to move forward… to the Advanced SEO Level.

Becoming an Advanced SEO

seo

As you know, Lessons 3, 4, and 6 of this book referred you to Advanced SEO tutorials; there are 22 of them, and they are now ALL available to you inside the membership area of Cadabra.co.za. These are the tutorials you will need to *master* before you’ll confidently call yourself an Advanced SEO. They contain the fine-tuning that will give you the confidence to take command of your own Web site’s search engine optimization efforts. Then, with a little experience under your belt, we suspect you’ll feel comfortable working on other people’s websites. These are the tutorials that, once mastered, qualify you to even *charge a fee* for your SEO consultation services if you desire to do so. You may be surprised to learn how much demand there is for truly good SEO services and advice these days.

Becoming an SEO Expert

Four things will carry you from Advanced SEO to SEO Expert:

  1. Experience – You gain this by doing.
  2. Success – You gain this over time, after you begin doing.
  3. Competitive Knowledge – The knowledge and competitive intelligence typically gathered and compiled by the SEO ‘tools of the trade’ that professionals use zealously. These tools immensely accelerate the number-crunching of site analysis and the recognition of trends, as you build your own professional SEO intuition.
  4. Advanced Mastery – The additional mastery of PPC marketing is, arguably, the final feather in the Expert SEO’s graduation cap. Yes, some SEOs focus only on ‘organic’ rankings while others specialize in nothing but PPC; a great many do both, successfully coordinating the two in tandem. Whichever path you take, you’ll have the building blocks in place to excel in PPC marketing once you’ve built your foundation on ‘organic’ SEO strategies, tactics, and techniques.

Lesson 6

Site Architecture: Making Your Web Site Easy for Search Engines to Index

Website Architecture

Now that you know the importance of Keywords, Links, Domain Names, and Competitive Analysis, it’s time to understand how your Web site should be set up so that search engines will find it and list it within their index. After all, the search engines can’t rank your pages at the top of the search results if they don’t know about them. By the way, indexing is what search engine spiders are doing when they crawl a Web site to collect data about your Web pages; they store that data in a database called an index.

Processing and storing a Web page is referred to as indexing. Therefore, you must ensure your Web pages are as easy as possible for search engines to access and index. Believe it or not, many Web sites are inadvertently configured to be difficult for search engines to access. And some Web sites are actually blocking search engines while their owners wonder why the site isn’t doing better in the search rankings!

Little do they know it’s because of the indexing-roadblocks they’ve inadvertently placed in the way.

So, pay close attention because here’s where you will learn:

  1. How to avoid the common mistakes that keep Web pages from being indexed, and
  2. How to ensure that all of your important pages get indexed properly to give them a good chance to rank at the top of the search results.

Remember, though: making your site search engine friendly won’t, by itself, propel you to the top of the rankings. A search engine friendly site is more about avoiding the mistakes that prevent you from getting indexed or that damage your search engine rankings. To achieve a top rank, you must also understand the critical roles that keywords, inbound links, overall site strength, and the rest play, and use that understanding to your advantage when designing your site.

There are two important points to remember about search engines and how they relate to your Web site:

  1. The first point is: the quality of your site counts. Search engines make their money through advertising. Showing ads to their users is their profit model, and the more users they have, the more money they make. The way a search engine gets more users is by providing the best search results. This means that if your site is the most useful site for customers in your keyword category, then the search engines want to rank you at or near the top of the search results. Indeed, their revenue stream depends on it.
  2. The second point is: search engine spiders are really just computer programs. More precisely, search engines run a program called a spider that:
  • visits your Web site,
  • reads the text and links on your Web pages,
  • then decides what to do with your Web pages based on the information it reads.

They call this activity crawling your Web site. So, search engine spiders are computer programs that crawl Web pages. And, if you’ve ever used a computer, you know that computer programs break sometimes, especially if you overtax them. You may have noticed that your own computer starts to slow down and may even crash if you have too many applications open. It’s the same with a search engine spider.

If your Web site is laid out in a confusing and disorganized fashion, or if the links between your pages are difficult for a search engine spider to find, your site is not going to be crawled as efficiently as you would like. This means that some of your pages will get missed. It also means your site won’t be crawled very often and your listings won’t be fresh. That puts you at a disadvantage when it comes to getting new pages indexed; and if your pages don’t make it into the index, they certainly can’t be ranked highly.

Remember, there are billions of Web pages on the Internet, and search engines have to make the most of their available time and resources to crawl all those pages. It’s your job to make sure crawling your pages is quick and easy for the search engine spiders. Otherwise, you risk having your Web pages ignored by the search engines.

Remember, this lesson is focused on making your site spider-friendly. The tactics and strategies covered here won’t rocket you to the top of the search engines (you’ll need to use incoming links and keyword strategies to do that), but they will help you avoid the mistakes that can nuke your rankings by locking you inside the starting gate. In other words, if it’s difficult for search engine spiders to crawl your site, you’ll be handicapped in the ranking race regardless of all your other good efforts.
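The spider’s crawl loop can be sketched in a few lines of Python. This is a toy model only: `FAKE_WEB` is a stand-in dictionary of pages and their outgoing links, not a real HTTP fetch, and the URLs are placeholders.

```python
from collections import deque

# Toy link graph standing in for real pages and their outgoing links.
FAKE_WEB = {
    "http://yoursite.com/": ["http://yoursite.com/sitemap", "http://yoursite.com/about"],
    "http://yoursite.com/sitemap": ["http://yoursite.com/muumuu/blue/large"],
    "http://yoursite.com/about": [],
    "http://yoursite.com/muumuu/blue/large": [],
}

def crawl(start_url):
    """Breadth-first crawl: download a page, scan it for links, queue them."""
    seen = {start_url}
    queue = deque([start_url])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)                   # "index" the page
        for link in FAKE_WEB.get(url, []):  # scan the page for links
            if link not in seen:            # skip pages already queued
                seen.add(link)
                queue.append(link)
    return order

print(crawl("http://yoursite.com/"))
```

A real spider adds politeness delays, robots.txt checks, and URL normalisation, but the download–scan–queue cycle is the same: pages the loop can’t reach by links never get indexed.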

Keep Your URLs Simple

Search engine spiders find the pages on your Web site by following links. They work much like you do with your browser, only far more quickly: they download a page, scan it for links, and store those links in a list. Once they’re done scanning the page, they grab the first link from the list and repeat the steps until they’ve followed all of the links one by one. Of course, this is a simplified explanation, but it essentially defines the process of how a search engine finds Web pages.

Many Web sites, especially e-commerce sites, use dynamically generated URLs.

These are Web addresses that look something like: 

http://domain.com/product-id5-cust8&+8=7?gtf.

These dynamic URLs are automatically generated by pulling variables out of a database to match the product specifications a customer is looking for. Dynamically generated URLs usually contain lots of non-alphanumeric characters, such as ?, &, +, and =.

For example, a site that sells Hawaiian muumuus might have a page with the following dynamically generated URL:

http://yoursite.com/index.php?item=muumuu&color=blue&size=large

This is opposed to a static-looking URL, which is a bit easier on the eyes:

http://yoursite.com/muumuu/blue/large

Although most search engine spiders are capable of crawling these long and confusing, dynamically generated URLs, it is best to avoid using them if you can. All else being equal, a Web site with short, static-looking URLs is likely to get more pages indexed by the search engines than a comparable site that produces dynamically generated (DG) Web pages. Dynamic URLs are also a common source of duplicate content, which can affect your site’s ranking.

Many content management systems (CMS) that use dynamic URLs make the same content reachable via multiple different URLs. It is important to keep those duplicate pages out of the index if possible. You may need to use the canonical tag to tell the search engines which URL to index and which ones to ignore; for more information, read Are You Using the Canonical Tag Correctly? However, there are times when the advantages of DG pages outweigh the SEO drawbacks. So, IF your site absolutely must rely on dynamically pulling content from a database to create its URLs, it’s still possible to have your URLs appear static by using mod_rewrite, a tool that rewrites a dynamic URL as a static URL on the fly. This is commonly done for SEO purposes to make pages easier for spiders to navigate. When you are ready to apply this Advanced SEO tactic, be sure to study our extensive mod_rewrite

Advanced SEO Tutorial:

Getting Your Dynamic Sites Completely Indexed with mod_rewrite

By the way, if the tutorial mentioned above seems a bit too complicated for you, then share it with your web or tech people and have them handle the details of turning your dynamic URLs into search engine (and people) friendly web addresses. This is something you definitely want to get right! And, even though it’s a little complicated, it’s worth the effort: it magically renders your complex, ugly-looking dynamic URLs into simple links that search engine spiders and PEOPLE just love to follow. In essence, you get to eat your cake and have it too.
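As a rough illustration of what such a rewrite looks like (a sketch only, assuming an Apache server with mod_rewrite enabled and the muumuu example from earlier), a rule like this maps the static-looking URL onto the dynamic one behind the scenes:

```apache
# .htaccess sketch: serve /muumuu/blue/large from the dynamic script
RewriteEngine On
RewriteRule ^([^/]+)/([^/]+)/([^/]+)/?$ index.php?item=$1&color=$2&size=$3 [L,QSA]
```

Visitors and spiders see http://yoursite.com/muumuu/blue/large, while the server quietly runs index.php?item=muumuu&color=blue&size=large.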

Meta Tags: Do They Matter?

meta-tags-seo

Meta tags are non-displayed text written into the source code of your HTML document, intended to describe your page to search engines for the purpose of cataloging its content. A considerable amount of undeserved attention has been given to Meta tags, and an enduring myth has evolved in the process.

  • Meta Tag Myth: Meta tags are what propel pages to the top of the search engines (wrong!).
  • Meta Tag Reality: Meta tags, while potentially useful for describing your Web page contents to a web browser or search engine, have no appreciable effect on actual search engine rankings whatsoever. None!

There is a semi-absurd ongoing debate as to whether or not Meta tags should be included in your HTML document at all. Let’s put that debate to rest once and for all. Surprisingly, the answer is a resounding YES! Here’s why.

While it is true that Meta tags will not help your rankings, it is also true that the Meta description tag should absolutely be included in every Web page document that you want described on the search engine results page. That’s because many search engines use the Meta description tag as the summary description for your page when it is listed in the search results. The contents of the Meta description tag are often the sub-headline and the sales description for your link! (Remember: your <title> tag is the headline displayed in the search results.) The Meta description tag, when displayed in the search engine results, helps the searcher decide whether or not your page is relevant to their search. It’s what compels a real person to click your link. After all, that’s the reason for being listed by the search engine in the first place!

If you omit the Meta description tag, the search engine is likely to fabricate a description for your site based on arbitrary text gleaned from somewhere on your page. Here’s an example of a terrible, yet real-life, search engine results description we found when searching for Hawaii scuba diving:

  • Link Title: Scuba Diving Maui Hawaii
  • Summary description: click to go home

Now, we’re pretty sure this company didn’t really want click to go home used as their page description, but that’s what they got because they failed to use a Meta description tag. Another possibility is that the search engine will omit the summary description entirely if it fails to find anything useful within your page to use as a summary. In either case, a potential site visitor is less motivated to click your link if you fail to properly utilize the Meta description tag.

Hence, in every case where you want a description for your link within the search engine results, be certain to include a relevant and enticing Meta description tag.

The following example illustrates the HTML source code located at the very beginning of a very basic Web page. Below you can see the Meta description tag, and its contents, highlighted:

<html><head>
<title>Cell Phone Accessories</title>
<meta name="description" content="The latest in cell phone accessories at the lowest prices for every known brand of cell phone on the planet!">
</head>

The only other Meta tag you may hear discussed is the keywords meta tag, which is no longer observed or used by the major search engines:

<html><head>
<title>Cell Phone Accessories</title>
<meta name="keywords" content="cell phones, Leather Cases, Cellphone holders, Antennas, antennaes, chargers, batteries, face plates, flashing batteries, hands free head phones, headphones, range extenders, bateries">
</head>

We do not recommend wasting any time on the meta keywords tag. The only time you should be concerned with it is when you’re working on an old Web site that has spam-laden meta keywords tags; if you encounter these, remove them, as they can still be seen as a technique used by spammers. Oh, and by the way, NONE of the other Meta tags have any effect on search engine rankings whatsoever; they never have and probably never will, no matter what you’ve heard!

How to Customize the Way Your Listings Appear in Google

Google now provides a way for you to customize how your listings appear in the search results. Previously, you were limited to just titles and descriptions, but now it’s possible to get star ratings, product images, prices, business addresses and more included with your search results listing. For example, take a look at this cafe listing from Yelp.com

yelp

You can see star ratings, number of user reviews and average price range for a meal. Google is displaying this data using a feature they call Rich Snippets—information extracted from indexed pages that have special tags embedded into their HTML code. Those tags come in two forms, microformats and RDFa.

While this might sound complicated, these formats are about as easy to master as regular HTML. And, although developers haven’t yet settled on a standard, you can use either microformats or RDFa (we find microformats a little easier). You then denote certain data on your pages by wrapping it in tags with descriptive class attributes. For example, to create the listing above, you would wrap portions of your page’s data in tags that describe that data, as seen below.
<div class="hreview">
  <span class="fn">Cafe Cakes</span>
  <span class="rating">4</span> out of 5.
  <span class="count">28</span> reviews.
  <span class="pricerange">$</span>
</div>

Wrapping everything in an hreview div tag lets Google know it’s a review. Then you use spans with the fn (name), rating, count, and pricerange classes to add the other information. So far, we’re seeing these Rich Snippet listings just for restaurants and cafes, but Google is working on rolling them out to more categories.

Google provides examples and tutorials on Rich Snippets for the following:

  • Reviews
  • People
  • Products
  • Businesses and organizations

Currently, business directories and other sites built around reviewing and categorizing businesses stand to gain the most from adding Rich Snippets to their pages. However, as Google expands this program, it’s likely to become relevant to many other types of Web sites as well. In general, listings enhanced with Rich Snippets can expect a higher click-through rate, so we highly recommend them.

Be Careful with Session IDs and Dynamic URLs

Session IDs are unique identifiers, often embedded in URLs, that allow a Web site to track a customer from page to page. For example, when shopping at an ecommerce site, session IDs are used to keep track of the items in your shopping cart. For search engine spiders, however, session IDs can cause a problem: they can inadvertently create a huge number of links for the spider to crawl. The danger is that the spider might repeatedly index what is essentially the same Web page over and over, trapped in a loop as each newly crawled page dynamically generates even more links to follow. They call this a ‘spider trap.’

Here’s how a system that uses session IDs can give the appearance of generating an endless number of pages within a single site. For example, a link with session ID tracking that looks like…

http://www.yoursite/shop.cgi?id=dkom2354kle03i

…is served to the spider when it first downloads one of your Web pages. That page is then processed, but when the spider returns to the site to download more pages, it finds another URL that looks like:

http://www.yoursite/shop.cgi?id=hj545jkf93jf4k

It’s actually the same page, only with a different session ID tracking variable. But to the spider it looks like a brand-new URL, so the spider can get trapped downloading the same page over and over again. This can also result in duplicate content getting indexed, which can lead to a reduction in ranking. Although Google is constantly striving to improve its ability to crawl session IDs, we recommend you avoid using them whenever possible; when you must use them, avoid giving search engine spiders access to them.

The best plan is not to use session IDs until you actually need to track the state of your customer, such as when they add items to their shopping cart. You can also store your session IDs in cookies instead of your URLs; most web applications can be configured to store user state in cookies. And, once again, if this sounds complicated, have your web or tech people handle this element of your Web site architecture. What YOU need to know is that the more dynamic variables you include in your URLs, the harder it will be for search engines to index your pages. Strive to keep your URLs simple and free of dynamic elements.
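To see why the two URLs above are really “the same page,” you can normalise them by stripping the session parameter; this is the kind of de-duplication a crawler, or your own link-audit script, might do. A Python sketch, assuming the session parameter is named id, sid, or sessionid (adjust to your own site):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Assumed session-parameter names; these vary from site to site.
SESSION_PARAMS = {"id", "sid", "sessionid"}

def strip_session(url):
    """Remove session-tracking query parameters so duplicate URLs collapse."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# The two "different" URLs from the example collapse to the same page:
a = strip_session("http://www.yoursite/shop.cgi?id=dkom2354kle03i")
b = strip_session("http://www.yoursite/shop.cgi?id=hj545jkf93jf4k")
```

After stripping, both URLs reduce to http://www.yoursite/shop.cgi, which is exactly how a spider should treat them.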

Sitemaps: What, Why, and How

First, let’s start with the simple fact that there are two different types of sitemaps. According to Bing, the difference between Sitemap and sitemap is:

Sitemap, the capitalized version, refers to the XML-based files created specifically for the search engine crawlers. This version of the Sitemap provides the crawlers the “most important pages and directories within their sites for crawling and indexing.”

sitemap, the lowercase version, is an HTML-based file that serves both the Web site user and the MSNbot. It’s essentially a simple, clean, organized list of all the pages on your Web site.

An HTML sitemap is an on-site Web page that links to all the other pages on your Web site. It ensures that any spider crawling your site can easily and quickly find and index all of your site’s Web pages. This type of sitemap is for spiders first and foremost, but it can also be useful for Web site visitors. By linking your homepage to your HTML sitemap, you ensure that each page on your site is only one click away from your sitemap and only two clicks away from your homepage. This is the optimum Web site structure in terms of making Web pages easy for the search engine spiders to find.
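For reference, a minimal XML Sitemap (the capitalized kind) looks like the following; the URLs and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.yoursite.com/muumuu/blue/large</loc>
  </url>
</urlset>
```

Save it as sitemap.xml in your site root and submit it in Google Webmaster Tools so the crawlers know where to find it.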

As you now know, search engine spiders find new pages by following links from pages already in their index. Thus, if you want a spider to crawl a new Web page, it needs to find a link to it from a page that is already indexed. However, unless you have a very small site, linking to every page on your site from your homepage would look messy and unprofessional to your customers. The HTML sitemap lets you accomplish this objective cleanly and professionally: it provides a list of links that Google’s spider can easily follow, helping you get new pages indexed without cluttering your home page with links (assuming, of course, that Google has already indexed your sitemap). So, by placing a link on your home page to the sitemap, and then links from the sitemap to the rest of your important pages, you make all of your site’s pages easy to find and index.

We’ve mentioned here the importance of linking to your HTML sitemap from your home page, but, for good measure, you should also place a link to your HTML sitemap on every single page on your site. That way, even if a search engine can’t reach your homepage for some reason, it can still easily access your sitemap and find all your other pages. By the way, this Web site architectural element should be considered standard operating procedure as the search engines themselves actually recommend you use a Sitemap to ensure that all of your Web pages get indexed.

HTML Sitemaps for Large Sites

If you have a large site, you may be wondering whether it’s better to create one large HTML sitemap or several smaller ones. There are a couple of factors to consider:

First, the degree to which search engines will index pages and follow links on a page is largely determined by the quality of the links pointing to that page (or to the site the page is on). If a site doesn’t have many incoming links, then it’s a good idea to make pages smaller and put fewer links on them. A good rule of thumb is to keep each page under 101k of HTML code, and to put no more than 100 links on a page. Popular sites can easily get more of their pages indexed, but to be safe, use 101k of HTML code (don’t count the file size of images) and 100 links as the upper limits. Therefore, if your entire site is fewer than 100 pages, and you can create an HTML sitemap page smaller in file size than 101k, then it’s beneficial to use only one sitemap that points a search engine spider to the rest of your site.

There is an advantage to having only one HTML sitemap placed within the root domain. It enables the search engine spider to find and index all of your pages without having to traverse your site any deeper than two links beyond your home page. That’s one link from your home page to your sitemap, then one more from your sitemap to every other page on your site. This makes it easiest for the spiders to find every page on your site. However, once an HTML sitemap approaches 100 or so links, or its file size approaches 101k (excluding images), then it’s time to start splitting up your sitemap into smaller ones. We’d suggest linking to each sitemap from all your pages. Five sitemaps, for example, would require five links from each page instead of one. The end result would be that a spider would still only need to follow a maximum of two links beyond your homepage to reach all your pages. If, for some reason, it isn’t practical to place all five links to your five HTML sitemaps on the home page, then we’d suggest a single link on the home page that points to a master sitemap which, in turn, contains the five links to the five smaller sitemaps. This would require the search engine spider to travel three links deep into your site to locate and index all of your pages, which is still quite good. And again, link to your master HTML sitemap from all your pages, not just your home page.

Finally, we recommend that you avoid forcing a spider to crawl any deeper than three links beyond your home page to locate the rest of your pages. Using the site structure outlined above should allow you to easily accomplish that objective.

XML Sitemaps: How to Get Your Difficult-to-Index Pages Fully Listed


You should carefully note the difference between an onsite sitemap (HTML sitemap) and an XML Sitemap. Your Web site should utilize both, as both are an important part of helping your site get, and stay, indexed by search engines. The regular HTML Web page sitemap (as explained above) is simply an on-site Web page that links to all the other pages on your Web site. It ensures that any spider crawling your site can easily and quickly find and index all of your site’s Web pages. On the other hand, the XML Sitemap (aka a Google Sitemap, although it’s used by Yahoo and Microsoft as well) is a special file that provides search engines with specific directives about what pages to crawl and how often. Search engines are not required to strictly obey these directives, but they do tend to use them as guidelines. This type of Sitemap is especially useful for very large sites that want to get all their pages listed. A great example of a large site that NEEDS a good XML Sitemap is an eCommerce site that wants to get its entire list of product pages indexed and listed in the search results.

Please note that neither the HTML sitemap nor the XML Sitemap plays any role in where your pages will rank. Both are simply vehicles for getting your Web pages indexed efficiently. Where your pages rank depends on your incoming links and other optimization factors. Bing also subscribes to the XML Sitemap protocol. You can submit your XML Sitemap to Bing, in the same format that you use to submit it to Google, by using their Webmaster Tools service. For more on the Sitemaps protocol and how it can help your pages get (and stay) indexed by the top three search engines, be sure to visit Sitemaps.org: http://www.sitemaps.org
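For reference, a bare-bones XML Sitemap following the Sitemaps.org protocol looks something like this (the URLs and dates are placeholders; only the <loc> element is required, the rest are optional hints):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/widgets.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Save it as sitemap.xml in your root directory and submit it through each engine’s Webmaster Tools.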

Regardless, a Google XML Sitemap is really no replacement for clean and crawlable URLs, so a tool like mod_rewrite still comes in handy if you are attempting to simplify your dynamic URLs. If this sounds complicated, hand it to your Web or tech people, who will probably tell you it’s actually pretty simple once you’ve done it. An important note: if your site is already ranked in the search engines, be very careful about changing your URLs. Carelessly modifying your URLs after your pages have already been indexed and ranked is one of the worst SEO mistakes you can make! If you fail to tell the search engine where to find the new location of a page, it will assume the page has disappeared and drop it from the index. Not good. So, if you do change any URL, you must redirect the old URL to the new location. Visitors and search engines looking for the old URL will then be automatically redirected (sent) to the new URL, preserving your search rankings while accommodating your site visitors.

Of course, in a perfect world, you’ll never need to move a Web page or Web site. However, if you must, then this procedure is critical to your success. Without it you risk causing grave damage to your Web site’s rankings, especially if your pages are already doing well in the search results. You could easily lose all of your rankings if you get this critical procedure wrong. You have been warned! Note that if you’re using mod_rewrite to rewrite your URLs, the 301 redirect can be added to your mod_rewrite code.
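On an Apache server, for example, the 301 redirect typically goes in your .htaccess file. A minimal sketch, with placeholder file names:

```apache
# Permanently redirect a single moved page (mod_alias)
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Or, if you already use mod_rewrite, add the R=301 flag to the rule
RewriteEngine On
RewriteRule ^old-page\.html$ /new-page.html [R=301,L]
```

Either way, visitors and spiders requesting the old URL are sent to the new one along with the “moved permanently” status code.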

How to Use Robots.txt for More Targeted Web page Indexing


Robots.txt is a tricky-sounding name for a simple text file that is placed in the root directory of your Web site. Its purpose is to provide crawling directions, specific to your site, to search engine spiders. In other words, your robots.txt file tells search engine spiders which pages NOT to index. A common misconception is that a robots.txt file can somehow be used to encourage search engines to crawl a site. Not true! Most pages are eagerly spidered by search engines without requiring additional encouragement. As you are probably now noticing, an important part of SEO is identifying the elements that cause indexing difficulties for the spiders and eliminating those problematic elements. So, why would you ever want to tell a search engine NOT to index some of your pages? Because search engine spiders function with limited time and resources when indexing sites. Your site will therefore be better served by focusing on getting your important content, product listings, and sales pages indexed.

Case in point: chances are good that you do NOT want a search engine to index your shopping cart. There is typically no benefit to you when your shopping cart checkout pages show up in the search engine results. Therefore, you would use your robots.txt file to make sure search engines don’t waste time indexing your shopping cart. That way they are more likely to spend their time on your site indexing your more important sales or informational content pages. Other pages you’ll want to keep search engine spiders away from include anything in your cgi-bin folder, as well as directories that contain images or otherwise sensitive company data. Whenever there isn’t any benefit to having a Web page (or image) displayed in the search results, you should forbid the spiders from indexing it by placing the appropriate command within your robots.txt file.
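For example, the robots.txt for the hypothetical store described above might look like this (the directory names are placeholders; use the paths that actually exist on your site):

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /cgi-bin/
Disallow: /images/
```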

That will not only help focus the search engine’s resources on your important pages, but will also provide the useful side benefit of protecting your site from hackers who may otherwise use search engine results to acquire sensitive information about your company or site. Search engine spiders are typically voracious about indexing anything they can find on the web, including sensitive areas like password files, so you must be careful. The robots.txt file can help you layer in some of the protection you need.

By the way, there’s one more issue to be aware of that relates to the robots.txt file. A surprising number of sites have inadvertently and unintentionally set up their robots.txt files to prevent search engine spiders from crawling any portion of their Web site (oops!). For example, the following two lines, when added to your robots.txt file, are enough to keep all major search engines from ever crawling your site. In other words, the following command tells ALL search engine spiders to simply go away:

User-agent: *
Disallow: /

This has been an area of confusion for some people. They use the wrong command and then wonder why they can’t find their site listed in the search engines. So, be very careful this doesn’t happen to you! If you decide that you want to block a specific search engine spider, put the name of the spider to block on the User-agent line, NOT the asterisk. The asterisk (*) is a wildcard meaning all. The Disallow line is where you put the directory that should not be indexed, and the forward slash (/) indicates the root directory; in other words, your entire site. As you can see, the robots.txt directive above is a total shut-out of all search engines from your entire site. On the other hand, entries like this:

User-agent: *
Disallow: /cgi-bin/

…should (we say “should” because it’s technically optional for search engines to obey the robots.txt directives) prevent all URLs in the /cgi-bin/ directory from being crawled. Keep in mind that these directives are case sensitive. If you want the spiders to crawl every Web page they can find on your site, there is no need for a robots.txt file. The only time you actually need to use robots.txt is if you want to restrict the crawler from some portion of your site. Google’s Webmaster Tools provides a report which will tell you exactly which URLs Google has attempted to crawl on your site but was restricted from crawling by your robots.txt file.
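If you’d like to sanity-check a robots.txt rule before uploading it, Python’s standard library includes a small parser you can test rules against. A quick sketch, using a made-up rule set and URLs:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks only the /cgi-bin/ directory
robots_txt = """User-agent: *
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The /cgi-bin/ URL is blocked; a normal content page is not
print(parser.can_fetch("*", "http://www.example.com/cgi-bin/form.pl"))  # False
print(parser.can_fetch("*", "http://www.example.com/products.html"))    # True
```

This mirrors what Google’s robots.txt debugger does: it tells you whether a given URL is allowed or blocked by your rules before the spiders ever see them.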

You can access Google Webmaster Tools at: http://www.google.com/webmasters/sitemaps

To see this report, go to Google’s Webmaster Tools (sign up and register your site if you haven’t already), click the Diagnostic tab, then click the Web crawl link. Finally, click the report that says URLs restricted by robots.txt to see what pages Google is not indexing due to commands in your robots.txt file. Google’s Webmaster Tools also offers a special robots.txt debugger which allows you to test specific URLs to see if your robots.txt file allows or blocks spider access. If you’re having problems getting pages indexed, be sure to test those pages against your robots.txt file using Google’s tool and see if you have a statement blocking Google’s spider. If you do, Google will show you what line in your robots.txt the blocking statement is on.

Be Careful When Using Frames, JavaScript, and Flash


There are a lot of myths and misunderstandings about the way search engines handle Frames, JavaScript, and Flash pages. The fact is that Web pages using these formats can be optimized for search engines only in theory; each presents its own unique challenges and difficulties. As a general rule, they’re best avoided, since pages that don’t use them are much easier to optimize for search engines. However, if you find that you must use them, or if you’re optimizing a site that’s already built around these technologies, here’s what you need to know to minimize their disadvantages.

How Pros Use Frames and Still Rank at the Top

Ever since we can remember, the use of the frame tag (along with the frameset tag) has been a thorn in the side of SEOs. Frames are tricky to work with, and if not handled correctly they can kill your site’s chance of being crawled correctly and, therefore, of ranking well in the engines. Because of this, we’ve taken the stance that you shouldn’t use the frame tag unless you absolutely have to. The alternative we’ve suggested is the iframe tag, which is similar in nature and can generally be applied without issue. However, these two tags are different from each other, and we’ve decided to take a little time to re-evaluate them and their use within your SEO campaign. First off, frames are an HTML element that pulls content in from another URL onto the URL of your choice. It’s like copying everything on a page to mirror it on another page. Sometimes there are solid reasons to use frames on your Web site. Perhaps you have a legacy site that would take too much time and energy to change over. Perhaps you’re doing it for your affiliate campaigns. Regardless of why, if you simply must use the frame or iframe tags on your Web site, here are some guidelines to assist you in overcoming the potential disadvantages.

How Spiders Index Frames

These days Google and Bing are very good about correctly indexing content within <frameset></frameset> and <iframe></iframe> tags. However, the very important thing for you to understand is how the engines interpret the content. They crawl and index the non-framed content on the site as one page, and then crawl and index the framed content as an entirely separate page. That means they will not associate the framed content with the main page it’s being presented on. Think of the framed content like a large image: the img alt attribute tells search engines, and your visitors when the image doesn’t load, what the image is. There is a special tag called the noframes tag, which is designed to tell users and search engines what the framed content is when frames are disabled. Basic use is something like this…

<noframes>Put your keyword-rich frame describing content here.</noframes>

When using the <noframes> tag, ideally you want it as high up on the page as possible, containing the text and/or links about the framed page (written using keywords pertinent to your site). This content will then be readable by the search engine spiders as well as by people whose browsers do not support frames.

Note: It is very important that the <noframes> tag is outside of the <frameset> or <iframe> tags so the search engines can find it.

We also feel it’s important to remind you that we do not use frames at all. We feel that frames do not add anything to search engine findability, nor do they add to product sellability. We have even seen problems caused by the framed content getting indexed directly. When this happens, users land on a page that potentially lacks navigation or other necessary elements, making the page completely dysfunctional.

Understanding the frame tag compared to the iframe tag

We put together a sample template of a standard page using the frame tag to help display what areas will be indexed, and what won’t.

<html>
<head>
<title>Title text here will be read by spiders, regardless of frames</title>
<meta name="description" content="Text here will be read by spiders that read meta tags, regardless of frames. It is important to note that you should include a meta description summary of all the frames on this particular page.">
<meta name="keywords" content="Text inside the keywords tag is typically ignored by most search engines, regardless of frames">
</head>
<body>
<noframes>

Text here will be read by most search engine spiders just as if you were not using frames; this is the place to load descriptive text including your keywords. Make sure that the text is in a readable format, as it will likely get used in the search results for the snippet or description text.

</noframes>
<frameset cols="25%,50%,25%">
<frame src="frame1.html" />
<frame src="frame2.html" />
<frame src="frame3.html" />

Text here will also typically get indexed by most engines, but results may vary. The content within the three files referenced above (frame1.html, frame2.html, and frame3.html) will not be indexed along with this page.

</frameset>
<noframes>Text here will also typically be read and indexed by spiders.</noframes>
</body>
</html>

Also note that many search engines index the ALT attributes in the <IMG> tag. If your site mainly consists of graphics (bad idea!), you can also use the HTML ALT attribute to describe your page.

Now, we’ll take a moment to explain and show an example of the iframe tag.

The use of the <iframe> tag is increasingly common for embedding dynamic information and all sorts of widgets on a site. Facebook’s “Like” button widget is an excellent example that uses the <iframe> tag to do its magic. What many don’t realize is that most of these embedded iframe widgets don’t generate a crawlable link back to their site (which is one of the main reasons the widget was embedded in the first place!). However, if you set the code up as below, with indexable content within the iframe tag, that content will get indexed (including any links in that text).

<iframe src="http://www.facebook.com/plugins/like.php" scrolling="no" frameborder="0" style="border:none; overflow:hidden; width:150px; height:50px;" allowTransparency="true">Content and links will get indexed here by most engines, as it is visible text on the page. Anything that is pulled in using the iframe tag will not get indexed with the page. So if you want your iframe-powered widgets to generate a link back to your site, make sure to include that code in this area.</iframe>

In summary, the frame tag and frame-based Web sites are something to avoid whenever possible, even though there are ways to get some content indexed. The iframe tag, on the other hand, when used correctly, can be a good method of link building. Just be aware that the content pulled in by the iframe tag is not going to get indexed as if it were static HTML code on the page. All in all, the take-away from this topic is to avoid using frames on Web pages that you want indexed in the search engines.

1. Understanding JavaScript

While it is true that Google can now find and follow JavaScript links, it is also true that Yahoo and Bing cannot. Therefore, you should think twice about creating JavaScript links. If you feel they are absolutely essential to your Web site’s overall design scheme, then pay close attention here to ensure you are setting them up properly. JavaScript links typically use a function called an onclick() event to open links when clicked. The onclick() event then calls JavaScript code which tells the browser what page to open. That code can either be on the same page, or it can be embedded in a separate file. Currently, if the code called by the onclick() event is on the same page, Google will process the code, crawl the URL listed, and pass anchor text and PageRank to that URL. However, if the code is in a separate file, then Google does not process it.

Here are some examples of code that Google can understand, with links that will pass both anchor text and PageRank:

<div onclick="document.location.href='http://foo.com/'">
<tr onclick="myfunction('index.html')"><a href="#" onclick="myfunction()">new page</a>
<a href="javascript:void(0)" onclick="window.open('welcome.html')">open new window</a>

Remember that, even though Google can process these links, JavaScript is not the ideal format for your links. Neither Yahoo nor Bing can read and process these JavaScript links. In addition, JavaScript links generally fail to display properly on mobile devices and screen readers.
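A safer pattern, if you must mix JavaScript with your links, is to keep the real URL in the href attribute so every spider can follow it, and layer the JavaScript behavior on top:

```html
<!-- The href is a real, crawlable URL; the JavaScript only enhances the click -->
<a href="welcome.html" onclick="window.open(this.href); return false;">open new window</a>
```

Spiders that ignore JavaScript still see an ordinary link; browsers run the onclick handler instead.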

You should also be aware that using JavaScript to cloak (aka, hide) links for PageRank sculpting or to prevent paid links from being indexed, now requires that you move your code to an external file if you want to prevent Google from finding those links. Or, you can simply add the nofollow tag to links that you’d like hidden from Google in order to facilitate PageRank sculpting.

Another option is to put the script contents in a remote.js file. Here’s how: In your .html page, reference the remote.js file like this:

<script type="text/javascript" src="javascript/remotefile.js"></script>

Then place your JavaScript code in the remote file (i.e., remotefile.js). The bottom line is that there are plenty of other ways to spruce up your pages. Besides, studies have shown that complex JavaScript and Frames in general tend to actually reduce sales conversions.

2. Macromedia Flash

These are those animated, eye-pleasing, motion-picture-style, high-end Web pages, and they cannot be easily optimized for search engines. So you put your site at a ranking disadvantage if its architecture relies heavily on Flash. Even though there are work-arounds and exceptions, in general, search engines have difficulty indexing Flash in any meaningful way. As you know, your keywords provide one of the most valuable elements for search engines to determine what your Web pages are about. However, it’s difficult-to-impossible for search engines to reliably extract keywords from Flash files. This means that any part of your Web page that uses Flash will generally NOT lend itself to top rankings. However, you can still use some Flash on your pages as long as you observe these guidelines:

  1. Don’t make your entire page one big Flash file. Make sure your page has abundant indexable content outside your Flash file.
  2. If you’re just using Flash to animate part of your page, and the rest of your page is in normal HTML and contains your keywords, then search engines will know what your page is about by reading that HTML (even though they’ll likely ignore the Flash). However, if most of your page is embedded in a Flash file, then it will be very difficult for a search engine to know what your page is about. That puts your Web page at a serious ranking disadvantage.
  3. Use the <noembed> tag. This is a good approach to take if you simply must create all-Flash pages. Flash programmers know that any link to a Flash file must be enclosed in an <embed> tag. HTML also contains a <noembed> tag. This is where you should put the HTML version of whatever Flash you’re using on that page. Not only does this give the search engine something to read, but it also provides an alternative for those users who don’t have Flash installed in their browsers.

Although Google is getting a little bit better at indexing Flash pages, they still don’t do it well. So don’t count on Flash pages to put your site on equal footing with non-Flash pages. You’ll be at a disadvantage, even with Google. Sure, there are sites like Oprah.com that use heavy amounts of Flash and do quite well in the search rankings. But that’s generally due to brand recognition and the accumulation of tons of links that propel them to the top of the rankings in spite of how unfriendly their site architecture might actually be to search engines. If you’ve got an Oprah-sized brand, and you really want that animated homepage, then by all means take the Flash route.
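Putting the <noembed> guideline into markup, a sketch (the file name and dimensions are placeholders):

```html
<embed src="intro-movie.swf" width="550" height="400"></embed>
<noembed>
  <p>A keyword-rich HTML description of what the Flash movie shows,
  readable both by search engine spiders and by visitors without Flash.</p>
</noembed>
```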

The bottom line is that Flash pages will always be disadvantaged in the rankings. But, if you must use them, then use the methodology outlined above to make sure the keywords you want indexed can be found outside your Flash files.

Lesson 6 Review

In this lesson you learned:

  1. The importance of designing search-friendly pages using the correct site architecture that, ideally, places every page on your site no more than two links deep, and three links max.
  2. How and why to keep your URLs simple.
  3. The importance of managing your Session IDs and Dynamic URLs in ways that do not confuse the search engine spiders.
  4. The ONLY correct way to configure your Meta tags, and the dangerous Meta-tag myths that inexplicably continue to survive.
  5. How to control the way your Web page displays in the listings with Rich Snippets.
  6. All about Sitemaps; both onsite HTML sitemaps and XML Sitemaps that help get your Web pages more efficiently indexed.
  7. How a simple robots.txt file can be used to keep spiders from indexing your unimportant, or potentially confusing Web pages. This enables the spiders to focus on indexing only your important pages.
  8. The pro and cons of using Frames, JavaScript, and Flash as any major component of your Web site’s architecture.

By now you are rockin’! You’ve come a long way, baby, and just for fun, the next lesson, THE FINAL lesson, gives you a peek at the pinnacle of SEO from the perspective of the SEO expert.

How to Kill Your SEO Rankings

Google has created a lot of rules and policies over the years to make sure sites play by their rules. Sites that ignore them are ritualistically slashed from the search results on an ever-increasing basis. These almost-monthly massacres are going after more and more sites that are all doing the same things wrong!

This article is a new twist on the top things you can do to become one of the many nameless victims left after the next algorithm update (1…2…Google is coming for you…3…4…lock the door.)


Don’t be fresh and original. It’s like a horror movie plot you’ve seen 100 times when the pretty young girl runs screaming up the stairs instead of out the front door when the man in the mask shows up. Instead of writing your own original articles just copy articles from various other sites or ‘spin’ your content into gibberish keyword rich phrases that no human can understand.

If a Web site has a really good article, why try to compete with your own original content? You can just copy and paste it to your site and BAM content added. While you’re at it, lots of Web sites sell PLR (private label rights) article packs that have been resold and ‘spun’ hundreds of times. You’ll never have to write again if you post them onto your site and through the article directories without revising any of the content. Cutting corners like this means you’re guaranteed to be publishing duplicate content.

When this happens, your site will be flagged by Google’s duplicate content filters and ranked lower than the sites that you’ve copied from.

Another way to ensure those rankings will plummet into the pit of despair is by writing original content but never updating it. You’re busy and have more important things to do. Leave the content up there, even if the events took place four years ago. Not only will your customers love reading about things that have already taken place but Google won’t have any new content to crawl. Your rankings won’t increase and the spiders will visit your site less and less.

** Don’t let your site turn into a gnarly dried up corpse. Take some clues from a couple of the many articles we’ve written regarding updating regularly with quality content here and here.

Keyword stuffing. Again it’s that moment when the killer in the latest instalment of a movie series has been shot, stabbed and buried and the last scene is that hand coming up out of the ground as the audience does a collective eye roll (Friday the 13th Part 13 – seriously?!).

There’s nothing that readers love more than reading an article that has the same keyword in there 30 or 40 times. For example if your keyword is video marketing be sure your content reads something like this:

“Video marketing is one of the best video marketing strategies to get hits on your Web site. When you find video marketing companies, make sure their video marketing strategies are…”

If that doesn’t do the trick you can develop a great looking site and add SEO copy at the bottom that provides absolutely no value to your readers. To try and lose a few extra spots in the ranking, make sure that you really cram those few paragraphs with as many keywords as possible. The search engine algorithms have been adjusted to look for these tactics, so you can certainly trash your rankings by following these steps.

** That’s why keyword stuffing was our first spamming technique to avoid like the plague in this feature article The Top 10 ‘On-Page’ SEO Spamming Tricks to Avoid Like the Plague.

Continue to think that panda and penguin are two animals at the zoo. Here’s a quick hint if you’re confused – Panda and Penguin are two of the latest updates to Google’s algorithm to make sure that Web sites are naturally integrating SEO. If you ignore all of the latest news on SEO and don’t change anything about your site just wait, perhaps both Panda and Penguin will get their fangs into you.

** Or if you’d like a silver bullet in your arsenal check out this month’s update to get links to all the most current topics regarding Panda and Penguin.

Make your site all about the Ads. Ads make you money, so the more ads the better…right? Wrong. Sites with excessive ads displayed above the fold (using more than 3 AdSense units per page? You may want to lower that) are being dropped from the rankings quicker than their owners can wonder what happened to their ad income.

** Hit the right balance of ads and content with hints from this byte How much content do you have on your home page? Is it enough?

Use IP Delivery: aka, Cloaking. IP delivery, most commonly referred to as cloaking, is one of the most controversial and complex SEO strategies. We’ve covered it often over the years. The root of the controversy stems from the fact that IP delivery enables a site to serve one page to human site visitors and a different page to search engine spiders. When used in this fashion, the site is said to be “cloaking” (i.e., hiding) the real page, the one that human site visitors see, from the search engines.

** Search engines do not like being served a page that is different from the page that people are seeing. That being said, if you don’t understand the technology, don’t use it, because when done incorrectly it’s like taking a sledgehammer to your rankings. It’s a slow recovery, and the Googlebot is a fickle host who wants what it wants.

Use Doorway pages. Aggressive SEOs have been known to create large numbers of low-quality pages with the intent to rank well for as many keywords as possible. Often these so-called doorway pages are automatically generated using software designed to optimize each page around a specific long-tail keyword. The goal is to funnel visitors from these doorway pages into a smaller number of highly converting sales pages.

** Of course, it’s fine to use content management systems and other software to manage your site as long as your actual content is being generated by humans. Obviously using software to automatically generate your content and send visitors from one site to another is another great way to get the Googlebot to send your site into oblivion. This will also make sure you violate Google’s Terms of Service so that you even risk being removed from the search index entirely. Ouch.

Use graphics and scripts. Instead of using actual text for your Web site, you can decrease your rankings easily by presenting everything on your site through the use of graphics. Hire the most expensive graphic designer you can find and focus only on the visual aspects of your Web site. You will have the most attractive Web site on the Internet that isn’t getting any kind of traffic and probably runs more slowly than the zombies from Shaun of the Dead.

** If you’d like to give your site a fighting chance against the invasion check out these resources for ways to incorporate things that add visual interest like well-optimized images and infographics without bringing your load time to a screeching halt.

Don’t use keywords. Keywords are for people who actually want to be found on the Internet. When people enter keywords that have anything to do with what your site is about, make sure that you don’t include these throughout any of your Web content or in any of the on page SEO spaces such as your title and header tags. You’ll save a fortune in keyword research and since the entire Internet runs on keywords, you never have to worry about appearing in the top results of a search engine. Your site can be like Big Foot, always rumored about but never seen!

** Or you can bring your site to live amongst us and target your keywords with our three part Keyword Primer Series Part One, Part Two, and Part Three.

Build your site around a brand name that isn’t yours. If you infringe upon the copyrights of another company, you may not only experience a lawsuit but you may be able to get your Web site completely wiped from the indexes. All you have to do is create an entire site about an already established brand – or your competitor’s Web site and poof! Rankings gone.

** Use your own IP (intellectual property) and you should be in the clear. Check with your attorney if you are unsure on anything.


Participate in link schemes. It’s like that moment when you notice an overly large clown peering out at you from a storm drain. Logically you know clowns are creepy and that no matter what he’s promising it’s not worth climbing into the gutter for. Same goes for links – if it’s too good to be true, chances are it is.

The worst place to start your link building is with one of the various link schemes. That would include the good old-fashioned link farm, cross-domain linking, obvious paid links, and of course linking to ‘questionable’ sites (online gambling, porn or certain foreign sites). Google polices link schemes heavily, so your site will drop considerably in rank within a very short period of time. Even better, submit your site to dozens of link farms all at once to absolutely ruin your rankings and the scary clown wins!

** Punch that creepy clown in the face and review ways to mitigate your linking problems here or start right from the beginning with our advice in this byte.

Hide your text. What your readers don’t see won’t hurt you. And the search engines love to read all of that hidden text. What you want to do is make sure you have all of your keywords in row after row and then make sure the text color matches the background color so it blends right in.

For instance, if the background is white bgcolor=”#FFFFFF”, then the text font is also set to white as such: font color=”#FFFFFF” (of course, this can also be done with CSS instead of font tags). The end result is text that blends invisibly into the background color of the Web page. It effectively appears invisible to people viewing the page, but not to the search engine spiders that index all of the text found within the HTML source code. Do this for every page and you’ll be down in the rankings in no time at all. Because sometimes those things you can’t see can still hurt you.
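Incidentally, this trick is trivial to detect programmatically. Here is a toy Python sketch of the idea (the function name and regexes are our own illustration, not how any search engine actually works):

```python
import re

def has_hidden_text(html: str) -> bool:
    """Flag text whose color matches the page background color.
    A toy illustration only -- real spam detection is far more involved."""
    bg = re.search(r'bgcolor\s*=\s*["\']?(#[0-9a-fA-F]{6})', html)
    if not bg:
        return False
    # Find color declarations that are NOT the bgcolor attribute itself.
    fg_colors = re.findall(r'(?<!bg)color\s*[:=]\s*["\']?(#[0-9a-fA-F]{6})', html)
    return any(c.lower() == bg.group(1).lower() for c in fg_colors)

spammy = '<body bgcolor="#FFFFFF"><font color="#FFFFFF">hidden keywords</font></body>'
clean = '<body bgcolor="#FFFFFF"><font color="#000000">visible text</font></body>'
print(has_hidden_text(spammy), has_hidden_text(clean))  # True False
```

If a ten-line script can catch it, assume the spiders can too.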

** Just avoid doing this; there is not really anything more to say, other than that the search engines do not like being fooled.


Keep ignoring your site’s load time. That’s right – keep ignoring the countless times we’ve lectured you on your site’s load speed. If you don’t know your site’s speed, then you’re no better than the girl in ALL of those movies who can’t outrun the killer who never even breaks a sweat as he strolls along. Quit tripping over things that aren’t even there and act like you want to live.

** Google has made it 100% clear that your site’s load speed is a ranking factor. Your pages should load in under five seconds; if they don’t, you need to start working on what’s going on under the hood. Pick yourself up off the ground and use tools.pingdom.com/ to be sure your site is fast enough to survive. Google is serious enough about it that they’ve even added site speed tools to their Analytics. If your site isn’t up to snuff, check out this byte to troubleshoot so that your site loads before the lunatic catches up.

Ignore errors on your site. You’ll get around to fixing them…eventually. In the meantime, they will disrupt your site’s performance, piss off visitors and cause you to index poorly.

** No big deal: both your Webmaster Tools account and our own Super Spider Tool will quickly lay out the issues on your site in a matter of seconds.

As you can see it’s easy enough to ignore our advice and make a mockery of the SEO system, waste millions of marketing dollars and thousands of hours of intense online SEO work. If you’re doing even one of these 13 things we’ve outlined and you’re still ranking well then consider yourself one of the lucky survivors …for now.

Advanced Link Building Techniques

The #1 *Secret* ART Used By Top SEOs to Gather Links In Bunches!

Linkbaiting – posting content that attracts attention and links – is not a secret strategy. However, the ART of linkbaiting sure seems to be. In any case, it’s an art worth mastering because:

  1. Link Bait can be used as an extremely powerful form of viral marketing.
  2. Getting Links is the most expedient path to the top of the search results.
  3. Best of all, it’s a strategy Google tacitly approves of.

In today’s world of SEO, you just won’t find a better one-two-three Kung Fu punch! Nor is there any faster way to gain an (unfair?) advantage over your competition! But, before we get to the “ART” part, let’s spread out the canvas and look at the broad strokes.

6 Key Characteristics of an ARTful LinkBait Strategist

ARTful Linkbait Strategists do some or all of the following…

  1. They distill multiple sources of information into Condensed, Cumulative Reports – thus creating Valuable Resources. In some cases they develop the ability to generate these resources internally. In other cases they outsource the tasks. Either way, they post valuable lists, reports, or tools that are 100% informational, highly useful, streamlined, and easy to link to – all while presenting the material from their own perspective.

    Some of the best linkbait resources are simply lists and how-to guides. In our report: How to Use del.icio.us to Quickly Build Amazing Content-Rich Articles that Attract Links Like a Magnet! we discussed the details of creating such resources. This report is a step-by-step guide to dominating the search results and we highly recommend you read and act on it, if you haven’t already.

    It isn’t necessary that your linkbait be “amazing” to get links. But it must look comprehensive and be useful. Make a list of topics your customers are interested in and start cranking out articles that are loosely focused within your niche having titles like…

    • 100 Tools and Resources for…
    • 50 Tips to…
    • 100+ Resources to Help You…
    • 75 Online Resources for…
    • 30+ Simple Things You Can Do to…
    • 25 Ways to…

    Be sure to mine sites like Delicious and Digg for ideas on article topics that people will link to. If your target topic area doesn’t participate on Digg or other newer social networks, consider mining forums. Look at the posts that have the highest number of page views and replies, and how frequently questions get asked about a certain topic. That’s a clue into the mind of the forum participants and an opportunity to solve a problem with a link bait answer. Ideally, we suggest you crank out one of these articles on a weekly basis.

    At its simplest, an article can be “100 Resources for…” followed by a list of 100 sites, tools, and articles you’ve found online related to that topic, with a short summary of each. You’ll be pleasantly surprised to learn how many people tend to link to such articles and resource lists.

  2. They systematically monitor the News. They watch for relevant stories that they can chime in on. By lending their expertise to the topic in a timely manner, they present a voice on the subject that tends to gather links.

    As an example, let’s suppose that we sell remote controlled model airplanes. It’s pretty simple to monitor Google News for key phrases like remote controlled aircraft and, on a weekly basis, we can review news stories within our niche and then write about what is topically hot. Here’s one story that caught my eye…
    Stanford’s “autonomous” helicopters teach themselves to fly
    Stanford Report – The aircraft, brightly painted Stanford red, is an off-the-shelf radio control helicopter, with instrumentation added by the researchers…

    Any one of a number of slants could be employed while slicing and dicing this story. For instance, how will this affect the sport of remote controlled aircraft for the hobbyist if, in fact, the flying can be done by a computer that learns? …you get the idea.

  3. They’re constantly looking for opportunities to present a Contrary Perspective on a controversial topic. They argue a position that contradicts prevailing wisdom.

    One controversial perspective could be that the price of gas hasn’t actually risen much at all. Instead, it’s the dollar that’s fallen, as evidenced by the price increases across a broad range of commodities which might include homes, gold, imported foods, and so forth. The fact that the Federal Reserve has kept interest rates so low for so long, and that they’ve been paying the government’s bills with printed money, could be used to support the premise.

    If you are a commodities broker or financial services specialist, this could be a contrary perspective for you to write about within this currently hot and controversial topic.

  4. They intelligently harness the power of Fear. This is my least favorite strategy (blush); however, there are certain industries that lend themselves to this style of social promotion. Topics like personal privacy, ID theft, radar detectors, child care, insurance and so forth have successfully harnessed the ‘power of fear’ to promote their goods and services since their beginnings.

    Once again, the News Feeds are an excellent source for stories that can be re-reported in the light of an “expert’s” opinion (like yours).
  5. They employ the universal language of Humor. Ok, if the iPhone in the blender didn’t seem funny to you, then maybe you could take the Jeff Foxworthy approach:
    • If you cringe whenever you see a site listed as ‘Untitled‘ in the search results above your listing, then you might be an SEO.
    • If your response to your spouse’s inquiry regarding your son’s whereabouts is: ‘he’s 404‘ …then you might be an SEO.
    • If you’ve ever spoken in terms of having to ‘301 redirect‘ your focus from one task to another, then you might be an SEO
  6. And, in every case… They develop their social networks – This one was too big for a single bullet point, so read on…
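The list-post formulas from point 1 are mechanical enough that you can brainstorm titles programmatically. A throwaway Python sketch (the templates and topics here are our own made-up examples, not a prescribed list):

```python
# Count-based headline templates, in the style suggested above.
templates = [
    "100 Tools and Resources for {topic}",
    "50 Tips to Improve Your {topic}",
    "25 Ways to Master {topic}",
]

# Stand-in topics -- swap in subjects your customers actually care about.
topics = ["Home Brewing", "Model Aircraft"]

ideas = [t.format(topic=topic) for topic in topics for t in templates]
for idea in ideas:
    print(idea)
```

Generate a batch, then sanity-check each candidate against what is already popular on Delicious, Digg, or your niche forums before writing it.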

Baiting the Linkbait Hook

It’s not enough just to publish your linkbait – you’ve got to get it seen by the people who are most likely to link to it. That’s where social media comes in. StumbleUpon is particularly effective for generating significant traffic to your articles.

Other social media sites such as Delicious, Mixx, Propeller and Digg work well too.

In any case, it isn’t realistic to expect your content to leap to the top of the social media sites UNLESS you become, and remain, actively involved. That means regularly submitting and voting on stories and building a network of social media friends.

We go into extensive detail on building a friends list in our StumbleUpon report. We highly recommend you study that report closely – while bearing in mind that all major social media sites have their own version of the friends system. Most importantly…

Understanding how “friends” networks operate
is critical to your social media success.

Another characteristic of ARTful linkbaiters is their disciplined approach to staying involved. At least a few hours each week are dedicated to maintaining their social media profiles whether done personally or delegated to company personnel.

We know of at least one instance where an ARTful linkbaiter invested the time to identify the active users within her niche. Then, once a relationship was established, she contacted one of those power users directly and hired him to submit content on behalf of her company.

If you go this route, be sure to frame your offer delicately because some people are easily offended. It’s entirely possible they may report you to the social media site. In such a case there’s a chance your site could get blocked from that social media site. But, generally, if you spend some time on the site interacting with, and getting to know the people, you can get a pretty good feel for which users might be open to this kind of arrangement.

By the way, we’ve found that advertising for social media experts via traditional outsourcing sites like Elance or RentaCoder has always been a bust. So your best bet is to hire/train someone in-house with your second choice being the ‘hire an outside social networker‘ route. You may also find talented social media marketers via Jobs in Social Media, a site which we haven’t yet used, but looks potentially useful.

Here are some great example sites doing all the right things to become link magnets…

  • Techmeme’s Leaderboard – This lists the sources most frequently posted to Techmeme. Seems simple enough but this is a link magnet! Once Techmeme added this resource, all the sites that appear on this list have linked to the page at one time or another. Some of the sites have multiple high page rank links pointed here. Techmeme provided a simple way for sites referenced on this list to show instant credibility.
  • Vimeo – Similar to YouTube, this is a simple online video sharing site. They cleverly set it up so that once the player is embedded, Vimeo instantly gets 3 links back to their site. In return the user gets a great player that is easy to use and reliable. This widget/tool strategy is a solid one that is used by countless highly successful sites.
  • OK Cupid’s Blog – This is a highly creative blog that accompanies an online dating site. Its content is unique and compelling, covering one of the all-time hottest topics – understanding the opposite sex! The blog succeeds at being both “linkbait” and a “link magnet,” standing out from the mob of online dating sites.

The LinkBait Magic at work…

You should bear in mind that, while your linkbait pages will see an immediate boost, improved rankings for your main landing and product pages won’t happen right away. But if you consistently apply the strategies outlined in this report, your home page and product pages will also rise in the rankings. Here’s how…

  • First, any link you get to any page on your site will help all pages on your site rank better. That’s because these additional links increase the overall link authority of your site.
  • Second, by linking these linkbait pages to your main product pages, the link juice flows from the sites linking to you, through the linkbait page, and to your product pages.

It’s realistic to anticipate that each piece of linkbait content you create (if well done and properly promoted) will typically average somewhere between 10-100 natural links from other blogs and Web sites. You’ll also get a bundle of links from the social media sites where you promoted your content, since every user who voted will have a link to your content in their profile.

And, even though links from social media sites themselves tend to be nofollowed, they will still provide a small rankings boost – especially in Yahoo and Bing, since neither of these engines recognizes nofollow links as strictly as Google does.

Consistency and Persistence within your Social Network is Essential

By faithfully creating some form of interesting link-worthy content on your site every week or so and promoting it to a large network of social media friends, you’ll be (from the search engine point of view) continually building natural, unrequested links to your site’s content pages. This is far-and-away the single best strategy for ranking at the top of the search results.

You’ll often see increased search engine traffic right from the very first article you create and promote. And, even though Google says they don’t use the traffic data from the Google Toolbar to determine rankings, nearly every time we’ve created an article and gotten a surge of traffic from social media, that article ranked well and pulled in Google search traffic almost immediately. (go figure!)

So what’s the big secret?

Well, the sites that started LinkBaiting a year ago have significantly higher rankings today. And, if that’s you, then you know the secret. As with any strategy, you have to actually do it. And, since the hardest part is getting started, then I suggest you click yourself over to Google News right now, enter your keywords, grab a story and begin writing your ‘Editorial Opinion’ on that story today.

Tomorrow you can post it on your site and then begin building your social network of friends (if you haven’t already done so). Then shake, repeat, and blend the different strategies as you develop your ART – this time next year, you’ll have experienced the secret and be the one who’s smiling BIGtime.

 

 

Optimizing Press Releases

5 Top Sites to Submit Your Optimized Press Release

By now you’ve learned why optimizing a press release for the search engines is important, how to optimize the release for both SEO and SMO maximum visibility and what the benefits of this optimization can mean for your business. Now comes the last step—the actual distribution of the press release.

There are literally dozens of both paid and free sites that will distribute a press release online these days. We’re going to look at the top 5, which control over 95% of the 2000+ releases distributed daily.

  1. Business Wire — A Berkshire-Hathaway company in business since 1961, Business Wire is the largest newswire dissemination service in the United States. Its ability to target your release is quite impressive and it has the latest in SMO bells and whistles (through a partnership with PR Web). Targeting can be by industry, geographic locale or a key demographic. Membership is required (at $200 per year), and releases are priced higher than most other services at between $500-$600 per release. Often this higher price also gets you exposure to the bigger media outlets.
  2. Market Wire — A close second in market share to Business Wire, Market Wire has over 3,000 corporate clients and distributes releases to over 60,000 journalists daily. Market Wire is also the exclusive press release provider for the NASDAQ and NASDAQ-listed companies. If you’re in the finance arena, Market Wire is probably the best choice for you. Market Wire has traditional SEO capabilities and has now improved its SMO features considerably. Membership is required (at $200 per year), and releases also start at around $500.
  3. GlobeNewswire (formerly Prime Newswire) — Known for its large distribution network and its focus on finance and stock-related news and companies, GlobeNewswire has had a lot of industry firsts. These include a free online clipping service, an online customer service center, and delivery of daily news alerts via HTML email. No membership is required. However, a fully SEO/SMO-optimized wire release is expensive here, requiring a $250 upgrade on top of their normal $550 targeted release for a total of $800.
  4. PR Newswire — This service averages over 1.9 million views on its Web sites each month. PR Newswire distributes between 700-800 press releases a day, and among its many SEO-friendly tools is a nice account-level keyword density analyzer that automatically flags your release with suggested optimization changes. PR Newswire requires a membership fee of $195, and $400 per release, which includes 30-day tracking, a clipping service and a full suite of SEO/SMO and multimedia customization features.
  5. PR Web — This site is definitely the most user-friendly of the private press release distribution sites, used by over 40,000 organizations internationally, and our personal favorite. As this sample release from their site shows, they have integrated practically every SEO/SMO optimization widget possible within their releases. PR Web offers four categories of releases, ranging from their standard $80 basic visibility package all the way up to their premium visibility package, which runs $360 per release.
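The keyword density analyzers these services offer boil down to a simple ratio: occurrences of your phrase versus total word count. Here is a rough Python sketch of that calculation (our own simplified version, not any service's actual algorithm; the sample release text is invented):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` accounted for by `keyword` occurrences.
    Naive word-splitting; punctuation attached to words is not stripped."""
    words = text.lower().split()
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words)

release = "Acme widgets launch today Acme widgets are the best widgets for homes"
print(round(keyword_density(release, "acme widgets"), 1))  # 33.3
```

Rules of thumb vary, but if a two-word phrase accounts for a third of your release (as above), any analyzer will flag it as over-optimized.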

So which Paid Press Release Service Should You Use?

We recommend PR Web as your first choice for SEO and SMO press release distribution.

The recent declaration from Google aside, we still feel there is substantial value in the continued use of press releases. Although their link building power has clearly been diminished, their ability to generate NEW press mentions and social media signals remains invaluable. Finally, remember to optimize your press releases for both the search engines and the social media networks and you will more effectively reach both your target audience and the news professionals looking for hot, viral content.

Lesson 5

We all know that, before we optimize our pages for the search engines, we need to find out exactly how sophisticated the competition is, or isn’t. Our goal is to determine what the top ranking pages are doing within our keyword categories so we can look for weaknesses while quantifying their strength. Then we duplicate their efforts before going one better in order to achieve our success in the rankings! One problem is that, ordinarily, such competitive analysis is time consuming. Another problem is that the analysis needs to be repeated again and again for each site and each competitive keyword. Then, when compiling each metric one by one, it’s the kind of task that can make you want to tear your hair out (is that why so many SEOs are bald? …just wondering).

Enter the Site Strength Indicator (SSI): a tool that compiles 9 critical competitive data points and delivers an overall site strength score.

In other words, in one click, and all at once! …we learn:

  1. Domain Registration Date: This refers to the age of a site. The older the site, the more of an advantage in the rankings it will have. Google likes sites that are aged. If the site’s ‘birthday’ is a year or two old, the site has a slight ranking advantage over a newer site within this data-metric. Older sites tend to have developed more links over time and are inherently more trustworthy than a brand new domain name. If you click the date displayed, the SSI tool will display the WHOIS data for the domain name.
  2. Google PageRank: This refers to the Google Toolbar PageRank that rates the importance of a Web page in the eyes of Google on a scale that runs from PR=0 to PR=10. This metric is important because it tells you how valuable a link coming from a site may be. It also gives you a good idea of Google’s opinion of a site. Note: The Page Rank displayed is for the URL that you entered. It’s not limited to the home page.
  3. Google Cache Date: This metric tells you exactly when Google last crawled a site. A recent date is an indication that Google thinks the site is important. On the other hand, if a site’s cache date is a month or more old, then Google probably thinks very little of the site’s overall importance. Very important sites are crawled frequently; even moderately popular sites may have some URLs crawled daily. Unimportant sites are crawled sporadically, perhaps every couple of weeks or once a month.
  4. External Backlinks: Exactly what it appears to say. Although you shouldn’t expect this number to be perfectly exact, it is a reliable relative measure. Basically, if one of the sites you are analyzing is showing 19,576 links and the other site is showing 610, there is a significant difference in inbound links between the two sites. If you click on the Backlink result, you’ll expose a link to Majestic Site Explorer’s Backlink summary report, where you can observe the URL’s total links as well as the links discovered in the last 30 days.
  5. External .edu Backlinks: Since .edu domains are restricted to accredited institutions of higher learning (typically U.S. Colleges and Universities), Google often trusts these links considering them less likely to be commercially oriented. One way to boost a site’s ranking is to collect links from .edu domains. The more .edu links the better.
  6. External .gov Backlinks: Likewise, .gov domains are restricted to U.S. government entities. Google also tends to trust these links. One way to boost a site’s ranking is to collect links from .gov domains. The more .gov links the better.
  7. Referring Domains: This value reflects the number of different domains that are linking to the site. The higher the number, the better. Unique links from many different referring domains count for more than multiple links from the same domain. A site that has 1000 inbound links from 500 referring domains is typically going to be far stronger than a site that has 1000 links from 10 referring domains.
  8. Referring .edu Domains: The number of unique .edu domains linking to the URL entered. This value works the same way as the Referring Domains Entry, except this is only the value for .edu domains.
  9. Referring .gov Domains: The number of unique .gov domains linking to the URL entered. This value works the same way as the Referring Domains entry, except this is only the value for .gov domains.

As you test the strength of sites using the Site Strength Indicator, you will also notice a Total Score is assigned to the site being analyzed.

Here are the scores explained:

  • 0 – 19: This site has a very limited search engine presence and is getting far less search traffic than it could if it increased its optimization. Building quality links is often the quickest way to improve search engine rankings.
  • 20 – 39: This site has made some progress in achieving rankings, but is still far below achieving its potential. Targeting long-tail keywords is your best bet for traffic at this level. Link development should still be continued at this level as well.
  • 40 – 59: While there is still a lot of work to do, this site is on the verge of breaking into the arena of major search engine presence. Creating some viral buzz could be enough to push it into the big leagues. A site at this level should have enough traffic to also take advantage of UGC (User Generated Content) such as comments, reviews, or forums to further enhance its ranking.
  • 60 – 79: This site is a powerful presence on the Internet. In a small to medium niche it’s likely dominating, but can still be beaten with the right optimization.
  • 80 – 100: This site is among the most powerful and authoritative sites on the Internet. If this is your site, then congratulations! If it’s your competitor’s site, they’re going to be very tough to beat.

The Total Score ratings, explained above, give you a pretty good idea of how easy or hard it will be to supplant a site that is ranking on the first page of the search results. Let’s suppose, for example, we want to place the site: www.homeschoolviews.com somewhere on the first page of Google’s search results for the keyword homeschool. Let’s take a look at the sites ranking #1 through #10. We want to see if any of them are beatable in the rankings based on their lack of overall strength.
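To make the scoring idea concrete, here is a hypothetical Python sketch that folds a few of the nine SSI metrics into a 0-100 score and maps it to tiers like those described above. The weights and caps are entirely our own invented placeholders; the tool's real formula is not published:

```python
def site_strength(age_years, pagerank, backlinks, referring_domains):
    """Toy 0-100 strength score; weights are illustrative, NOT the real SSI formula."""
    score = 0.0
    score += min(age_years, 10) * 2            # up to 20 pts for domain age
    score += pagerank * 3                      # up to 30 pts for PageRank 0-10
    score += min(backlinks / 1000, 25)         # up to 25 pts for raw link count
    score += min(referring_domains / 100, 25)  # up to 25 pts for unique domains
    return round(score)

def tier(score):
    """Map a score to a descriptive band, highest band first."""
    bands = [(80, "dominant"), (60, "powerful"), (40, "on the verge"),
             (20, "some progress"), (0, "limited presence")]
    return next(label for floor, label in bands if score >= floor)

s = site_strength(age_years=5, pagerank=4, backlinks=19576, referring_domains=500)
print(s, tier(s))  # 47 on the verge
```

The point of any such composite is the same as the SSI's: a single number lets you compare ten competing sites at a glance instead of juggling nine metrics per site.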

Tips Tricks and Traps of the SEO Trade

Discussions regarding acceptable and unacceptable SEO practices often devolve into a polarized conversation that splits into two groups: White hat vs. Black hat SEO practices. However, we think it’s a silly conversation. That’s because, unless you are doing something illegal (which we do NOT recommend!), we think the more productive discussion should center around: what works vs. what doesn’t work, and safe vs. dangerous SEO tactics.

As an SEO, you need to know what SEO tactics are working. This is especially true if your competition is using them. Never mind that somebody out there is crying ‘black hat’ …that, alone, should be irrelevant to you. If it isn’t illegal, you can expect that your competition will use it against you to gain a competitive advantage. Bank on it! The REAL question should be: Is the SEO strategy safe? In other words, will your site be penalized or banned if you get caught applying a specific SEO strategy to your Web site’s pages? If the answer is YES, then we advise you to refrain from implementing that ‘unsafe’ SEO strategy.

Example: Let’s suppose that you learn, while conducting competitive analysis, that your competitors are hiding keywords under their images to artificially boost relevancy in the search results. They are also hiding links that direct spiders toward topically unrelated pages for indexing and PageRank purposes. And, as far as you can tell, it appears to be working: they are ranked on the first page, in the top ten of the search results. In addition, all of the pages their hidden links point toward are indexed and ranking well too.

Now, because you have carefully studied our Advanced SEO Tutorial, The Top 10 ‘On-Page’ SEO Spamming Tricks to Avoid Like the Plague!, you know for sure that Google does NOT approve of such a tactic. It clearly conflicts with Google’s Webmaster Guidelines, which specifically state that hidden text or hidden links should be avoided—including text behind an image. However, one must consider that there isn’t anything illegal about hiding text or links behind an image. So, if your competition is using this tactic to compete, then you at least need to know about it. Now you have some options to consider.

  1. Ignore it and take the safer SEO tactical road hoping that Google will eventually discover the “cheat” and penalize your competitor’s site accordingly.
  2. Report your competitor’s guideline infractions to Google and hope that Google evens the playing field by penalizing your competitor. (Don’t hold your breath).
  3. Match your competition tactic by tactic (regardless of Google’s acceptable use guidelines) knowing full well you’re engaging in an unsafe (although not illegal) SEO tactic.

So, what’s our recommendation? We prefer choice #1, although we realize you may choose choice #2 for reasons that are entirely understandable. We do not recommend choice #3 for the simple reason that there’s no guarantee whatsoever that Google will refrain from penalizing your site while leaving your competitor’s site unscathed. Yes, we know—unfair! …but it happens and there is no easy way to predict what Google will actually do in such circumstances.

You also need to know there is something called ‘white-listing’ a site. In essence, that’s when a site is considered so trustworthy by Google that they never assess any ranking penalty regardless of whatever SEO tactics such white-listed sites may be using. Typically, such sites belong to brand name companies like Microsoft, Nike, Amazon, and so forth. So, if you’re competing against one of these so-called brand names, then be aware that they may be operating under a less strict set of rules. Yes, we know—again, not fair. But it is the way it is. Make your adjustments and move forward.

Here’s the point. The profession of SEO is full of tricks and traps—and you absolutely must KNOW what each of them is in order to compete on the big stage. That’s why we have devoted an entire Advanced SEO Tutorial to the Top 10 ‘on page’ SEO tricks that you absolutely positively need to know about if you are to successfully analyze what your competition is doing to achieve top rankings. When you are ready, be sure to carefully study:

What’s In a Name?

Choosing the Right Domain Name

As you’ve no doubt deduced by now, choosing the right Domain Name can be critically important! Not only does Google currently give preference to keyword-rich domains, so do people when they’re choosing which link in the search results to click.

Another critical truth to factor into the equation is that you should AVOID changing the domain name of an established site. Once a site (domain name) is established and ranked within the search engines, it is NOT advisable to switch the domain name. That’s because switching from an established domain to a new domain can be a tricky process at best. At worst, you will lose all of the time and effort you’ve invested in promoting the site. In essence, you could find that you are literally starting over. Not good. This can often put your site at a ranking disadvantage for upwards of six months to a year.

What’s more, Google places a high value on old sites—the longer a site has been around, the better. In fact, two of the most important ranking factors in Google are:

  1. The quality and age of your incoming links (i.e. the importance of the linking page combined with the length of time the link has been pointing at your site).
  2. The age of your site (the length of time your domain name has been online).

Compared to having an old and established site, your domain name is often way down the importance list of ranking factors. So, given a choice between a great domain name and a site that’s been aged, we’d advise the latter—the older site over the choice domain name. Although it is possible to efficiently and properly move sites (something we’ll discuss in the next lesson), our general advice is to *not* change a domain name unless you absolutely have to for legal reasons (such as a domain name containing someone else’s trademark) or unless the new domain is so amazing that you can’t pass it up. Otherwise, we advise that you stick with an established domain name if you already have it up and running with incoming links and indexed content. But if you are starting fresh, then you should make every effort to get the best, most keyword-rich domain name possible. The ideal domain name exactly matches the primary keyword or short (2- or 3-word) keyword phrase you’re targeting. Examples of such domain names might include:

  1. http://www.HotTubs.com
  2. http://www.PetSupplies.com
  3. http://www.SalesTraining.com

However, in most cases, these keyword-exact-matching domain names are no longer available on the primary market. They are already being used (or held) by other businesses or by professional domainers (people whose business it is to buy and sell domains for a profit on the secondary market). (Note: domain names are not case-sensitive. We’ve capitalized letters in the above domains to make the keywords within them easier to spot.) This means that really good domain names are usually not available for cheap from a domain registrar but, instead, must be purchased directly from someone who already owns the domain. Oftentimes you can contact the owner of the domain directly via the domain’s WhoIs info or by using contact info on the site (if it exists). To check the WhoIs info (the background information on who owns a site), use a site like domaintools.com: http://whois.domaintools.com/. There you can simply enter the domain name you would like to check. You can also buy already-registered domains on domain auction sites. Some of the best include:

  1. http://buydomains.com/
  2. http://www.snapnames.com/
  3. http://www.enom.com/
  4. https://www.tdnam.com/ (GoDaddy.com’s Domain Name Aftermarket)

Bear in mind, however, that many times these resale domains can cost several thousand dollars to purchase. And, while having a domain name that exactly matches your most sought after keyword can definitely help boost rankings, it’s not absolutely essential to building a high ranking site. Instead, you might choose a short, catchy name that will be easy to market, both online and off. Just a few well known examples of domain names that have been turned into globally recognizable brands include Amazon, eBay, Google, MySpace, Facebook, Twitter and del.icio.us (now, delicious.com). Besides being easier to build a brand around, the shorter name also provides the advantage of being easier to market through other media, such as radio, print, or television. Sometimes you’ll get lucky and find a domain that uses your keywords and is short and catchy. If so, grab it! You’ll have the best of both worlds. The drawback to the short, brand-able domain name is that, since it lacks keywords, it supplies absolutely no search engine ranking benefit. From an SEO perspective, it does nothing for your site’s rankings. Your other option is to buy a domain that incorporates your keywords but doesn’t exactly match the keyword phrase you’re targeting.

Examples might include names like:

  1. http://www.HotTubSource.com
  2. http://www.PetSuppliesCenter.com
  3. http://www.SalesTrainingExperts.com

While you may not get the often significant rankings boost you’d get if your domain exactly matched your most sought-after keyword, using a domain name that incorporates your keywords in some variation will still contribute to high rankings for the following reasons: Whenever another site links to you, you want them to use your keywords in the visible text of their link (the anchor text). Having your keywords in the anchor text of links pointing to your site is one of the most important aspects of high-ranking pages. When other sites link to you, they will typically use your domain name as the anchor text. If your domain name already includes your best keywords, that makes it natural for them to use your best keywords in the anchor text. It also saves you the trouble of having to track down those links and request that the site change the anchor text—a task that is awkward at best.

In regard to visitor click-throughs, having your keywords in your domain name can make a big difference. When browsing through the lists of links on search engine results pages (SERPs), our studies have shown that people are far more likely to click links that contain the keyword they are searching for in the domain name. Even if your competitor offers an otherwise equally enticing listing, if it lacks the specific keywords, that link is more likely to be passed over. This selective click-through behavior especially pertains to web users who access the net via slow Internet connections. Because of the sometimes excruciating delay between clicking and page loading, users on slow connections tend to study each URL more closely. They look for keywords and evaluate beforehand whether or not a link appears to be worth the time it takes for a page to load on a slow connection. As long as your keywords are somewhere in your domain name, search engines will generally assume your domain is relevant for those keywords.
The exception to the rule: excessively long and/or multiple-hyphenated, keyword-stuffed domain names. For example, a plastic surgeon specializing in breast enhancement in Beverly Hills might consider registering and using this as one of her domain names… www.cosmetic-surgery-breast-enhancement-beverly-hills.com At one time, such a domain had a boosting effect on rankings. But that effect has since been radically diminished with respect to domain names with excessive dashes and characters. As you can see, the above domain name has 49 characters including five hyphens (dashes). Too many characters and too many dashes make this domain name look unnatural or spammy from the SEPOV (search engine point of view). Therefore, we recommend no more than one hyphen—preferably none—and creating the shortest domain name possible based on what’s available and what makes sense to your potential site visitors in relationship to your business. In a perfect scenario, your best domain name is typically your primary keyword or keyword combination. Whenever that isn’t possible, at least try to get your most important keyword inserted somewhere into your domain name. The other drawback to domains with hyphens is that they’re harder for customers to remember. They can also be more difficult to convey by word of mouth, since people have to say “cheap dash art dash supplies dot com” instead of just “cheap art supplies dot com.” When we purchase domain names, we often buy both versions: with and without dashes separating the keywords. By purchasing both versions, you can keep these alternative domains out of the hands of your competitors. It’s also a good idea to register common misspellings of your domain name, especially these days when annual domain fees are so affordable; about $10 a year at GoDaddy.com.
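The "too many hyphens, too many characters" guideline above can be captured as a simple sanity check. The sketch below is purely illustrative: the thresholds (at most one hyphen, a 25-character label limit) are our own assumptions for demonstration, not published search engine rules.

```python
# Illustrative heuristic based on the guidelines above: flag domain
# names that look "spammy" from the search engine point of view.
# Thresholds are assumptions for demonstration, not official rules.

def looks_spammy(domain: str, max_hyphens: int = 1, max_length: int = 25) -> bool:
    """Return True if the domain's main label has too many hyphens
    or too many characters, or contains a double dash."""
    # Strip scheme and "www." if present, then keep only the main label.
    label = domain.lower().removeprefix("http://").removeprefix("https://")
    label = label.removeprefix("www.").split(".")[0]
    if "--" in label:            # double dashes: avoid these entirely
        return True
    return label.count("-") > max_hyphens or len(label) > max_length

print(looks_spammy("www.HotTubSource.com"))                                        # False
print(looks_spammy("www.cosmetic-surgery-breast-enhancement-beverly-hills.com"))   # True
print(looks_spammy("www.art--supplies.com"))                                       # True
```

A name like hot-tubs.com (one hyphen, short) passes such a check, which matches the "one hyphen at most, preferably none" advice.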
Misspelled and other alternate versions of your domain name can then be redirected to your primary domain name so that anybody who inadvertently goes to the “wrong” URL will still end up in the right place (don’t worry, we’ll cover URL redirection in the next lesson). However, you should avoid registering domains with double dashes separating keywords, like: www.art--supplies.com Not only does this make it even harder for your customers to remember your domain name, but search engines may not even index sites that use double dashes in the domain name, because this strategy has been abused in the past. Overall, buying the .com, .org, or .net version of a domain name that exactly matches your primary keyword phrase is one of the most effective strategies for building a potentially high-ranking site from the ground up.
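For readers who want a preview of what that redirection looks like in practice, here is a minimal Apache .htaccess sketch of the alternate-domain strategy described above: 301-redirecting a hyphenated version and a common misspelling to the primary domain. The domain names are placeholders, and this assumes all three domains point at the same Apache host with mod_rewrite enabled.

```apache
# Hypothetical sketch: permanently redirect alternate domains
# (hyphenated version and a misspelling) to the primary domain.
# Domain names are placeholders for illustration only.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?cheap-art-supplies\.com$ [NC,OR]
RewriteCond %{HTTP_HOST} ^(www\.)?cheapartsuplies\.com$ [NC]
RewriteRule ^(.*)$ http://www.cheapartsupplies.com/$1 [R=301,L]
```

The R=301 flag marks the redirect as permanent, which tells search engines to consolidate any link value onto the primary domain rather than split it across the alternates.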

Domain Names that Please Customers and Search Engines

When designing your site, striking the right balance between a site that search engines like and a site that your customers like can be tricky. Unfortunately, choosing a domain name falls directly into that same tricky category. Here are a couple of guidelines to help you choose: If your intention is to build a long-term online business that will be marketed through a variety of online and offline media, then go with a short, catchy, trademarkable name that you can easily build a brand around. OR If your intention is mainly to get ranked with the search engines, then go for the keyword-rich domain name. Regardless of your exact approach, getting your keywords into your domain name will always boost your search engine ranking—particularly when your domain name exactly matches the keyword or keyword phrase you’re targeting. Even if your company name is your domain name, such as PlanetOcean.com, it’s still a good idea to register other relevant keyword-rich domain names pertaining to your goods and services whenever you find them available. For instance, we also own… This gives us the option of setting up specialized sites that place our services in the paths of people who are using the search engines to locate our specialty product or service. It’s like buying online ‘real estate’ for the future. There’s always a good chance the time will come when you’ll want to develop these vacant ‘properties.’ This strategy also keeps these choice domain names out of the hands of future competitors, making it harder for them to enter your market.

Choosing a Domain Extension: .com? .net? .org? .biz? .info?

When researching possible domain names, you should see what’s available in the .org or .net categories. These are given equal billing by the search engines, and it may be easier to find the domain you want if you target those extensions. You are likely to find they are generally more available to register, or else can be purchased from the owner for a cheaper price than the .com version. If you’re outside the U.S., or targeting a market outside the U.S., country-specific domains such as .co.uk or .co.in are also a good choice. Such domains will typically give you an advantage in ranking for queries performed by people in those countries. However, be aware that the advantage that .com holds over all other domain name suffixes is one of adapting to your visitors’ habits. People tend to assume that a site’s URL will end in .com, regardless of the fact that search engines typically don’t care one way or the other. In other words, if you don’t control the .com version of your chosen domain name, you could be losing the so-called type-in traffic (i.e., traffic that ensues when a searcher simply types a domain name directly into a browser’s address bar). If you can’t secure your chosen domain as a .com, then you may want to choose a different domain. Whoever controls that .com is going to end up getting some of your traffic. However, if your goal is primarily to rank in the engines, and you don’t mind that some traffic might bleed off when people inadvertently enter .com by mistake, then .org and .net domains are just as good, ranking-wise, and they’re generally easier to obtain. If your business is not based in the US, then it also becomes important to acquire a country-specific domain name extension. For instance, if your business is based in the UK and you want your site to be found within Google UK’s “pages from the UK” search feature, then you must have either a .co.uk domain extension, or else your site must be hosted within the UK.
However, even if you do have the country-specific extension, it’s still important that you also control the .com to avoid losing type-in traffic. You can then forward the .com version of your domain to your country-specific domain. Occasionally you’ll see domains with the .info extension. Our advice is to avoid this extension because it’s been heavily abused by search engine spammers. Such .info domains can be registered for as little as $0.99, making them a prime target for spammers looking for cheap, disposable domains. You can still build a high ranking site on a .info domain, and some people do—but our experience is that this domain-extension has been tainted from the SEPOV and is best avoided. An additional note about domain extensions: There is evidence that Google gives a ranking advantage to domain names with .edu or .gov extensions. However, these extensions are only available to recognized educational institutions or US government entities, respectively, and are off-limits to most of us (though they are extremely valuable sources for incoming links).

Review

In this lesson you learned:

  1. The importance of having the right domain name.
  2. How to choose the right domain name.
  3. The importance of choosing domain names that appeal to both customers and search engines.
  4. How search engines view the different domain name extensions with respect to rankings.

Congratulations, you are close to completing the book portion of this SEO course. Only two more lessons to go! …now you’re ready to learn how Web Site Architecture affects your ability to rank well in the search engines.

Top Web Directories

Best General Directories

Directory Add Cost Comments
DMOZ Add Free Could take up to a year or more for your site to be listed.
Yahoo Directory Add $299/year for commercial sites, $600/year for adult sites Your site will also be listed in several international Yahoo directories.
Best of the Web Add $149.95/year or $299.95 one-time. As name indicates, was once an award site.
HotVsNot.com Add 6 months = $40, 12 months = $60, 3 years = $150, Permanent = $200, or Free with a reciprocal banner ad. A top-tier directory with hundreds of thousands of backlinks, an aged domain, high traffic and very recent Google Cache dates on interior category pages. It also has a growing user community and DOES NOT accept every site.
JoeAnt.com Add $39.99 one-time fee, free for editors Takes about a week to be listed either way.
GoGuides Add $69.95 per URL. Sites are instantly included. Money back guarantee if your submission is declined.

General Directories

Directory Add Cost Comments
Universal Business Listing Add $75 per year Excellent local search directory.
Rubber Stamped Add $39.95 one-time review fee, refundable if rejected. Nice, growing directory.
Skaffe Add One-time $49.99 fee, or free for submissions made over weekends. All submissions are reviewed; those not accepted are assessed a $10 processing fee. Becoming an editor is not difficult and is an easy way to get listed for free.
Massive Links Add Directory listing is a one-time $39.99 fee with 4 links; full refund if you’re not accepted. Premium listing is $79.99 with 4 links and a bigger display above basic listings for one year; renew to keep the premium listing or it defaults to a regular directory listing. Non-profits are still free. Permits you to write your own keyword-rich link anchor text.
Starting Point Add $99 annual fee. Quality directory, many high-PageRank pages. Not widely known, so may be a link your competitors won’t have yet.
Gimpsy Add $49 to be reviewed within 72 hours, $20 refund if site is turned down. Free submit available, but can take months to be listed. Focused primarily on commercial sites. Can be difficult to find the proper category to submit to.
Web Beacon Add $39.99 one-time review fee, free for editors. Originally created from the GoGuides database, expanded since.
Site Snoop Add $15 one-time fee. Then you can add up to 4 additional URLs in your listing for an additional $5 per URL. Currently does not accept mortgage/loan sites. There tends to be a lot of spam in that industry, so this helps improve the overall quality of the directory.

Top Social Sites

Social Site Join Type Comments
Twitter Join Social Communication Share and discover what’s happening right now, anywhere in the world.
Facebook Join Social Networking Facebook helps you connect and share with the people in your life.
LinkedIn Join Social Networking Over 85 million professionals use LinkedIn to exchange information, ideas and opportunities
Google+ Join Social Networking Share updates, photos, videos, and more. Start conversations about the things you find interesting.
Digg Join Social News Still a top social bookmarking site. Hitting the front page can bring massive server-crashing amounts of traffic.
Delicious Join Social Bookmarking Links are nofollow but the site is still a good source for traffic and to connect with other social users.
StumbleUpon Join Social News Can be a great source of ongoing traffic.
Pinterest Join Social Image Bookmarking Now the #3 social bookmarking site behind only Facebook and Twitter. If you are in an image-rich industry, this is a must-use for everyday social marketing.
eBaum’s World Join Social Video High-traffic social bookmarking site perfect for link bait submission and blog post promotion.
Folkd.com Join Social Bookmarking A top global social bookmarking site that’s been around since 2006. Links are nofollow but can lead to good traffic if you hit the front page.
BuzzFeed Join Social News Breaking top traffic social news site. Provides an advertising option to drive traffic to highlighted blog posts or link bait pieces for initial viral buzz.
Reddit Join Social News Community groups on many varied topics.
BlinkList Join Social Bookmarking Bookmarking service, also allows you to save copies of whole web pages.

Top Blog Directories

Blog Catalog Add Free. Probably the biggest and most successful blog directory currently
Eatonweb Add $34.99 Good blog directory, even if you have to pay.
Blog Flux Add Free. Links to your blog homepage are redirected, but links to internal posts are direct.
Best of the Web Blogs Add $149.95/year or $299.95 one-time. Great directory, we recommend both a regular listing and a listing for your blog.
GetBlogs Add Free. Free, but they like you to link back to them.

Top Article Sites

Ezine Articles Add First 10 articles submitted are free. If they like your articles, they’ll let you submit unlimited articles for free.
GoArticles Add Free. Unlimited numbers of articles can be submitted for free. Links allowed in sig line but not within content. As of now they are no longer accepting payment for featured articles.
Article Dashboard Add $47 Unlimited numbers of articles can be submitted once you pay.
Buzzle Add Free. Unlimited numbers of articles can be submitted for free. Must apply to become an author. Buzzle.com no longer accepts marketing type articles including those with external links.
Article City Add Free. Unlimited numbers of articles can be submitted for free. No anchor text links, just URL links.
Idea Marketers Add Free. Unlimited numbers of articles can be submitted for free.
Article Alley Add Free. Unlimited numbers of articles can be submitted for free. Can take awhile for your application to be approved.
Article Gold Add Free. Article Submissions and New Registrations are being accepted again after a major site update. Sign up disabled until further notice.
Amazines Add Free. Unlimited numbers of articles can be submitted for free.

Listings on the Search Engines

Google+ Local Add Free You will need to do a search and locate your business’ local profile with Google. Once you’ve clicked through to its listing there will be a note: “IS THIS YOUR BUSINESS?” with a Manage this page button to click. You will then be asked to sign into your Google account; if you don’t have one, you will need to register with them first. To learn more about managing and boosting your local listings with the major search engines, check out our book, Get on the map! The Complete Guide to Getting Your Local Business Online.
Bing’s Local Business Portal Add Free Locate your business within Bing’s search results using the URL we’ve listed in the Add column. Then you will be allowed to claim it once you’ve either logged in or registered with Bing. Be sure to read our Your Complete Guide to Bing’s New Local Business Listing Portal to get started on the right foot.
Yahoo Local Add Varies Free for a basic listing or $9.95 a month for an enhanced listing.

Internet Yellow Pages (IYPs)

SuperPages.com Add Free Be consistent with ALL the data across ALL your listings, especially when you’re listing with any Internet Yellow Pages, because many other sites will pull your data from these places, and if it’s incorrect it will negatively affect your local rankings.
Yellowbook Add Free You first need to register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Yellow Pages Add Free Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile. You will be asked to register for their site or you can opt to login with your Facebook login profile. We suggest creating a separate profile for any business listing that you’re doing for clients. It’s even advisable to create unique login details for your own business in case you outsource the management at a later date.
dexknows Add Free Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile. You will be asked to register for their site.
Yellowbot Add Free Start with a search for your business using your business’s phone number at the bottom of the homepage. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile. You will be asked to register for their site.
White Pages Add Free This site is currently powered by Yelp.com with the map provided by Bing so if you have a listing with Yelp.com then you’re good to go. Start with a search for your business. If it does come up then you will be able to click through to Yelp to edit it as needed. If it doesn’t come up then click the Add link we’ve provided you to get it registered.

Best Local Directories

CitySearch Not taking new additions at this time. We will keep you posted. Free This site is a very important data source for local search; it feeds business information to many other sites. This site is powered by CityGrid, so if you have a listing with them then you’re good to go. Start with a search for your business. If it does come up then you will be able to click through to edit it as needed. If it doesn’t come up, we’ve contacted CityGrid about what to do next and are waiting to hear back. We’ll update you here as soon as we know more.
Yelp Add Free Yelp powers Bing’s Local Listings so this is a site you cannot afford to ignore! You first need to register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Merchant Circle Add Free You first need to register as a user and then as the owner you can add your local business to their listings. Always start with a search of their site for your business because they have most businesses already pulled in and unclaimed.
MojoPages Add Free First, you will need to register as a user. Then you can do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Local.com Add Free You first need to register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Kudzu Add Free First, register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
HotFrog Add Free You can create your profile right away then access it anytime to update your products and services, add images and announce special offers.
Manta Add Free You can add your company right away without going through the registration process. Be sure to search their site for your business first so that you don’t create a duplicate listing.
Insider Pages Add Free Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Best of the Web Local Add $9.95/month Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Zip Local Add Free First, register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
Brownbook.net Add Free Start with a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.
City Data Add Free This directory is only for brick & mortar businesses, no online businesses. You must have a real street address and business phone numbers. Make sure your profile is as in-depth as possible and provides unique information about your business. Do not simply copy a description you already used elsewhere (for example on your Web site). Add photos. Generic profiles, not providing interesting information about your business will be periodically purged without any warning.
Tupalo Add Free First, register as a user. Then do a search for your business. If it doesn’t come up in the search then they will allow you to Add it to their listings. If it does come up then claim it as its owner and complete the profile.