In my first post on SEO myths, I thought it best to keep the list to around 10. Had I tried to cover every myth I'm familiar with (either through work or study), I'd have been at risk of reaching retirement age before finishing!

Seriously, there are so many circulating in cyberspace that it’s amazing we SEOs aren’t vilified and shunned by web designers, clients, programmers & employers.

As promised, here are a few more:

  1. Having an XML Sitemap will boost your Google rankings – all an XML sitemap does is list your site’s pages in one place, helping search engines find and index them. Rankings depend on what Google finds on those pages. (There’s a minimal sitemap sketch at the end of this post.)
  2. Having country-specific sites creates “duplicate content” issues in Google – Google, Yahoo & Bing et al know how to identify a country of origin by IP address. And since these three search engines all have multiple country-code TLD websites (e.g. a .com, .co.uk, .co.jp etc.), wouldn’t they have fallen foul of a duplicate-content rule like this years ago?
  3. Googlebot doesn’t read CSS – it does, which is bad news for black-hat SEOs who still think that hiding text by matching it to the background colour of a page is a justifiable trick.
  4. Linking out (such as to Google.com) helps rankings – the theory goes that failing to link out makes Google think you’re trying to hoard PageRank, and yes, that could count against you. But since it’s the INBOUND links rather than the OUTBOUND links that carry the weight, linking out isn’t going to have much influence either way.
  5. Italicizing or bolding words encourages Google to focus on them and rank your site more highly for those words – nope. Bold, italics, underlines etc. are more for human eyes than for Googlebot. Take this article as an example – I’d risk ranking for some ridiculous keywords if bold tags were a deciding factor!
  6. The “Disallow” directive in robots.txt can get (and keep) pages out of Google’s index – it can’t: Disallow only stops Googlebot crawling a page, so it won’t remove anything that’s already been indexed. The robots.txt file isn’t really an essential part of SEO (though a correctly-coded one does no harm). Use the “noindex” meta tag on a page-by-page basis, as required – there’s a quick way to spot-check which pages carry it sketched after this list.
  7. Google will not index search results pages on your website – when I worked at First Internet, this misconception (nullified during 2008, by the way) from a previous employee led to some pretty messy Google Analytics reports and a worrying impact on SEO performance. A quick chat with the web design team had them adding the “noindex” meta code to search results pages, and the problem was quickly resolved.
  8. Placing links in tiny font at the bottom of your homepage is a good way to raise rankings – this tactic is a waste of time, and it looks silly and amateurish.
  9. It’s important to change the links on your homepage as often as possible – not the case. Plenty of sites have been successful without doing this. Whilst fresh links may be nice for web surfers, constantly changing them is effort that could be put to better use elsewhere.
  10. Tidying up URLs on your site is “cloaking” – if you’re just making them tidier (by removing session IDs, or cleaning up the URLs generated by an automated CMS), there’s no problem. Just make sure your 301 redirects are all working, however: one broken 301 could lead to thousands of broken links! (A quick redirect-checking sketch rounds off this post.)
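
Since myth 1 came up, it’s worth seeing just how little an XML sitemap actually contains. Below is a minimal sketch using Python’s standard library; the URLs are placeholders rather than a real site, but it shows that a sitemap is simply a list of page addresses for crawlers to find.

```python
# Minimal sketch: build a basic XML sitemap with Python's standard library.
# The URLs below are placeholders - swap in your own site's pages.
import xml.etree.ElementTree as ET

pages = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/contact/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Once generated, the file would normally sit at the root of the site and be referenced from robots.txt or submitted through Google Webmaster Tools. None of that changes what Google makes of the pages themselves.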
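
On myths 6 and 7, the fix in both cases is the “noindex” directive rather than robots.txt. For anyone wanting to spot-check which pages actually carry it, here’s a rough sketch. It assumes the third-party requests package, uses placeholder URLs, and only does a crude pattern match on the HTML, so treat it as a starting point rather than a polished audit tool.

```python
# Rough sketch: spot-check whether pages carry a "noindex" directive,
# either in a robots meta tag or an X-Robots-Tag response header.
# Assumes the third-party "requests" package; the URLs are placeholders.
import re
import requests

urls_to_check = [
    "https://www.example.com/search?q=widgets",
    "https://www.example.com/about/",
]

# Crude pattern for <meta name="robots" content="...">; a real audit
# would use a proper HTML parser.
meta_robots = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in urls_to_check:
    response = requests.get(url, timeout=10)
    header = response.headers.get("X-Robots-Tag", "")
    match = meta_robots.search(response.text)
    directives = (header + " " + (match.group(1) if match else "")).lower()
    status = "noindex" if "noindex" in directives else "indexable"
    print(f"{url}: {status}")
```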
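
Finally, on myth 10, here’s one way you might sanity-check that tidied-up URLs still 301 to a working page. Again, this is a hedged sketch: the URLs are placeholders, the requests package is assumed, and a real audit would pull the old URL list from your server logs or analytics package.

```python
# Quick sketch: confirm that old, untidy URLs return a 301 and that the
# redirect target actually resolves. Assumes the third-party "requests"
# package; the URLs below are placeholders.
from urllib.parse import urljoin

import requests

old_urls = [
    "https://www.example.com/index.php?page=about&sessid=12345",
    "https://www.example.com/index.php?page=contact&sessid=12345",
]

for old_url in old_urls:
    first_hop = requests.get(old_url, allow_redirects=False, timeout=10)
    if first_hop.status_code != 301:
        print(f"WARNING: {old_url} returned {first_hop.status_code}, not a 301")
        continue
    # Location may be relative, so resolve it against the original URL.
    target = urljoin(old_url, first_hop.headers["Location"])
    final = requests.get(target, timeout=10)
    print(f"{old_url} -> {target} ({final.status_code})")
```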