
YouTube Video SEO Case Studies

At First Bathrooms, we’re currently working on a series of videos to showcase our bestselling products. As SEO Manager, I’m also working on the search engine optimization of these videos.

This case study will be published once completed. In the meantime, feel free to check out some of the videos that will form the case study.

More on this at a later date!


Twitter For Business

Search marketing doesn’t begin and end with keywords and organic search results. An effective strategy blends as many aspects of SEM as possible to create the perfect recipe for customers.

To use the cooking analogy further, the exact amounts of each ingredient will vary according to your business model, the products and services you provide, and the consumers (or “potential customers”) you’re targeting.

Let’s look at a straightforward “checklist” that I recommend to clients:

Consistent Branding

Every time your target market visits your Twitter page, you want them to instantly know that it’s yours, rather than a competitor’s or a private Twitter user’s. Ensure your logo and any corporate colours are present (your profile pic should ideally be your logo, and if you need to modify it to fit the box, keep it as close to the original as possible).

When creating a background, it’s worth knowing that many companies use a customised design (which can be re-used on other social media profiles too). Many choose to include an address or a “corporate message”. Just be aware that few visitors (if any) will read the content of your customised background – its purpose is to help visitors realise whose Twitter page they’re on.

When you design a background, remember that Tweets appear in a centralised column on PC screens, so any key information needs to be towards the edges. You might need to view your Twitter page on a couple of different-sized screens just to be sure users can see everything.

Twitter Usernames and Bio

Again, you want these to be instantly recognisable as being yours.

Many opt to use the company name as their username (the “@xxxxxx”). Others may choose something related to their product or service. Remember that Twitter usernames should be:

  • Easy to type from memory
  • Short and sweet
  • Distinctive, and
  • Unlikely to infringe any trademarked brand names

When writing a biography, keep it short and sweet, including SEO-related keywords if possible (your SEO staff or agency will be able to help with this).

When choosing a username, bear in mind that spam accounts usually consist of a name followed by a sequence of numbers (for example @crazyfred1234567). Avoid usernames like these, even if someone recommends including your phone number in your Twitter username.

When & What To Tweet

There are any number of schools of thought regarding Twitter “netiquette”. Always remember that the tone, language and content of your tweets will form the “official statement” in the eyes of many. If your social media administrator tweets in a laughing, joking manner, this could backfire if someone takes a casual tweet as an official opinion.

Short and sweet is the safest option, in my eyes. The less said, the less there is to misinterpret.

In terms of regularity of tweeting, many companies tweet multiple times per day, if only to remind the web that the TwitMaster General is at their desk. Tweeting a special offer, new product or service, blog post, or retweeting an interesting news story (which relates to your industry) is a good way to encourage interaction. And of course, interaction is a key part of tweeting.

Choosing What To Tweet About

Twitter includes a list of “trending topics” on every user’s homepage, and this can be customized from “Worldwide” right down to city level in many cases.

Trending Topics on Twitter

Twitter provides a list of customizable “trends” on users’ homepages.

As topics on Twitter change rapidly, the results can seem a little capricious at times. One of my favourite tools for checking which Twitter topics are popular allows users to view trending topics on a map (using data supplied when Twitter users register and/or GeoIP locating). Tweets update in near real-time, and users can also see which Twitter users are popular, not just subjects and hashtags.

Twitter trends shown in map view.

What Not To Tweet

It’s easy to get carried away with Twitter. Businesses should always remember that a line must be drawn between the official tweet and something fun which could go too far. Always remember:

  • An inadvertent tweet containing contact details, or simply the name of a member of staff, can backfire if that staff member is inundated with calls, tweets, emails or Facebook friend requests.
  • A controversial opinion, tweeted with the best of intentions, could end up harming your reputation, especially as such faux pas are generally remembered months or years later.
  • Although it might sound silly, photos of staff parties could be a step too far if those in the photo are “a little worse for wear”, looking or acting unprofessionally, or simply unhappy that their photo has been publicised as part of your social media strategy.

SEO Myths – Part 2

In my first post on SEO myths, I felt it better to leave the list at about ten. Had I tried to list all the myths I’m familiar with (either through work or study), I’d have been at risk of reaching retirement age before finishing!

Seriously, there are so many circulating in cyberspace that it’s amazing we SEOs aren’t vilified and shunned by web designers, clients, programmers and employers.

As promised, here are a few more:

  1. Having an XML Sitemap will boost your Google rankings – all an XML sitemap does is list your site’s pages in one place, helping search engines find and index them. Rankings will depend on what Google finds on those pages.
  2. Having country-specific sites creates “duplicate content” issues in Google – Google, Yahoo, Bing et al know how to identify a country of origin by IP address. And, since these three all operate multiple country-code TLD websites themselves, wouldn’t they have fallen foul of a duplicate content rule like this years ago?
  3. Googlebot doesn’t read CSS – it does, which is bad news for black-hat SEOs who still think that hiding text by matching it to the background colour of a page is a justifiable trick.
  4. Linking out helps rankings – failure to link out makes Google think you’re trying to hoard PageRank, and yes, that could have a negative impact. But since it’s the INBOUND links rather than OUTBOUND links which carry the weight, linking out isn’t going to have much influence.
  5. Italicising or emboldening words encourages Google to focus on them and rank your site more highly for those words – nope. Bold, italics, underlines etc. are more for human eyes than for Googlebot. Take this article as an example – I’d be risking ranking for some ridiculous keywords if bold tags were a deciding factor!
  6. The “Disallow” directive in robots.txt can get (and keep) pages out of Google’s index – the robots.txt file isn’t really an essential part of SEO (though a correctly-coded one does no harm). Use the “noindex” meta tag on a page-by-page basis, as required.
  7. Google will not index search results pages on your website – when I worked at First Internet, this misconception (nullified during 2008, by the way) from a previous employee led to some pretty messy Google Analytics reports, and a worrying impact on SEO performance. A quick chat with the web design team had them adding the “noindex” meta tag to search results pages, and the problem was quickly resolved.
  8. Placing links in a small-sized font at the bottom of your homepage is a good way to raise rankings – this tactic is a waste of time. Plus, it looks silly and amateurish.
  9. It’s important to change the links on your homepage as often as possible – not the case. Many sites have been successful without doing this. Whilst changing them may be nice for web surfers, constantly changing these links is a waste of effort which could be put to better use elsewhere.
  10. Tidying up URLs on your site is “cloaking” – if you’re just making them tidier (by removing session IDs, or URLs generated by an automated CMS), there’s no problem. Be sure your 301 redirects are all working, however: one broken 301 could lead to thousands of broken links!

In a surprising admission, Google revealed that they’re still holding some of the “stolen” wi-fi data acquired by their StreetView cars. Although they’ve attempted to mitigate this by stating it’s actually a small portion of what was originally gleaned illegally, the reaction hasn’t been particularly charitable.

Google Streetview Camera Car

Google’s controversial StreetView data logging has become a drama that may dwarf even Coronation Street!

Google’s Great Wi-Fi Data Theft

Initial controversy regarding Street View began when it was realised that images of public drunkenness (including urination), arrests, the location of domestic violence shelters and people leaving places such as adult theatres were likely to be published online. Other concerns centred on images of military bases (which included the British SAS base in Credenhill, despite the Regiment’s presence in Hereford being far from secret) potentially revealing locations and possibly “secret details”. In most (if not all) cases, Google complied and removed such information.
A short time later, it was revealed that the Streetview cars had obtained the details of “open” (unencrypted) wi-fi hotspots. Although admissions in April 2010 claimed that this data was harvested “by accident”, revelations soon surfaced that the software was devised by an engineer in 2006, and had been recording such data for years. It’s also believed that e-mails were downloaded, violating some people’s privacy.

The resulting enquiries, including those by the Information Commissioner in the UK, didn’t do too much damage initially; however, Google were ordered to delete this data. Their admission on July 27th that some of this data still existed added further fuel to the pyre. Sceptics and opponents of Google and their “Do No Evil” motto will be quick to jump on this admission, also using it to attack the ICO’s perceived lack of action the first time around.

Why Would Google Steal Data?

Many companies who offer open wi-fi hotspots, or hotspots accessible through a simple registration process, stood to gain from the public knowing where these hotspots can be found (Tesco and McDonald’s make it clear that free wi-fi is available in their UK locations).

However, by logging the locations of private wi-fi networks (by which I mean households and businesses who restrict access to residents and/or employees), Google had the potential to use this absolute mountain of data to target mobile phone adverts, and perhaps even create new products based on it. See Jason Lewis’ article in the Daily Mail from 2010.

Those of us with Android smartphones will know that location-based apps like Google Maps recommend switching on wi-fi to aid location plotting, matching the signal strength of mapped wi-fi networks to approximate the phone’s position.

Whilst some organisations such as Experian Hitwise record the web-surfing behaviour of the population, and make this data available to fee-paying customers to shape their online marketing, that data is anonymous (in most cases it identifies only the town/region and the age bracket of the registered ISP customers) and says little more. The value of knowing the browsing habits of those at a specific address is obvious.

Had this data-gathering remained within Google’s walls, I suspect that use of it would have needed to be extremely careful. If they’d suddenly created a product that appealed so specifically to the public, Brin, Page and Schmidt would hardly be able to claim that crystal balls were being issued to in-house clairvoyants…

And let’s not even try to imagine the furore if the login details of private broadband routers had been hacked and published online…

Rumours, Allegations & Conspiracy Theories

So far, nobody has proven that any data Google took from UK residents has ever been used. In other countries, however, investigations have shown that some very specific personal data was recorded, which of course raises the question: why would this data be harvested, if not to be used or sold?

Some have already drawn parallels with the Phone Hacking Scandal, citing a link between David Cameron and Google (through the wife of one of his former advisors, Steve Hilton, and a number of meetings between Cameron and Google). The blatant rumour here is that some senior Ministers at “UK, plc” may have known that our data was up for sale to the highest bidder. The Tin-Foil Hat Crew will doubtless have connected the dots and assumed that Whitehall would use the data in the pursuit of a Big Brother-style data repository on us.

If Google StreetView perfects mind-reading, will this become essential fashion?

Keywords and keyphrases are major factors in an SEO campaign – let’s face it, that’s how we target our potential market and attract customers.

So, we’ve identified our market. We’ve got a website ready. And after using keyword tools like Google AdWords, we’ve got a list of a few hundred potential phrases. How do we select the best ones?

I’ll explain by using the example of a story:

A guy walks into a nightclub. At the bar are three women. Woman A is the most beautiful girl he could ever imagine seeing. Woman B is very attractive, but perhaps not quite in the same league as Woman A. Woman C is unlikely to get any numbers given to her.

The guy has a decision to make – who should he chat up? If he walks over to Woman A, she might accept his number, but since every other guy in the bar has done the same thing, his chances are slim. Let’s also bear in mind that some of the other numbers she’s got are men with big houses, massive bank accounts, expensive cars etc etc.

If he approaches Woman B, he knows that fewer men will have chatted her up (bear in mind she’s sat next to Woman A), so his chances are better.

If he approaches Woman C, well, he won’t leave the bar in one piece…

This illustration, odd though it sounds, applies to selecting your keywords for an SEO campaign.

It’s all very well aiming high and going for the highly competitive keywords, but you’re in competition with the big companies in your marketplace, and success is likely to be slow.

Selecting a spread of keywords is the better option. Targeting some big keywords is fine, but these are unlikely to bring the immediate ROI that every business is seeking. Choosing phrases with less competition offers a better chance of achieving higher rankings and a higher percentage of click-throughs.

Statistically, the older a website is, the more backlinks it will gain. As your site gains more, you’ll be in with a better chance when targeting the highly competitive keyphrases that are likely to give you the boost that you want.

Feel free to check out my page on keyword research.

SEO Myths – Part 1

With SEO being one of those newer, apparently “controversial” professions, it’s unsurprising that a number of myths float around, confusing the hell out of companies, other SEOs and probably even Google themselves.

Sadly, despite our best hopes, some misconceptions refuse to die, rather like a groaning zombie scaring the living daylights out of us when we just want some peace and quiet.

SEO myths are like zombies – undying, annoying and not particularly loveable

So, before I run around with some unlikely weapon trying to save mankind from an invasion of the undead, here are the first of many classic myths surrounding SEO:

  1. SEO is a black art – SEO is a thriving industry. If it were illegal, why would Google publish guidelines?
  2. High PageRank = high rankings – PageRank is an analysis of backlinks. Like many aspects of SEO, it’s simply one piece of a much larger jigsaw puzzle. I once optimised a page with a PR of between 2 and 3 to outperform entire sites whose homepage PR was 5 and above, and I’m sure many other SEOs have done similar things.
  3. H1 tags are a crucial element for SEO – not always. I recommend that they are included for usability reasons, and should be restricted to a single H1 per page.
  4. The personalised search results that Search Engines provide mean SEO is now irrelevant – not true. Although some search results show minor differences, it’s negligible. There are ways to work around this, too.
  5. Meta tags will boost your rankings – a load of twaddle. In the early days, sites abused Meta tags to try and influence site rankings. Google got wise to this very quickly, so Meta tags have pretty much no influence. At best, some directories may use them to classify your entry into their listing, and even this is more usability than SEO.
  6. You should end your URLs in “.html” – the file extension itself has no bearing on rankings.
  7. If you define a Meta description, Google uses it in the snippet – I tested this myself in 2007 and disproved it to colleagues. Many others have reached the same conclusion.
  8. SEO is a one-time activity – the search landscape is constantly changing, and Google regularly updates their algorithms. So no, you can’t do it once and sit back. It’s like painting the Forth Railway Bridge in Scotland: once finished, it was time to start again.
  9. Keyword density is key – no, no and double-no. You should ALWAYS write content intended for human eyes, not Googlebot. By writing content that reads naturally (which Google recommends, and so do reputable SEOs), you’ll remove the need to focus on this.
  10. Great Content = Great Rankings – as I said earlier, content is just one piece of a much larger jigsaw puzzle. One lovely shiny piece matters little if the other pieces are damaged or missing.
  11. Google uses their Analytics software to obtain information about you or your site users – anyone who tells you this probably wears a tin-foil hat to stop the CIA reading their mind. Google don’t do this, and they’ve said as much. If they were using the data maliciously, someone would have blown the whistle by now.

Suffice it to say, there are far more myths than those listed above. And however many I’ve found, I’m sure there are plenty more.

A good SEO will talk clients and employers through any misconceptions, and provide official statements or provable third-party evidence (ideally backed by their own professional experience) to dispel these misunderstandings. In some ways I feel I need to help bust these rumours before the day arrives that we SEOs become social pariahs.

Google Panda Update 3.3 Announced

Earlier, Google announced on their “Inside Search” blog that amongst the 40 search updates being rolled out for February 2012, Google Panda was receiving a tweak to bring it up to version 3.3.

Google Panda was one of those updates that was always going to raise controversy amongst website owners, particularly those relying on adverts for revenue.

Google Panda

Google declared war on "low quality" content with their Panda update - sites with heavy on-page ads got burned.

The update heralded a very noticeable change in Google’s search results 12 months ago. Many of us SEOs spotted that “content rich” websites like news sites and social networks were suddenly climbing the SERPs, whilst websites whose on-page advertising was considered excessive were gradually being downgraded.

In some ways, the Panda update was likely to be welcomed by many. Web users usually dislike excess advertising on web pages, not only because it can spoil the look and feel of a page, but because the individual request each ad must make on page-load means pages load more slowly (which is a ranking factor in Google, despite what some may claim…). And the annoyance of losing your page if an ad is accidentally clicked is something we’ve all experienced at some point.

In other ways, many websites whose revenue depends on these ads have since seen their rankings, visitors and ultimately revenue fall sharply. Sadly, this will certainly have impacted the website where I began developing my SEO skills. It would also impact new entrepreneurs who may choose to create a new website and offer advertising as their source of revenue. Fortunately, the website I built whilst at ReviewCentre was closed down (by myself) before the Panda started prowling round…

One theory I had was that websites displaying ads from platforms like Microsoft AdCentre and Yahoo Search Marketing might shift focus to Google ads instead, believing that sites using a Google advertising product wouldn’t be penalised by Google. With 12 months of data now available, commentators will have a much clearer idea of how many abandoned their old ad providers in favour of Google…

Ever since Google announced impending changes to their privacy policy, controversy has been flying about the perceived “threat” to personal data.

The Google Privacy Policy Change – What Does it Mean?

The change to the Google Privacy Policy essentially means that login data gleaned from “cookies” created when logging into Google’s products (including Gmail, Google Analytics, Blogger, YouTube and Panoramio) will be shared across all Google platforms, tailoring personalised search results for individuals and allowing a single login to give access to your accounts across Google.

Viviane Reding, Vice-President at the European Commission and European Commissioner for Justice, Fundamental Rights and Citizenship, was quoted today as saying that “transparency rules have not been applied” by Google (BBC News and Digital Trends are just two sites referring to her statement).

Google has stated more than once that it believes the new policy complies with European law, and has pledged to implement the changes regardless of Reding’s statement.

Now, call me “devil’s advocate” if you wish, but it was always my belief that Google shared data across its platforms from time to time. I initially suspected this when I logged into my Gmail account and could switch to Analytics, YouTube and Panoramio without having to log back in. As I set up more accounts, Google spotted this and would automatically link them, most noticeably when I joined Google+ last year and my existing Google profiles were automatically listed on my profile.

Now, I don’t see this as a bad thing. Sharing a single login across Google’s services saves me time, allows followers to find me easily, and just makes things more convenient. And, since I always share the same details when joining websites, I’m hardly worried about my details being used by Google. Yes, I’d be unhappy if Google entered a partnership with another company and shared my details, but this is surely covered by Data Protection Acts in the EU and beyond?

Many of us have accounts on websites such as Twitter, Facebook, LinkedIn et al, but do we consider that, unless we use pseudonyms to hide our true identity, people such as spammers, stalkers and some of the more annoying web users out there can all find us if they want to? I’ve had spammers on my personal blog, and earlier today someone sent a less-than-welcome photo direct to my personal e-mail, presumably after making an educated guess as to what my address might be.

My personal opinion is that the New Privacy Policy is hardly an issue on the scale of the backlash I’ve seen on the web. I’d even argue that the perceived or theorized risks to our personal details are unlikely to transpire on the scale that some believe.