Search engine optimisation techniques are constantly evolving in response to changes made by search engines such as Google, Yahoo and MSN. Google in particular has come to be seen as the most sophisticated and advanced of them.
Optimising websites for Google is becoming harder and harder. It is no longer just a case of adding keywords to your HTML tags, uploading your files and waiting for the results.
This type of optimisation, commonly referred to as on-page optimisation, will only ever be 20% effective at achieving rankings for keywords that are even mildly competitive. Those of us who aced maths in school will know this leaves us with 80% unaccounted for.
This 80% corresponds to off-page optimisation: the ranking factors that live outside your own pages, chiefly the links other websites point at yours and the anchor text those links use.
Off-page optimisation is now by far the dominant factor in deciding where a site will rank in Google. That, then, is what the 80/20 rule means.
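As a rough illustration of what an 80/20 weighting means in practice, consider a toy scoring model. This is purely a sketch of the idea: the weights and scores are invented for the example, and Google's actual ranking formula is not public.

```python
# Toy illustration of the 80/20 rule: a combined relevance score that
# weights off-page factors four times as heavily as on-page factors.
# The weights and the 0-1 scores below are invented for illustration.

ON_PAGE_WEIGHT = 0.2
OFF_PAGE_WEIGHT = 0.8

def combined_score(on_page: float, off_page: float) -> float:
    """Blend on-page and off-page scores (each in the range 0-1)."""
    return ON_PAGE_WEIGHT * on_page + OFF_PAGE_WEIGHT * off_page

# A perfectly tuned page with no inbound links scores only 0.2...
lonely_but_polished = combined_score(on_page=1.0, off_page=0.0)

# ...and is beaten by a mediocre page that the web links to heavily.
mediocre_but_popular = combined_score(on_page=0.5, off_page=0.8)

print(lonely_but_polished, mediocre_but_popular)
```

Under this (hypothetical) weighting, no amount of on-page polish can lift a page past a competitor with strong off-page signals, which is exactly the point of the 80/20 rule.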
What's the logic behind this? Why does Google give so much weight (80%) to off-page optimisation efforts and so little (20%) to on-page optimisation? Simply put, it's all about the quality of its results.
On-page optimisation is completely controlled by the webmaster and can thus be abused by an unscrupulous one. Off-page optimisation is controlled by other webmasters, other websites and, indeed, the Internet in general. This makes it much harder to use underhand or spammy off-page methods in the hope of gaining an unfair advantage in the Google results pages.
Let's say Google sees a link from site A to site B, with the link text, 'data recovery london'. This means that site B has just become more relevant and thus likely to appear higher in the rankings when someone searches for 'data recovery london'.
Site B has no control over site A (in most cases) and Google knows this. Google can then look at the link text and ask itself: why would site A link to site B with the specific words 'data recovery london' if site B weren't about 'data recovery london'?
This holds in most cases, but not all, because webmasters can own multiple websites and cross-link them with keyword-rich anchor text. There are only so many sites and cross-links any one webmaster can manage, though, so as the number of inbound links with keyword-rich link text increases, so too does the likelihood that genuine linking, rather than self-made cross-linking, is taking place.
Imagine hundreds or even thousands of sites all linking to a website X with variations of 'data recovery london' phrases as the link text. Google can be pretty sure that website X is about 'data recovery london' and feel confident about placing it in the top 10 results. Off-page ranking factors such as links are simply the most reliable way of checking what a site is about and indeed how well it covers what it's about.
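The intuition above can be sketched as a toy scorer that tallies inbound anchor text, counting each linking domain only once so that a single webmaster's cross-linked network carries limited weight. This is an illustrative sketch only: the link data is invented, and real search engines use far more sophisticated signals than a simple domain count.

```python
# Toy model: score a target site's relevance to a phrase by counting how
# many *distinct* linking domains use that phrase (or a variation of it)
# as anchor text. Counting domains rather than raw links limits the
# influence of one webmaster cross-linking their own network of sites.
# The link data below is invented for illustration.

inbound_links = [
    # (linking domain, anchor text)
    ("site-a.example", "data recovery london"),
    ("site-a.example", "data recovery london"),   # repeat from same domain
    ("site-b.example", "london data recovery services"),
    ("site-c.example", "data recovery in london"),
    ("site-d.example", "click here"),
]

def anchor_score(links, phrase):
    """Count distinct domains whose anchor text contains every word of the phrase."""
    words = set(phrase.lower().split())
    domains = {domain for domain, anchor in links
               if words <= set(anchor.lower().split())}
    return len(domains)

# Three distinct domains use a variation of the phrase, so the score is 3;
# the duplicate link from site-a.example adds nothing extra.
print(anchor_score(inbound_links, "data recovery london"))
```

In this sketch, the second link from site-a.example is ignored, mirroring the idea that many independent sites agreeing on a phrase is far stronger evidence than one site repeating it.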
This reliance on hard-to-cheat off-page factors is what produces the quality search results we all know and use every day.
The moral of the story, from an SEO point of view, is to spend less time on those little website tweaks which you think might make a big difference (but won't). What really counts is how the web 'sees' your website. The more quality, keyword-rich incoming links your website has, the better the web's view of it will be, and therefore the better Google's view of it will be.