Google and The Link Spectrum


Google’s most recent major algorithm change has now had a week or so to settle in, and its effect on the rankings of some websites is starting to reveal more about Google’s aim with this latest update. Google never reveals exactly how its algorithm works, but the major SEO commentators who track changes in website rankings before and after a major update are piecing together more information, and I wanted to share a key point with you.

As always, Google has looked at ways to penalise websites that engage in unethical (or “black-hat”) manipulation of the search engine rankings because, sadly, some such sites still benefit from these techniques, which include over-optimisation and building massive quantities of links from non-reputable sites. Conversely, the update will reward sites that use ethical (or “white-hat”) techniques to promote themselves to a high-ranking position.

One of the key factors in this latest update (dubbed “Penguin”) is the pattern of links that have been created. Anyone involved in SEO will be aware that Google has for some time advocated a natural pattern of link building, which is why reputable SEO experts manually build links from a range of high-quality websites.

What the Penguin update appears to do is analyse various factors of a site’s links in order to categorise them by how natural they appear to be. These factors are:

  1. main keyword use in link anchor text – over-use of the main keywords would indicate an unnatural pattern of link building
  2. proportion of links coming from unrelated sites – a high proportion from unrelated sites could indicate a lack of relevancy
  3. percentage of URL-based links – too few would look suspicious, since links created naturally tend to include a fair percentage that use the plain URL as their anchor text
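
To make the three factors concrete, here is a toy sketch of how a link profile might be scored against them. Everything in it (the thresholds, the scoring, and the `link_profile_score` name) is my own invention for illustration; Google’s actual Penguin signals are not public.

```python
def link_profile_score(links, main_keyword):
    """Score a link profile from 0 (unnatural) to 3 (natural) against the
    three factors above. `links` is a list of (anchor_text, from_related_site)
    tuples; a link counts as URL-based if its anchor text looks like a URL.
    All thresholds are invented for illustration, not taken from Google."""
    total = len(links)
    if total == 0:
        return 0
    kw_share  = sum(a.lower() == main_keyword.lower() for a, _ in links) / total
    unrelated = sum(not related for _, related in links) / total
    url_based = sum(a.lower().startswith(("http://", "https://", "www."))
                    for a, _ in links) / total
    score = 0
    score += kw_share <= 0.30   # 1. main keyword not over-used in anchor text
    score += unrelated <= 0.50  # 2. most links come from related sites
    score += url_based >= 0.10  # 3. a fair share of plain-URL anchors
    return score

# A profile stuffed with exact-match keyword anchors loses a point:
profile = [
    ("https://example.com", True),
    ("widget reviews", True),
    ("cheap widgets", False),
    ("cheap widgets", True),
]
print(link_profile_score(profile, "cheap widgets"))  # 2 of 3 factors pass
```

A profile with varied anchors drawn mostly from related sites would pass all three checks, which is the point: no single link matters as much as the overall shape of the pattern.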

 

Of course, a website has no control over truly organic links and cannot control where all of its links come from, so every site is likely to have some undesirable links. The new algorithm appears to recognise this by using a spectrum of acceptable linking patterns. Being close to the ethical end of this spectrum would, of course, be preferable, but whether ranking position correlates directly with a website’s position along that spectrum remains to be seen. I wonder if anyone can actually come up with that data?

It is clear, though, that every website and business that values its online reputation will be seeking to stay as close as possible to the ethical end of the range.

This implies that link-building campaigns will not go away; they will simply be refocused on building real relationships with other relevant sites (avoiding competitors, of course) in order to earn genuine recommendations, which is what Google set out to measure in the first place.

None of our clients have been adversely affected by the Penguin update, but if you have been, why not share your experience with us…

More to come on how to re-focus your link building strategy next time…

 

