You may already be familiar with Google’s patent that modifies the search results when they pick up an SEO technique (if you’re not, take a look at SEO by the Sea). The new patent is essentially designed to confuse people using spamming techniques by making those techniques difficult to report on. Rand Fishkin in his private blog seems to suggest that this is solely based on link spamming; however, having read the patent I’m not convinced this relates solely to links.
Basically, having already ranked a page, Google recognises the techniques that have been implemented to improve its rankings (on-page or off-page, legitimate or super spammy). When it next indexes the page, Google will then randomly change the way it ranks it. Reading between the lines, that says to me Google will decrease the rank by random amounts in order to gather data on the next actions the page owner takes to try to positively affect the results. As Google puts it:
“For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results.”
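To make that feedback loop concrete, here’s a purely illustrative Python sketch of the kind of rank perturbation the patent seems to describe. This is emphatically not Google’s actual code; the function name, the delay idea, and the numeric ranges are all my own assumptions. The point is simply that a rank change can be shown in a random intermediate state, sometimes even inverted, so the person making changes can’t tell whether their technique worked.

```python
import random

def perturbed_rank(old_rank, target_rank, rng=random):
    """Return a temporary rank to display while the real change is delayed.

    Hypothetical sketch: instead of moving straight from old_rank to
    target_rank, pick an intermediate value that may even move in the
    *opposite* direction, so a spammer cannot tell whether their change
    had a positive effect.
    """
    # Random factor in [-0.5, 1.5]: sometimes overshoot the target,
    # sometimes invert the direction of the change entirely.
    factor = rng.uniform(-0.5, 1.5)
    return old_rank + factor * (target_rank - old_rank)
```

So a page whose “true” new rank should be 20 might temporarily be shown anywhere from 5 to 25 when it previously ranked 10 – including below where it started.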
We know Google are constantly changing their ranking algorithms, and this is just part of their arsenal. This one does seem a lot smarter than anything we’ve seen before from Google, and it certainly seems to be nailing their colours to the mast in terms of their position on SEO. This is all about taking a psychological approach to preventing SEOs from spamming. I also think it makes reporting on SEO in the short term much more difficult, which means the way SEO is carried out in agencies (i.e. on a monthly recurring payment) is going to have to be reviewed by many: unless you’re dealing with a big website with lots of pages, you will have to stop for a prolonged period of time in order to be sure you are able to report on SEO accurately.
As discussed, this isn’t just about links – it’s also about the way you place keywords in the website. You might not see a negative impact; you might even see a positive one. Google will then monitor your reaction to those positive results, which will probably be doing more of the same throughout the website. This could see the whole site being dropped down the rankings for certain keywords in the long run, making it difficult to determine what has had the impact.
More interesting is the way in which Google states that if there is a suspicion of spamming (and I question here whether we are talking about manual or automatic intervention), Google will then apply large fluctuations to the rankings in the hope that these kinds of extreme changes will get the optimiser to take an action they believe will correct the results. Those changes will likely make it more obvious to Google what’s going on, and may even alert Google to the existence of sites selling links or serving links to other websites.
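You could imagine that suspicion-driven escalation as something like the following sketch. Again, this is entirely hypothetical: the patent doesn’t publish a formula, and the function name, the 0-to-1 suspicion score, and the 5x cap are my own invented stand-ins to show how a swing could be scaled by how spammy a document looks.

```python
import random

def fluctuation_magnitude(base_delta, suspicion_score, rng=random):
    """Scale a temporary rank swing by how suspicious a document looks.

    Hypothetical sketch: suspicion_score is assumed to lie in [0, 1].
    The more suspicious the document, the wilder the temporary swing,
    in the hope of provoking a tell-tale reaction from the optimiser.
    """
    # Exaggerate the swing for highly suspicious documents (up to 5x).
    scale = 1.0 + 4.0 * suspicion_score
    direction = rng.choice([-1, 1])  # swing up or down at random
    return direction * scale * base_delta
```

A barely suspicious page gets a small wobble either way; a page flagged as highly suspicious gets a swing up to five times the base size, which is exactly the kind of “unexpected result” the patent hopes will provoke a response.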
It’s all about Google gathering data and getting a better understanding of the SEO community’s actions – they’ll know the techniques we’re about to implement before we’ve even made the move!
Interestingly, Matt Cutts has given us an indication that there’s going to be a new Penguin update, and it sounds as though this is going to be the most severe update we’ve seen so far! As an SEO consultant working both as a freelance SEO consultant and a contracted SEO consultant, I am certainly awaiting the next Penguin update with excited trepidation, as I think there are going to be some really interesting results that emerge and make things quite a bit harder to gauge.
It’s certainly interesting times – again!