How have Search Engines’ Ranking Signals Evolved?

  1. 1996 – 1999   Rankings based on on-page keyword use and metadata
  2. 1999 – 2002   PageRank and on-page keywords
  3. 2002 – 2005   PageRank, on-page signals, domain name and anchor text of link portfolio
  4. 2005 – 2009   PageRank, on-page signals, domain name and anchor text, diversity of linking domains, domain authority and topic modelling

How will search engine ranking signals evolve in the future?

In 2010 we saw the inclusion of Facebook and Twitter signals as ranking factors. In the big hooha between Google and Bing recently, Bing admitted to using clickstream data as a ranking signal – in other words, making the rankings match the sites users actually find and stay on for certain keywords (and Google has been doing it more subtly for the past couple of years).

Although there are a number of directions the search engines could go, I feel we are at one of those crucial moments where the changes made now will set the direction of search for the next couple of years.

I believe there are going to be a few main areas of focus:

1. Bigger Brands

Since the Vince update in 2009, Google has been favouring brands. This is because there was a massive increase in junk sites: sites that weren’t useful, or whose rankings simply weren’t relevant.

How does Google determine if a website is a brand?

Google and Bing will each use different methods; however, by and large there are a number of resources both will check to see if a site is a brand site. These may include:

  • Does the website have a LinkedIn page – and do people say they work there?
  • Is the site linked back to from the Profile of a followed Twitter account?
  • Does the site have a Facebook company page?
  • Does the site have contact pages and contact information?
  • Does the site have an About us page/section?
  • Is the business registered anywhere? E.g. Companies House / Inland Revenue etc.
  • Do people search for the brand? (This will be backed up by checks on those searches, as well as making sure they follow organic patterns.)
  • Is the business carrying out marketing campaigns? That means clicks from emails, articles, blogs, mentions of the website in press releases, etc.

That may not be the end of it. There are other things that should be considered, such as how certain keyphrase areas are associated with brands, which implies some analysis of clickstream behaviour (see point 3).
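As a thought experiment, the checklist above could be rolled up into a single weighted brand score. The signal names and weights below are entirely made up for illustration; nothing here reflects how Google or Bing actually combine or weight these factors.

```python
# Hypothetical sketch: combining boolean brand signals into one score.
# Every signal name and weight here is an illustrative assumption.

BRAND_SIGNAL_WEIGHTS = {
    "has_linkedin_page": 2.0,
    "linked_from_followed_twitter": 1.5,
    "has_facebook_page": 1.0,
    "has_contact_info": 1.0,
    "has_about_page": 0.5,
    "registered_company": 2.0,
    "organic_brand_searches": 3.0,
    "active_marketing_campaigns": 1.5,
}

def brand_score(signals: dict) -> float:
    """Weighted sum of satisfied brand signals, normalised to 0..1."""
    total = sum(BRAND_SIGNAL_WEIGHTS.values())
    hit = sum(w for name, w in BRAND_SIGNAL_WEIGHTS.items() if signals.get(name))
    return hit / total

site = {"has_linkedin_page": True, "has_contact_info": True,
        "registered_company": True, "organic_brand_searches": True}
print(round(brand_score(site), 2))  # → 0.64
```

A real engine would of course learn such weights from data rather than hand-tune them, but the shape of the idea – many weak signals aggregated into one brand estimate – is the point.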

2. Named Entities

OK, so bear in mind what I said about brands, and now take a step to the side. Search engines are not only trying to understand when people are searching for a brand; they also need to know when a searcher is looking for a person, place, product or thing.

Remember that Google bought Metaweb. Metaweb essentially catalogues different words and associates them with other words, creating a massive web of associated terminology. This way the engine can easily recognise that when someone is searching for a product, e.g. Debt Management, they are also looking for information about Bankruptcy and IVAs, as well as specific providers within the sector. Another example: when someone searches for a town, let’s say Brighton, they may also be looking for information about the hotels in Brighton as well as the weather in Brighton.
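That web of associated terminology can be pictured as a simple bidirectional graph of terms. The toy sketch below uses the examples from the paragraph above; the real Freebase/Metaweb data model is far richer, with typed entities and relations rather than plain strings.

```python
# Toy sketch of a Metaweb-style association web: terms linked both
# ways, so a query for one entity can surface its associated topics.
# The entities here are just the examples mentioned in the text.

from collections import defaultdict

associations = defaultdict(set)

def associate(a: str, b: str) -> None:
    """Record a two-way association between terms."""
    associations[a].add(b)
    associations[b].add(a)

associate("Debt Management", "Bankruptcy")
associate("Debt Management", "IVA")
associate("Brighton", "hotels in Brighton")
associate("Brighton", "Brighton weather")

def related(query: str) -> list:
    """Terms associated with a query, in stable order."""
    return sorted(associations.get(query, set()))

print(related("Brighton"))  # → ['Brighton weather', 'hotels in Brighton']
```

Given such a graph, a query for “Brighton” could trigger hotel and weather results alongside the plain web listings.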

Only by understanding the associations of named entities can Google trigger elements of universal search such as Maps, Videos and Product Search. Obviously, from Google’s point of view, the better it gets at this, the more advertising revenue it can generate.

3. User Behaviour

One of the biggest growth areas is probably going to be a more explicit use of user behaviour as a ranking signal.

As discussed, we’ve recently seen that clickstream data is being used by the search engines to determine rankings.

Google has been playing around with this model for a while now with elements such as personalisation for account users, SideWiki, SearchWiki etc. It’s likely that this will be refined further in the next year or so (including Google’s most recent addition of the block-sites-from-SERPs extension for Chrome).

As user data becomes more prevalent, it’s likely to become harder to game the system with tactics such as link building – and I can’t imagine it will be long before software is created to mimic human on-site behaviour.
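To make the idea concrete, here is a minimal sketch of how clickstream signals might nudge rankings: a base relevance score adjusted by observed click-through rate and dwell time. The formula and weights are my own illustration, not any engine’s actual algorithm.

```python
# Illustrative sketch of clickstream-based re-ranking. The adjustment
# formula and its coefficients are assumptions for demonstration only.

def rerank(results: list) -> list:
    """Re-order results by base relevance nudged by user behaviour.

    Each result is a dict with base_score, ctr (0-1) and
    avg_dwell_seconds.
    """
    def behaviour_adjusted(r: dict) -> float:
        # Reward pages users click on and stay on; cap the dwell bonus
        # so one sticky page can't dominate everything.
        dwell_bonus = min(r["avg_dwell_seconds"] / 60.0, 1.0)
        return r["base_score"] * (1 + 0.5 * r["ctr"] + 0.3 * dwell_bonus)

    return sorted(results, key=behaviour_adjusted, reverse=True)

serp = [
    {"url": "a.example", "base_score": 0.9, "ctr": 0.05, "avg_dwell_seconds": 10},
    {"url": "b.example", "base_score": 0.8, "ctr": 0.40, "avg_dwell_seconds": 90},
]
print([r["url"] for r in rerank(serp)])  # → ['b.example', 'a.example']
```

In this sketch the page users actually click and stay on overtakes the one with the higher base score, which is exactly the effect clickstream data is meant to have.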

4. Manual Quality Checks

Bing freely admits it has a team of real people who manually intervene in the search results, and Google also has a team of (shadier) quality engineers. There seem to be a number of positions you get dropped to if you’re caught doing something that’s even a bit naughty. This will almost certainly grow as the internet develops and increases in size.
