How Have Search Engines’ Ranking Signals Evolved?

  1. 1996 – 1999: rankings based on on-page keyword use and metadata
  2. 1999 – 2002: PageRank and on-page keywords
  3. 2002 – 2005: PageRank, on-page keywords, domain name and anchor text of the link portfolio
  4. 2005 – 2009: PageRank, on-page keywords, domain name and anchor text, diversity of linking domains, domain authority and topic modelling

How will search engine ranking signals evolve in the future?

In 2010 we saw Facebook and Twitter signals emerge as ranking factors.

In the recent hoo-ha between Google and Bing, Bing admitted to using clickstream data as a ranking signal – in other words, adjusting the rankings to match the sites users actually find and stay on for specific keywords (and Google has been doing this more subtly for the past couple of years).
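To make that idea concrete, here is a minimal sketch of how a clickstream signal might be blended into a base ranking score. The field names, weights and dwell-time cap are my own illustrative assumptions – neither Bing nor Google has published how they actually use this data.

```python
# Minimal sketch: blending a clickstream signal into a ranking score.
# The weights and field names are illustrative assumptions only.

def clickstream_boost(clicks: int, impressions: int, avg_dwell_seconds: float) -> float:
    """Rough proxy for 'users find and stay on this site for this keyword'."""
    if impressions == 0:
        return 0.0
    ctr = clicks / impressions
    # Cap dwell time so a single long session can't dominate the signal.
    dwell = min(avg_dwell_seconds, 300) / 300
    return 0.6 * ctr + 0.4 * dwell

def rerank(results):
    """results: list of dicts with 'url', 'base_score', 'clicks', 'impressions', 'dwell'."""
    return sorted(
        results,
        key=lambda r: r["base_score"] + clickstream_boost(r["clicks"], r["impressions"], r["dwell"]),
        reverse=True,
    )

print(rerank([
    {"url": "a.com", "base_score": 0.70, "clicks": 40, "impressions": 1000, "dwell": 15},
    {"url": "b.com", "base_score": 0.65, "clicks": 120, "impressions": 1000, "dwell": 90},
]))
```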

Although there are several directions the search engines can go, I feel we are at one of those crucial moments where the changes that are made now will set the direction of the search engines for the next couple of years. I believe there are going to be a few main areas of focus:

1. Bigger Brands

Since the Vince update in 2009, Google has been favouring brands, largely because of a massive increase in junk sites: sites that weren’t useful, and rankings that simply weren’t relevant.

How does Google determine if a website is a brand?

Google and Bing will use different methods; however, by and large, there are several resources out there that both will check to see whether a site is a brand site (see the sketch after this list). These may include:

  • Does the website have a LinkedIn page – and do people say they work there?
  • Is the site linked back to from the Profile of a followed Twitter account?
  • Does the site have a Facebook company page?
  • Does the site have contact pages and contact information?
  • Does the site have an About us page/section?
  • Is the business registered anywhere, e.g. Companies House / Inland Revenue, etc.?
  • Do people search for the brand? (This will be backed up by checking those searches and making sure they are organic patterns).
  • Is the business carrying out marketing campaigns? That means clicks from emails, articles, blogs, mentions of websites in press releases etc.
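As a way of picturing how checks like these might be combined, here is a toy scoring function. The individual signals and their weights are my own assumptions for illustration, not a formula either search engine has documented.

```python
# Toy brand-signal score built from the checklist above.
# Signals and weights are assumptions for illustration only.

BRAND_SIGNALS = {
    "linkedin_page": 2.0,          # LinkedIn page with employees listed
    "twitter_profile_link": 1.0,   # linked from a followed Twitter profile
    "facebook_company_page": 1.0,
    "contact_page": 1.5,
    "about_page": 1.0,
    "company_registration": 2.0,   # e.g. a Companies House record
    "organic_brand_searches": 3.0, # people search for the brand name
    "marketing_mentions": 1.5,     # emails, articles, press releases
}

def brand_score(site: dict) -> float:
    """site maps each signal name to True/False."""
    return sum(weight for signal, weight in BRAND_SIGNALS.items() if site.get(signal))

example_site = {"linkedin_page": True, "contact_page": True,
                "about_page": True, "organic_brand_searches": True}
print(brand_score(example_site))  # 7.5 under these made-up weights
```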

[Figure: Correlation between Word Count and Ranking in Google]

That may not be the end of it; other things should be considered, such as how key phrase areas are associated with brands, which implies some analysis of clickstream behaviour.


2. Named Entities

OK, so bear in mind what I said about brands, then take a step to the side a bit. Search engines are trying to understand when people are searching for a brand; however, they also need to know when a searcher is looking for a person, place, product or thing.

Remember that Google bought Metaweb. Metaweb catalogues different words and associates them with other words, creating a massive web of associated terminology.

This way, they can easily recognise that when someone is searching for a product, e.g. debt management, they are probably also looking for information about bankruptcy, IVAs and specific providers within the sector.

Another example: when someone searches for a town, let’s say Brighton, they may also be looking for information about the hotels in Brighton and the weather in Brighton.

Only by understanding the associations of named entities can Google trigger universal search elements such as Maps, Videos and Product searches. Obviously, from Google’s point of view, the better it gets at this, the more advertising revenue it can generate.
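Here is a small sketch of the kind of entity association lookup a Metaweb-style catalogue makes possible, including how an entity type might decide which universal search elements to trigger. The catalogue entries and the type-to-vertical mapping are invented purely for illustration.

```python
# Sketch of a Metaweb-style entity catalogue: each entity carries a type and
# a set of associated terms, and the type decides which universal search
# elements to trigger. All data here is invented for illustration.

ENTITIES = {
    "debt management": {"type": "product",
                        "related": ["bankruptcy", "IVAs", "debt management providers"]},
    "brighton":        {"type": "place",
                        "related": ["hotels in Brighton", "Brighton weather"]},
}

VERTICALS_BY_TYPE = {
    "place": ["maps", "weather", "local listings"],
    "product": ["shopping", "reviews"],
    "person": ["news", "images"],
}

def expand_query(query: str):
    entity = ENTITIES.get(query.lower())
    if entity is None:
        return {"related": [], "verticals": []}  # fall back to plain web results
    return {"related": entity["related"],
            "verticals": VERTICALS_BY_TYPE.get(entity["type"], [])}

print(expand_query("Brighton"))
# {'related': ['hotels in Brighton', 'Brighton weather'],
#  'verticals': ['maps', 'weather', 'local listings']}
```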

3. User Behaviour

One of the most significant growth areas is probably a more explicit use of user behaviour as ranking signals. As discussed, we’ve seen recently that clickstream data is being used by search engines to determine rankings.

Google has been playing around with this model for a while now with personalisation for account users – Sidewiki, SearchWiki, etc. This will be refined further in the next year (including Google’s most recent addition, the Chrome extension that lets users block sites from SERPs).

As user data becomes more prevalent, it will likely become harder to game the system with tactics such as link building – and it is hard to imagine software being created any time soon that convincingly mimics human onsite behaviour.
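As a rough illustration of why onsite behaviour is hard to fake at scale, here is a toy check for bot-like sessions. The metrics and thresholds are arbitrary assumptions of mine, not anything a search engine has published.

```python
# Toy filter for bot-like sessions. Thresholds are arbitrary assumptions,
# purely to illustrate why genuine onsite behaviour is hard to imitate.

from dataclasses import dataclass

@dataclass
class Session:
    dwell_seconds: float
    pages_viewed: int
    scroll_depth: float   # 0.0 to 1.0

def looks_human(s: Session) -> bool:
    """Very crude heuristic: real visitors tend to dwell, scroll and browse a bit."""
    if s.dwell_seconds < 2:          # bounced almost instantly
        return False
    if s.pages_viewed == 1 and s.scroll_depth < 0.1:
        return False                 # landed and never moved
    return True

sessions = [Session(1.2, 1, 0.0), Session(45.0, 3, 0.8)]
print([looks_human(s) for s in sessions])  # [False, True]
```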

4. Manual Quality Checks

Bing freely admits it has a team of real people who manually intervene in the search engine results. Google is a little shadier about it, but it too has quality engineers who do the same.

There seem to be several ranking positions that you get dropped to if you’re caught doing something even a bit naughty. This will almost certainly be something that grows as the internet develops and increases in size.


If you want to learn more about SEO, give us a call – we would love to talk to you. Email [email protected] or phone 016170620012.

Or fill in the form on the contact page.
