Google is now relying on behavioral data that it gets from searchers, Chrome users, Android, etc. as the primary way that it ranks pages, according to Moz founder Rand Fishkin, who spoke at the recent MozCon event.
Google used to have to predict what searchers were going to do, using a reasonable surfer model as the premise for its search algorithm. No more, says Rand Fishkin: Google doesn't have to predict anymore because it already knows. Interesting but very scary stuff.
I want to clear from my mind all the things that I knew about link building up until this point and instead take a look at companies, brands and websites and ask: what did they do right, what did we do wrong in the past, and what is Google going to do about judging links?
You might remember that last year, when Google announced RankBrain, they said it was the third most important ranking factor. You might also recall Danny Sullivan asking what the first two are. A few of us were on a phone call with one of Google's engineers and brought this up, and his response was: what are you talking about, everyone knows the first two are content and links. Still true.
In the past, link evaluation algorithms have relied on factors we're all familiar with: PageRank, source diversity, anchor text, trust distance, domain authority, location on the page, spam outlink analysis, yadda yadda yadda. All these little individual signals shape how Google judges a particular link and all the links that point to a website.
But this is not where they're going. Google is moving away from the reasonable surfer model. Remember what PageRank was supposed to do, even in 1998: predict which links on a page were important and assign values to them based on the probability that someone would click on them.
Of course, Google was very naive in 1998, so all it could do was assign the same weight to every link on a page, and weight the page itself based on all the links pointing to it.
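That naive 1998 model can be sketched in a few lines. This is an illustrative toy implementation of the original PageRank idea, not Google's actual code: every link on a page carries an equal share of that page's score, and each page's score is the sum of the shares passed in by the pages linking to it. The function and variable names, the example graph, and the iteration count are all made up for demonstration.

```python
# Toy sketch of classic (1998-style) PageRank: uniform weight per outlink.
DAMPING = 0.85      # standard damping factor from the original formulation
ITERATIONS = 50     # enough for this tiny graph to converge

def pagerank(links):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}   # start uniform
    for _ in range(ITERATIONS):
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = rank[page] / len(targets)  # every link weighted equally
                for target in targets:
                    new_rank[target] += DAMPING * share
        rank = new_rank
    return rank

# Hypothetical three-page web: "c" is linked from both "a" and "b".
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
```

Because "c" receives a vote from both "a" and "b" while "b" receives only half of "a"'s vote, "c" ends up with the higher score, which is exactly the equal-weight logic described above.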
But that is not today. Today, thanks to Chrome, Android, Google WiFi and Google Fiber, Google knows everything. Google's sample of what happens on the web is probably in the 80 to 90 percent range. It's insane and it's crazy. Because of that, they can see: Google knows where people have been, where they are, and where they go next. You don't need a reasonable surfer model anymore. You don't need to predict because you know.
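The contrast Fishkin is drawing can be made concrete. In the sketch below, the first function splits a page's vote equally across its links (the naive 1998 model), while the second splits it in proportion to observed click share (what behavioral data makes possible). This is purely illustrative; the click numbers are invented, and nothing here reflects Google's actual implementation.

```python
# Hypothetical contrast: predicted-equal link weights vs. observed-click weights.

def link_weights_uniform(outlinks):
    """1998-style assumption: every link on the page is equally likely."""
    return {link: 1.0 / len(outlinks) for link in outlinks}

def link_weights_behavioral(click_counts):
    """Behavioral model: weight each link by its observed share of clicks."""
    total = sum(click_counts.values())
    return {link: clicks / total for link, clicks in click_counts.items()}

outlinks = ["nav", "footer", "article"]
clicks = {"nav": 40, "footer": 5, "article": 955}  # invented click data

uniform = link_weights_uniform(outlinks)       # every link gets ~0.333
behavioral = link_weights_behavioral(clicks)   # the article link dominates
```

With real behavioral data, the heavily-clicked article link would carry almost all of the page's voting power, while the footer link would carry almost none, instead of each getting an identical one-third.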
Google's goal is pretty clear: searcher satisfaction. Google knows that if it satisfies searchers well, those searchers will return again and again, the total number of searches will go up, and the number of searches per searcher will keep going up. That's what we've been seeing: even as desktop search growth has leveled off, mobile keeps growing, and searches per searcher keep climbing.
Google's core search team asks the same question every time: are searchers satisfied with the results? The way they know is by finding out whether searchers are getting the answers they're seeking. And how does Google get at that? Behavioral data.