
Is Content Finally Taking Its Revenge On Link Building?



When it comes to search engine optimization, the standard view has basically always been that links reign supreme, while content has been treated as less important and given lower priority. With RankBrain and the integration of contextual search results, however, content has become one of the most important ranking factors in Google. That shift is what this article is about.

Google’s Semantic Understanding Has Increased Dramatically

In many ways, Google has depended on links to scale its search engine. Lately, the focus has shifted towards text and content and how they can relieve the link graph of the job of delivering relevant, accurate results for users’ search phrases and questions. With the introduction of Hummingbird, the improvements to Latent Semantic Indexing and, more recently, RankBrain, Google has taken huge steps towards turning its search engine into a machine that can, in theory, differentiate content, interpret ambiguous search phrases and return specific results without having to rely on links or exact keywords.

Content – The Extracurricular Nerd

Within the SEO community, links and content have always been considered fundamental to a site’s ability to rank on Google. This has also turned content into the scapegoat of Internet marketing. Many content evangelists found themselves attacked by those of a different persuasion. The veterans of SEO have always known that links are the single most important factor – even while the proponents of content declared content king, Darth Vader or a zombie Ayn Rand, and left it at that.

The more obscure SEO guys – or should I say the small part of the SEO community that understood that links trump content in Google’s priorities – focused on producing as much content as cheaply as possible, more often than not with the help of Asian freelancers, Google/Bing Translate and spun (rewritten) Wikipedia content. The concept was simple: volume and scalability equalled better.

Of course, not everyone involved in search engine optimization used these methods. And for the past few years – or at least since the so-called Panda update – content has been treated with far more respect.

The problem has never been an unwillingness to produce text and content; it’s that content, in terms of web traffic, simply hasn’t had much of an effect. If the purpose is to increase traffic, why choose the far less cost-effective method?

Those involved in search engine optimization are also often cynical individuals. Most have, at least once or twice, been fooled by the “content is king” crowd, failed and been forced to find other ways. They know that carefully evaluating a site’s inbound links can determine its credibility. Doing the same thing mathematically with text and content is a whole different game.

In the olden days – by which I mean the end of the 90s and the early days of Google – there was plenty of content but no way of evaluating its quality. It didn’t matter how many apples the content crowd put on the teacher’s desk. The early search engines were simply engines, or machines. They had no way of evaluating content other than mathematics, which isn’t much help when the difference in value between two texts is the difference between the Iliad and a dime novel.

In 2003, Google bought the little-known Santa Monica-based company Applied Semantics, which was a stroke of genius. The company made software applications for online marketing. This was a huge step for Google towards making its search engine understand contextual information and semantics, giving the content crowd a realistic chance against the link people.


Definition Of Semantic Search

So, what is semantic search? Basically, semantic search is about understanding your audience: their intentions, their needs and the kinds of questions they might ask. You then adjust your content towards answering those questions and meeting those needs by offering contextual content which might not match the specific search terms but still covers the topic.

Search words, or keywords, are of less importance here. Following the Hummingbird update (which improved Google’s semantic ability) in 2013, Google has rolled out several new components for its search algorithm, giving it the ability to evaluate things such as:

  • Context
  • Relevancy
  • Place
  • Synonyms
  • Related phrases
  • Word classes and variations
  • Homonyms (words which sound alike and are spelled alike but mean different things)
  • Gender
  • Singular and plural
  • Current trends

… and several other things that matter, which reduces the impact of a specific keyword not appearing in the text.
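To make the idea concrete, here is a toy sketch of how a search system might expand a query with synonyms and simple variations so that a page can match a concept without containing the exact keyword. The synonym table and example query are invented for illustration; this is obviously not how Google’s algorithm works internally.

```python
# Toy sketch of concept-level matching via query expansion.
# The synonym table is invented for illustration; real semantic search
# learns these relationships from data rather than a hand-written table.
SYNONYMS = {
    "newsletter": {"newsletter", "newsletters", "bulletin", "flyer"},
    "cheap": {"cheap", "affordable", "inexpensive", "budget"},
}

def expand(query: str) -> set:
    """Return the query terms plus any known synonyms and variations."""
    terms = set(query.lower().split())
    for concept, variants in SYNONYMS.items():
        if concept in terms or terms & variants:
            terms |= variants
    return terms

def matches(query: str, document: str) -> bool:
    """True if any expanded query term appears in the document."""
    return bool(expand(query) & set(document.lower().split()))

# "affordable" and "bulletin" never appear in the query, yet the page matches.
print(matches("cheap newsletter", "an affordable weekly bulletin for readers"))  # True
```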

The search engine has gone from focusing on keywords to dealing with concepts – in certain cases you won’t even have to include a keyword in your search phrase; it is enough to describe the concept you are looking for in order to have your question answered.

How did they achieve this? Hummingbird introduced “true” LSI, Latent Semantic Indexing, a method for processing and analyzing large amounts of data in order to find patterns.

In this case, Google increases its vocabulary by processing text: the search engine analyzes the text and its structure – which words and terms are used in relation to each other, abbreviations, and so on. That way the engine improves its ability to understand semantic context.

Before Hummingbird, LSI was basic and unreliable. Following the update, it became much quicker and better at catching context and semantics.
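As a rough illustration of what LSI-style analysis does, the sketch below uses scikit-learn to reduce a tiny TF-IDF term-document matrix to two latent “concepts” and then compares documents in that space. The mini-corpus is invented and this is a textbook toy, not a description of Google’s actual pipeline.

```python
# Minimal LSI-style sketch: TF-IDF + truncated SVD (a.k.a. LSA).
# Invented mini-corpus; purely illustrative, not Google's implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "our weekly newsletter covers email marketing tips",
    "the company bulletin covers marketing tips for email",
    "used volvo estate car for sale",
    "buy a second hand volvo car",
]

# 1. Weight each word in each document by TF-IDF.
tfidf = TfidfVectorizer().fit_transform(documents)

# 2. Collapse the term-document matrix into two latent "concepts".
#    Words that tend to occur together load on the same concept.
concepts = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# 3. Documents about the same topic land close together in the latent space,
#    which is how terms sharing context (e.g. "newsletter" and "bulletin")
#    become associated with the same concept.
print(cosine_similarity(concepts).round(2))
```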

2013 was a while ago, and you, the reader, might already have noticed improved reliability in Google’s ability to match difficult phrases with search results. How about the classic search “fly sheet or newsletter”?


Instead of wading through information entered manually by engineers, the search engine picks up searches, semantically similar words, terms and related data on its own by scanning the texts and documents available on the Internet.

This is important because before Hummingbird, one could only guess the topic of a text or a site by interpreting which keywords were used within the text itself. Keywords are still important, but less and less so. Google has long emphasized that it prefers “conversational search”, and the progress of semantics has helped the company reach this goal in a very short time. So the time has come for us SEO pros to start focusing on answering questions instead of just providing keywords. How do I deal with content marketing? What is search engine optimization? And so on.

Co-Occurrence And Co-Citation

Two terms that relate to Google’s semantic understanding, and which aren’t discussed very often, are co-occurrence and co-citation. What these two terms actually mean is covered in more detail below.

You could say that both work towards creating a kind of natural, text-based linking between two or more independent sources. When one source refers to another, a relationship is created between them, and they become linked through citation – something Google understands and takes into consideration.

That makes it an important and useful practice to mention, and link to, authoritative sites, personalities and so on that deal with the same topic as you do. Through your association with these sources, co-citation can make your site more visible in the search results.


What Is Co-Occurrence?

Co-occurrence happens when two (or more) searches involve words that are so similar that people consider them synonymous, especially if they are used close together or in a wider context, such as within a text.

For example, you might search for a phrase such as “Volvo Craigslist” and then try “used Volvo for sale”. These two search terms are so similar in language and purpose/intent that Google can safely assume that people want a used Volvo car when they are looking for a Volvo car on Craigslist.

Co-occurrence depends on how people behave and on how likely it is that a large number of users will perform near-identical searches, which Google then uses to validate that the algorithm is working. This strengthens the bond between search terms, even when they are used in different contexts and in combination with other terms.
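As a back-of-the-envelope illustration, the snippet below counts how often pairs of words appear near each other in a small set of texts; pairs with high counts are candidates for being treated as related. The mini-corpus and window size are made up, and Google’s real co-occurrence signals naturally draw on query logs and far more data.

```python
# Toy co-occurrence counter: how often do two words appear within a few
# words of each other? Invented mini-corpus; illustrative only.
from collections import Counter

documents = [
    "used volvo for sale on craigslist",
    "craigslist ad for a used volvo estate",
    "volvo dealership lists used cars for sale",
]

WINDOW = 4  # two words count as co-occurring if at most 4 positions apart
pair_counts = Counter()

for doc in documents:
    words = doc.split()
    for i, word in enumerate(words):
        for other in words[i + 1 : i + 1 + WINDOW]:
            pair_counts[tuple(sorted((word, other)))] += 1

# Frequent pairs such as ("used", "volvo") rise to the top (a real system
# would also filter stop words like "for"), which is the intuition behind
# treating such terms as related.
for pair, count in pair_counts.most_common(5):
    print(pair, count)
```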

What Is Co-Citation?

Co-citation is what’s usually called link building without links. There are basically two requirements for two independent sites to end up in a so-called co-citation relationship:

  • One site mentions the other but does not link to it
  • The sites deal with a topic that is similar or applicable to both. They don’t need to use the same keywords – the keywords don’t even have to be similar. The same applies to co-occurrence, discussed above.

The purpose of co-citation is to send a signal to Google’s search algorithm that the mentioned site is relevant and credible – an endorsement in the form of text rather than a link.
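To illustrate the mechanics, here is a small sketch that counts how often two sites are mentioned together on the same third-party page – the basic co-citation signal described above. The page data and site names are hypothetical.

```python
# Toy co-citation counter: two sites become associated when third-party
# pages mention both of them, even without linking. Hypothetical data.
from collections import Counter
from itertools import combinations

# Each set lists the sites *mentioned* (not necessarily linked) on one page.
pages = [
    {"volvo-blog.example", "used-car-guide.example"},
    {"volvo-blog.example", "used-car-guide.example", "craigslist-tips.example"},
    {"used-car-guide.example", "craigslist-tips.example"},
]

cocitation = Counter()
for mentioned in pages:
    for a, b in combinations(sorted(mentioned), 2):
        cocitation[(a, b)] += 1

# Sites that are often mentioned together accumulate a high co-citation
# count, which a search engine could read as topical association.
for pair, count in cocitation.most_common():
    print(pair, count)
```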

Summary

The fact that content hasn’t been prioritized is actually somewhat of a problem for those of us working in search engine optimization. After all, content should be what matters, not the number of links your site has.

Over the years we’ve seen it time and again: e-retailers literally working 70-hour weeks trying to produce good content WITHOUT it being worth the effort. There simply haven’t been any mechanisms in place to reward good content. But with the steady evolution towards giving search engines a human-like understanding, things have seriously begun to change.

We’re slowly moving towards what could be called a “Web v3”, and semantic search is a big part of that process. All signs point towards a future where the Internet is built on content and natural, well-written text, and where you won’t have to rely on specific keywords, precise anchor text and links in order to be visible for search queries.


Right now, it’s possible to stay afloat without these linking strategies, but link building and traditional SEO are still important and need to be part of every serious Internet venture. What will most likely happen is that even small sites with – and I can’t stress this enough – the right content will be able to survive.

Thanks to RankBrain, Google now has a much better understanding of intent and context, which will help relevant sites become visible in higher ranking positions. The future, although we aren’t there yet, seems to be shining brightly on content marketing and quality content.





