A Comprehensive History of Google Updates (and a Look to the Future)



Published by Jayson DeMers

In the world of search engine optimization (SEO), few topics have generated as much collective attention as the emergence and evolution of Google algorithm updates. These changes to Google’s search ranking algorithm, varying in size and scope, have the capacity to fundamentally alter the way the algorithm works—and they have, periodically, over the years.

Whenever a new update is released, SEO professionals go crazy, excited to dig deep and learn what changed. And between updates, we tend to wait and speculate about what changes could be coming down the pipeline in the future.

If you’re new to the SEO world, this can all seem very intimidating. Google has been releasing updates for the better part of two decades, which is a lot to catch up on—and there are no signs of its momentum slowing down in the near future. Google is a company committed to ongoing change and development, and its algorithm is a reflection of that; as search marketers, we need to be willing to change and develop alongside it.

To help educate newcomers to the SEO world, provide a refresher for those of us in the thick of it, and lay a foundation with which we can predict the future of Google developments, I’ve put together this comprehensive history of Google updates for your reading or perusing pleasure.

Table of Contents

+ Basics About Google Updates
+ The Early Era
+ Panda and Penguin
+ Smaller Updates
+ The Modern Era

Basics About Google Updates

First, I want to cover some basics about Google updates. Because of the attention they get from SEO professionals, webmasters, marketers, and entrepreneurs, a few misconceptions have developed over time.

How updates work

You probably understand the idea behind a Google update well from a conceptual perspective. You’re definitely familiar with Google search in general; its primary function is to give users a list of the best, most relevant results for their search queries—and, unless you’re a hardcore Bing user, you’d probably agree it’s doing a pretty good job.

But it wasn’t always as impressive a system as it is today. Search results are generally calculated based on two broad categories of factors: relevance and authority.

The relevance of a result is determined by evaluating how well a site’s content matches a user’s intention; back in the day, this relied on a strict one-to-one keyword matching system that hunted for web pages that used a specific keyword term more than its contemporaries. The authority of a site is determined by PageRank, a system that looks to sites’ backlink profiles to determine how they relate to other websites and authorities.

These two systems are updated frequently, as Google discovers new, more sophisticated, and less exploitable ways to evaluate relevance and authority. It also finds new ways of presenting information in its search engine results pages (SERPs), and adds new features to make searchers’ lives easier.

When Google decides to create an update, depending on the size, it may be pushed directly to the main algorithm or be published as a kind of test branch, to be evaluated for effectiveness and functionality.

Either way, the update may be improved upon with subsequent iterations over time. Sometimes, Google announces these changes in advance, letting webmasters know what they can expect from the update, but most of the time, they roll out silently, and we only know of their existence because of the changes we see and measure in SERPs.

Updates matter because they affect how search works in general, including how your site is displayed and how users interact with it.

How updates affect SEO

The common perception is that updates are bad news for SEO. They’re seen to be bad for the SEO business, throwing a wrench in optimizers’ best-laid plans to earn higher ranks for their respective sites and causing mass panic when they come out and crush everyone’s hard-earned rankings.

(Image Source: SearchEngineLand)

There’s some truth to this; people do tend to panic when a new Google update causes a significant changeup in how rankings are displayed. However, updates aren’t simply a way for Google to step in and harsh everyone’s SEO buzz; they’re complicated pieces of algorithmic machinery designed to do a better job of providing quality information to Google’s users.

Accordingly, they do have a massive effect on SEO, but it isn’t all bad.

  • Ranking volatility. First, there’s almost always some ranking volatility when a new update is released, but for some reason, this has become associated with ranking drops and penalties. Yes, “penalties” can emerge from Google updates as Google decreases the authority of sites that partake in certain actions it deems to be low-quality (like illegal activities), but if sites only got penalized in ranking, what would happen to the SERPs? The reality is, some sites go down in ranking and some go up as they’re rewarded for partaking in higher quality strategies and offers. Overall, this volatility tends to be modest; it’s not as severe as most people make it out to be.
  • Influence on best practices. Google updates also help search optimizers reevaluate and redefine their best practices for online visibility success. When Google pushes a new update, it’s usually to improve some element of the online user experience, such as when it improves its ability to evaluate “good” content. When this happens, search marketers can learn from the update and improve their own provision of content to users. It’s true that some optimizers are perpetually ahead of the curve, and not all algorithm updates directly affect webmasters, but as a general rule, updates are a means of progression in the industry.
  • New features to play with. Some Google updates don’t interfere with evaluations of authority or relevance; instead, they add entirely new functions to Google Search. For example, the Google Knowledge Graph emerged as an entirely new way to provide users with information on their given topics. Now, we’ve all gotten used to the idea of getting instant information on whatever questions we ask or whatever topics we choose. These new additions also represent new opportunities to gain search visibility; for example, it’s possible for webmasters to work on getting their brands featured in more Knowledge Graph entries as opposed to purely organic search results.
  • Reinforcing old ideas. Oftentimes, Google updates won’t release anything particularly new, but instead exist as a reinforcement of an old idea that was pushed out previously. For example, Google will occasionally roll out new features that evaluate the quality of content on the web; this doesn’t change the fact that content quality is a massive factor for determining authority, but it refines how these figures are calculated. Accordingly, many Google updates don’t provide new information to search optimizers, but instead, give them greater positive reinforcement that the strategies they’ve adopted are on the right track.

Why updates are often ambiguous

One of the more frustrating elements of Google updates is their stunning level of ambiguity, and it manifests in a number of different forms:

  • Announcement ambiguity. Google will occasionally announce its updates in advance. For example, it released information on its so-called “Mobilegeddon” update well in advance of the actual rollout to give webmasters time to update their sites for mobile devices. However, it’s more common that Google rolls out its updates in silence, committing the change to its algorithm without telling anyone about it.
  • Effects on search ranks. Even when search optimizers notice the volatility in ranking and start to investigate, Google is notoriously tight-lipped about the true nature and extent of the update. When the company deviates from the norm and announces updates in advance, it usually describes the nature of the update generally, such as stating that the update is meant to improve the evaluation of content quality. It generally does not go into specific details about the coding or the real functionality of the update, even when asked by professionals seeking further information.
  • Beginnings and ends. Updates don’t always roll out as quickly as a light switch being flipped. Instead, they tend to occur gradually; for example, some updates “roll out” over the course of a weekend before their full effects on search rankings are seen. In other cases, Google may push one core update, then add onto it with new iterations, modifications, and versions later on.

If Google releases these updates to improve the web, why does the company intentionally withhold details like these from webmasters? The answer is pretty simple. The biggest reason for Google releasing updates in the first place is to improve overall user experience.

Imagine Google released the exact details of how it ranks sites; webmasters would strive to exploit this information for the sole purpose of gaining rank, compromising that user experience. Cloaking this information in ambiguity is a defensive measure to prevent that kind of manipulation from taking place.

Patterns and commonalities

Because Google updates are so ambiguous, one of the best tools we have as search optimizers is our ability to detect patterns and commonalities in these updates. Not only does this give us a better understanding of Google’s motivation and a clearer knowledge of the full scope of an update’s effects, but it also allows us to make more meaningful predictions about what updates may be coming in the future.

  • Annual patterns. Generally, Google will release several updates in the span of a year, but there’s typically been at least one “big” update per year; this wasn’t the case in 2015, and before 2011, most updates were smaller, monthly changes, so it’s difficult to nail this pattern down definitively. Bigger updates, like Panda and Penguin, were the subject of annual revisits for a time, with 2.0 versions coming out almost exactly one year later, but Google has recently changed this format of release to something more gradual. Oftentimes, search marketers will predict the timing for Google’s upcoming releases for a given year based on previous information.
  • Batch updates. Google also has a tendency to “batch” updates together, especially when they’re small. If it’s making a major change to its core algorithm, this isn’t possible, but if it’s making a number of smaller tweaks, it typically releases them all at once as a pack. These rarely come with a warning or overall description.
  • Testing specific improvements. Specific improvements, like changes to the layout of a SERP, tend to be released as a test before rolling out to the broader community. Users of a specific demographic, or in a specific area, might report seeing the change before it’s committed on a full national scale. Then, some months later, Google will likely commit the change with various improvements as the finalized version.
  • Iterations and reiterations. Most of the minor updates speak for themselves, but for bigger updates, Google needs to use building blocks to complete its construction. Giant, game-changing updates like Panda and Penguin are the subject of revisions, adjustments, data refreshes, and even combinations, such as when Panda was incorporated into Google’s core ranking algorithm.

Naming conventions

I’ve already made reference to a handful of different named Google updates, so I wanted to take a minute to talk about these update naming conventions. Some updates are formally named by Google; for example, the major algorithms Panda and Penguin were given formal, official names so they could be easily understood and discussed by members of the community. However, since Google is quiet about most of its rollouts, only a handful of updates get official names.

Instead, it’s usually the collective power of a community that lands on a name. In the early days of Google updates, the WebmasterWorld community took charge of naming updates that were otherwise rolled out informally and silently. Sometimes, these names were based on basic descriptors, such as the name of the city where the update was announced, and other times, they took on human names, much like hurricanes.

Today, most update names emerge from the SEO community as leaders attempt to either describe what’s going on (such as with the suggestively titled Mobilegeddon update) or adhere to Google’s habit of arbitrarily naming updates after animals that start with “P” (such as with Pigeon, Panda, and Penguin). Sequential updates are typically kept in numerical order, as you might imagine.

As we enter the main section of this article, where I detail each of Google’s significant updates, I want to warn you that these updates are grouped contextually. For the most part, they’ll be kept in chronological order, but there may be deviations based on each update’s respective category.

The Early Era

Now it’s time to dig into the actual updates that have shaped Google into the powerhouse universal search algorithm it is today.

Pre-2003 updates

Even though Google officially launched in 1998, it wasn’t until 2003 that it started making significant updates to its search process (or at least, held enough attention in the online marketing community for people to care about them). Before 2003, there were a handful of changes, both to the ranking process and to the visual layout of the search engine, but things really kicked off with the Boston update in February of 2003.

  • Boston and the Google dance. The Boston update is the first named Google update, so called because it was announced at SES Boston. It’s unclear exactly what this update changed, but it did make some significant tweaks to Google’s ranking algorithm in addition to refreshing its index data. More importantly, Boston set a tone for the series of updates to come from Google, known as the “Google dance.” These Google dance updates were small yet significant monthly updates that changed any number of things in Google’s core algorithm, from how authority is calculated to major index refreshes.
  • Cassandra. Cassandra was one of the early monthly updates, rolling out in April of 2003. Its main purpose was to start fighting back against link spammers, who were exploiting Google’s system of ranking sites based on the number of backlinks pointing back to their domain. It also penalized sites with hidden text or hidden links, which were common occurrences back in 2003.
  • Dominic. Dominic is still a bit of a mystery. When it was released in May of 2003 as the latest addition to the Google dance, search optimizers seemed to notice its prominent effects immediately. Though the full extent of the changes was never clear, backlink counting and reporting seemed to undergo a massive overhaul.
  • Esmeralda. Esmeralda was one of the last of the Google dance updates, coming out in June of 2003. It was less significant than Cassandra or Dominic, but it seemed to represent some major infrastructural changes at Google, as monthly updates were replaced with a more continuous flow.
  • Fritz. As Matt Cutts explains it, Fritz is the official last of the Google dance updates. Instead, Google began to strive for a more incremental, gradual updating approach.
  • The supplemental index. In September of 2003, Google introduced a “supplemental” index, a secondary index that ran alongside the main one, taking on a portion of the web so the main index could operate more efficiently.

Florida and tidying up

The 2003 Google dance era was the first major block of updates, but starting with the Florida update later that year, updates began to take on new roles and new significances for the online marketing community.

  • The Florida update. The Florida update was one of the first updates to really make an impact on the online marketing community. Even though updates up to this point had addressed common spam and low-quality problems, such as bad backlinks and keyword stuffing, Florida was the first major overhaul that put an end to these tactics for good. Thousands of websites plummeted in ranks for using these black-hat techniques, and webmasters were furious. Any investments in old-school SEO were now worthless, and suddenly people were beginning to fear and respect Google updates.
  • Austin. Austin was designed to be the clean-up hitter for Florida, coming out a short while later in January of 2004. It targeted spammy meta description practices and hidden on-page text, essentially mopping up the few things Florida missed with its clean sweep.
  • Brandy. Brandy was the name given to a cluster of updates that rolled out shortly thereafter in February of 2004. Rather than focusing on cleaning up spam, Brandy was mostly focused on improving the Google algorithm inherently. It added new semantic capabilities—including synonym detection and analysis—and started a transformation of Google’s old keyword practices, not to mention greatly expanding the primary index.
  • Allegra. It was a year later before another algorithm update rolled out (though there were some changes at Google in the meantime, including the addition of Nofollow links, which I’ll touch on shortly). In February of 2005, Allegra emerged to penalize new types of suspicious links.
  • Bourbon. The strangely named Bourbon update came out in May of 2005 to improve Google’s ability to evaluate content quality. There were a handful of changes here, but none that webmasters were able to specifically identify.
  • Gilligan. I questioned whether to include Gilligan here. In September of 2005, webmasters saw a spike in ranking volatility, which is usually an indication of an update emerging, but Google insisted that no update had occurred.
  • Jagger. Jagger was an iterative update launched between October and November of 2005. It targeted practitioners of link schemes like reciprocal links and paid backlink programs.

Nofollow and XML sitemaps

Aside from Google’s content evaluation and general improvement updates, this era also saw the development of several new features that search optimizers could utilize for greater content visibility.

  • Nofollow links. This wasn’t technically a Google update, but it was one of the more significant changes to the search engine landscape in general. In January of 2005, a collaboration between Google, Microsoft, and Yahoo produced a new type of link—the nofollow link. Now, webmasters could add a rel="nofollow" attribute to a link’s code to indicate that it shouldn’t be regarded as passing authority or PageRank (see the examples after this list). It’s a way to include a link for practical purposes (such as redirecting traffic or offering a citation) without formally endorsing the source in Google’s index. It helps brands protect themselves from accidentally building links that appear spammy, and it generally improves the overall quality of links on the web.
  • Personalized search. It’s hard to imagine a time before personalized search features, but it wasn’t until June 2005 that Google formally introduced them. Previously, Google relied on specifically submitted forms to help personalize search, but with this update, the company could tap into your search history to provide more accurate results. Of course, the early version of this personalization feature was sketchy at best, but later iterations would refine and perfect the feature we all now take for granted.
  • XML Sitemaps. Sitemaps help webmasters formally chart out all the pages and interconnections of their websites. Up until June of 2005 (the same time personalized search was rolling out), webmasters relied on basic HTML sitemaps both to help users understand the layout of their sites and to help search engines index them properly. Though Google does a good job of eventually indexing every page and understanding the connections between them, it’s faster and more reliable to upload an XML sitemap, a format Google introduced in June 2005 (a minimal example follows this list). Now, creating, submitting, and regularly updating your XML sitemap is a staple of onsite optimization.

(Image Source: Sitemaps.org)

  • Big Daddy. The hilariously named “Big Daddy” update was a bit more technical than the updates Google rolled out in the past. Rather than focusing on qualitative issues, like how to handle a user query or how to rank the value of a link, the Big Daddy update was focused on technical processes that happen behind the scenes. It improved Google’s canonicalization processes, the handling of 301 and 302 redirects, and a handful of other issues, rolling out between December of 2005 and March of 2006.
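
To make the nofollow and sitemap concepts above concrete, here are two minimal sketches; all URLs are placeholders. First, a nofollow link—the rel attribute tells Google’s crawler not to pass authority through it:

    <!-- A link included for readers, not for passing PageRank -->
    <a href="https://example.com/partner-page" rel="nofollow">our partner</a>

And here is a bare-bones XML sitemap in the format introduced in June 2005; a real sitemap lists one <url> entry per indexable page:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2018-01-15</lastmod>   <!-- optional: when the page last changed -->
        <changefreq>weekly</changefreq> <!-- optional: how often it tends to change -->
      </url>
      <url>
        <loc>https://example.com/about</loc>
      </url>
    </urlset>

Once uploaded (conventionally at /sitemap.xml), the file can be submitted to Google via Search Console so new and updated pages are discovered faster.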

Maps, SERP changes, and canonical tags

It was around this time that Google started stepping up its game with even more features and functions for its users, going above and beyond the call of duty for search engines to give users the widest possible experience. This series of updates was focused on bringing new concepts to users, rather than improving existing infrastructure:

  • The dawn of modern maps. Google Maps had been around for a few years already, but the system hadn’t been very relevant to SEO. In October of 2005, Google decided to merge all its local Maps data with the business data it held in its Local Business Center. The resulting combination led to an almost-modern format of finding businesses on local maps, and helped to shape local SEO into a different form. This merge greatly increased the relevance of listing your business in Google’s Local Business Center, and forced more businesses to start thinking about their local SEO strategies.
  • Universal search. The “universal search” update was Google’s way of reimagining its SERPs for a new era. The biggest and most noticeable change was the introduction of a format that became known as “vertical search,” though this has become so standardized it’s hard to remember when Google functioned without it. Rather than offering only a search function that distributed web-based results, users could now toggle through several different “types” of results, including images, news, and maps. This opened up new doors of potential visibility for businesses everywhere, and forced webmasters to reconsider what they qualified as indexable content. It rolled out in May 2007.
  • Suggest. Believe it or not, Google Suggest didn’t come out until August of 2008. In a bid to improve the average user’s experience, Google wanted to offer more search possibilities stemming from a given query. As users type a query into the search box, Google suggests a list of possible extensions and endings to the query, giving users branches of possibilities to explore. It was, and continues to be, useful for users, but it also had two broad implications for search optimizers. First, it became an incredible tool for keyword research, powering software like Ubersuggest to help search optimizers find better and different keywords for their strategies. Second, it would come to power Google Instant, which I’ll describe in more detail later on.
  • The canonical tag. In February 2009, Google introduced the canonical tag, which was incredibly helpful in sorting out issues with duplicate content. It’s a relatively simple element to include in the back end of your site that informs Google which versions of a page should be indexed and which ones should be avoided. Without it, if your site has two versions of a single page (even unintentionally, as with http and https distinctions), it could register as duplicate content and interfere with your search visibility.
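
As a quick illustration of the canonical tag (with placeholder URLs): if the same article is reachable at more than one address, a single line in the <head> of each duplicate tells Google which version to index:

    <!-- Placed in the <head> of every variant of the page, e.g. the ?ref=newsletter version -->
    <link rel="canonical" href="https://example.com/article" />

With this in place, ranking signals from the duplicate URLs are consolidated onto the canonical version instead of being split or flagged as duplicate content.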

Further tweaking

Throughout this era, Google was also coming up with even more “tweaking” updates, all of which committed some minor changes to its search algorithm or improved user experience in some way. These are the most important ones to note:

  • Buffy. Google announced Buffy in June 2007, shortly after one of its engineers, Vanessa Fox, departed the company. It was clear the update had some significant impact, given the high ranking volatility observed, but Google never disclosed what the update actually included. It was implied, however, that this was an accumulation of several other, smaller updates.
  • Dewey. Dewey was the next minor update on the radar, coming in at the end of March 2008 and stemming into April. Some professionals suspected this was Google’s way of pushing some of its own branded properties, but there was no hard evidence for this.
  • Vince. Vince was one of Google’s more controversial updates, taking place in February of 2009, nearly a year after Dewey. After rolling out, rankings shifted (like they always do), but this time, the shifts seemed to favor big brands and corporations. It’s hard enough for small- to mid-sized businesses to stay competitive in the search visibility world (or at least it was at that point), so many entrepreneurs and marketers were outraged by the changes—even though they were minor.
  • Real-time search. Though there was always a News section on Google, it wasn’t until the real-time search update at the end of 2009 that Google was able to index news stories and items from social media platforms in real time. After this update, new stories and posts would populate instantly in Google’s index, getting content to users faster.
  • Google Places. Though it didn’t have much of an impact on SERPs or company rankings, Google did overhaul its Local Business Center in April of 2010. It took Google Places, previously incorporated as a section of Google Maps, and created a specifically designated Google Places center for local businesses to create their profiles and advertise their establishments.
  • Mayday. The Mayday update was a niche change that seemingly only affected traffic generated by long-tail keyword queries. As explained by Matt Cutts, the update was intended to decrease the ranks of sites that used long, undetailed, or low-quality content as a gimmick to rank higher for long-tail keywords.
  • Caffeine. Caffeine is one of the more substantial updates on this list, and it actually started rolling out as a preview in August 2009, but it didn’t roll out fully until the summer months of 2010. It was another infrastructural improvement, increasing the speed and efficiency of the algorithm rather than directly changing its ranking criteria. Evidently, the change increased indexation and crawling speeds by more than 50 percent, and increased search results speeds as well.
  • Instant. Google Suggest foreshadowed the release of Google Instant, which came out in September 2010. With Instant, users could preview results as they typed in their queries, shortening the distance between a user’s query and full-fledged results.
  • Social signals. Corresponding with the rise in popularity of Facebook and other social media sites, Google began counting social signals as slight ranking factors for sites’ authority. Today, this signal is weak, especially when compared to the power of backlinks.
  • Negative reviews. This update was a simple change at the end of December 2010 that addressed concerns that negative reviews could lead to an increase in rank due to increased visibility.
  • Attribution fixes. As a kind of prequel to Panda (which I’ll cover in-depth in the next section), Google released an attribution update in January of 2011. The goal was to stop content scrapers and reduce the amount of spam content on the web.

Panda and Penguin

So far, we’ve seen some significant updates, revisions, changes, and additions to Google Search, but now we’re getting to the heavy hitters. Two of the most impactful, talked-about, and still-relevant updates to the Google ranking algorithm happened practically back-to-back in 2011 and 2012, and both of them forever changed how Google evaluates authority.

The Panda update

  • The basics. The simplest way to describe the Panda update is as an algorithmic change that increased Google’s ability to analyze content quality. Obviously, Google has a vested interest in giving users access to the best possible content available online, so it wants sites with “high quality” content to rank higher—but what exactly does “high quality” mean? Previous updates had attempted to solve this dilemma, but it wasn’t until Panda that Google was able to take this concept to the major leagues.

The update rolled out on February 23, 2011, and fundamentally changed the way search optimizers operated online. This update was announced formally and explained by Google; the company stated that the update impacted roughly 12 percent of all search queries, which is a huge number, compared to previous algorithm changes.

Digging into more specific details, the update targeted content farms, which previously existed to churn out meaningless “fluff” content to increase ranks. It also penalized sites with content that seemed to be unnaturally written, or stuffed with keywords, and instead rewarded sites that it evaluated to have detailed, well-written, valuable content.

Over the course of its months-long rollout, thousands of sites saw their rankings—and their traffic—plummet, and the backlash was intense.

(Image Source: Neil Patel)

  • Panda 2.0 and more. Everything described in the preceding paragraphs relates only to what is retroactively referred to as Panda 1.0. As a follow-up to this update, Google released another algorithmic change, known as Panda 2.0 to the SEO community, in April 2011. The update polished some of the additions that Panda 1.0 introduced, and added a handful of new signals for English queries.

Yet another follow-up, briefly known as Panda 3.0 until users realized it paled in significance next to 1.0 and 2.0, became known as 2.1; its changes were hardly noticeable. A round of further Panda updates, minor in scale but significant in nature, followed, including Panda 2.2 in June 2011, 2.3 in July, and so on each month until October.

  • Panda 3.0 and slowdown. There’s some debate about what actually qualifies as Panda 3.0. For some professionals, 3.0 happened in October 2011, when Google announced a state of “flux” for the Panda update, which saw no fewer than three separate minor updates working together to affect about 2 percent of queries.

For others, 3.0 happened in November 2011, when an update on November 18th shook up rankings further, seemingly based on the quality of their sites’ content.

  • The Panda dance. After the November 2011 update, new iterations of the Panda update seemed to settle into a groove. Each month, there would be some kind of new update, affecting a small percentage of queries, for more than a year. Some of these updates would be full-fledged algorithm changes, adding, subtracting, or modifying some component of the core Panda algorithm.

Others would merely be data refreshes, “rebooting” Panda’s evaluation criteria. In any case, the number of Panda updates grew to be about 25 when Matt Cutts announced that in the future, monthly Panda updates would roll out over the course of 10 days or so, leading to a more seamless, rotating update experience for users and webmasters. This setup became known informally as the “Panda dance,” loosely referencing the Google dance of 2003.

  • Panda 4.0. After the monthly revolving door of the Panda dance, things stayed relatively quiet on the Panda front until May 2014, when an update that became known as the official Panda 4.0 was released.

It rolled out gradually, much like a Panda dance update, but was massive in scale, affecting about 7.5 percent of all queries. Evidence suggests it was both a fundamental algorithmic change and a data refresh, which lent power to the overall release.

Panda 4.1, a follow-up in September 2014, affected an estimated 3 to 5 percent of all search queries, but the typically slow rollout made it hard to tell. Panda 4.2 came in July 2015, but didn’t seem to have much of an impact.

  • Lasting impact. Today, the Panda update is still one of the most important and influential segments of Google’s ranking algorithm. Though it started as a separate branch algorithm, today it’s a core feature of Google search, and most search optimizers recognize its criteria as the basis for their on-site content strategies.

Presumably, it’s still being updated on a rolling, gradual basis, but it’s hard to take a pulse of this because of the constant, slow updates and the fact that each successive update seems to be less significant.

Schema.org and other small updates

Between and surrounding the Panda and Penguin updates were a number of other small updates, including the introduction of Schema.org microformatting, now a major institution in the search world.

  • Schema.org. Much in the same way they came together for nofollow links, Google, Microsoft, and Yahoo worked together to settle on a single standard for “structured data” on sites. Structured data is markup in the back end of a site that helps search engine bots decode and understand various types of data on the site; there are specific types of Schema markup for various categories of information, such as people, places, and things (see the sketch below the image). Today, it’s used primarily to draw information from sites to be used in the Knowledge Graph.

(Image Source: Schema.org)
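
Here’s a minimal sketch of structured data in practice, using Schema.org’s microdata format to mark up a hypothetical local business (all names and values are placeholders):

    <!-- Microdata markup: itemtype declares the Schema.org category, itemprop labels each field -->
    <div itemscope itemtype="https://schema.org/LocalBusiness">
      <span itemprop="name">Example Coffee Shop</span>
      <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
        <span itemprop="streetAddress">123 Main St</span>,
        <span itemprop="addressLocality">Springfield</span>
      </div>
      <span itemprop="telephone">(555) 555-0100</span>
    </div>

Search engine bots read these attributes to extract structured facts about the page. (Schema.org markup can also be expressed as JSON-LD, the format Google now generally recommends.)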

  • Google+. Though the social media platform now seems somewhat obsolete, when Google launched it back in June 2011, it seemed to be the future of SEO. Here was a social media platform and content distribution center connected directly into Google’s own ranking algorithm; as a result, the platform saw 10 million signups in less than two weeks.

Unfortunately, it wasn’t able to sustain this momentum of growth and today, Google+ doesn’t bear any more impact on search ranks than any other social media platform.

  • Expanded sitelinks. In August 2011, Google changed the layout of certain page entries in SERPs to include what are known as “expanded sitelinks,” detailed links under the subheading of a master domain. These pages, often ones like About or Contact, could then be optimized for greater visibility.
  • Pagination fixes. Up until September 2011, pagination was a real problem for the Google ranking algorithm. Because there are multiple ways to paginate a series of pages on your site (such as when you have multiple pages of a blog), many duplicate content and canonicalization issues arose for webmasters. With this update, Google introduced new link attributes, rel="next" and rel="prev", specifically designed to better index multiple-page sections (see the sketch after this list).
  • Freshness. The Freshness update came in November 2011, with Google proclaiming the update to affect 35 percent of all queries, though this number was never demonstrated. The goal of the update was to work in conjunction with Panda to prioritize “fresher,” newer content over old content.
  • The 10-packs. In a real departure from typical Google style, Matt Cutts announced the 10-pack of updates in November 2011 proactively and transparently. None of these updates were game-changers, but they did affect a number of different areas, including “official” pages getting a ranking boost and better use of structured data. Additional update “packs” followed in the coming months, but never revolutionized the search engine.
  • Ads above the fold. Though never named, a small update in January 2012 targeted sites that featured too many ads “above the fold”—what had become known in the industry as “top-heavy” sites. Instead, Google preferred sites with a more conservative ad strategy—or ones with no ads at all.
  • Venice. Venice, rolling out in February 2012, was one of the first significant local updates, greatly increasing the relevance of local sites for corresponding local queries. It was a major boon for small businesses taking advantage of their local niches.
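
For reference, the pagination attributes mentioned above are placed in the <head> of each page in a series. A sketch for page 2 of a paginated blog (placeholder URLs):

    <!-- In the <head> of https://example.com/blog/page/2 -->
    <link rel="prev" href="https://example.com/blog/page/1" />
    <link rel="next" href="https://example.com/blog/page/3" />

These hints tell Google that the pages form a single sequence, so their indexing signals are consolidated rather than treated as competing near-duplicates.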

The Penguin update

  • The basics. What Panda did for content, Penguin did for link building. In the weeks leading up to the release of Penguin, Google warned webmasters that there would be serious action taken to fight back against “over optimization,” the general nomenclature for any series of strategies designed to optimize a site to rank higher at the expense of user experience.

Subsequently, Penguin was formally released on April 24, 2012. The update targeted a number of different activities that qualified as black-hat practices. For example, it cracked down on keyword stuffing, the practice of artificially inserting keywords into links and content for the sole purpose of increasing rankings for those terms.

It also seriously improved Google’s ability to detect “good” and “natural” backlinks versus unnatural or spammy ones. As a result, more than 3 percent of all search queries were affected, and webmasters went into the same outrage they did when Panda disrupted their ranking efforts.

Like Panda, it set a new tone for search optimization—one focused on quality over manipulation—and helped solidify Google as the leader in search it remains today.

  • Initial boosts. Like Panda, Penguin needed some initial boosts to correct some of its early issues and sustain its momentum as a search influencer. The update that became known as Penguin 1.1 was released in May 2012, and served as more of a data refresh than an algorithmic modifier. However, it did confirm that Penguin was operating as an algorithm separate from Google’s core search ranking algorithm, much in the way that Panda did in its early stages.

A third Penguin update came in October 2012, though it didn’t seem to affect many queries (less than 1 percent).

  • Penguin 2.0. The next major Penguin update rolled out in May 2013, almost exactly one year after the original Penguin update was released. This one was officially numbered by Google, but it didn’t bear as much of an impact as some professionals were anticipating. Though details were limited about the scope and nature of the update, it was speculated that this update focused more on page-level factors than its 1.0 counterpart.
  • Penguin 3.0. Due to the pattern of releases in May, most search optimizers anticipated a Penguin 3.0 release in May 2014. However, it wasn’t until October 2014 that we saw another refresh of the Penguin algorithm. Because the update was spread out over multiple weeks, it’s hard to say exactly how much of an impact it had, but it appears to have affected less than 1 percent of all queries, making it the weak link in the Penguin bunch.
  • Penguin 4.0. In September 2016, Google announced yet another Penguin update, suggesting that Penguin is now a part of Google’s “core” algorithm. Over the next few weeks, Penguin 4.0 rolled out over several phases, devaluing bad links instead of penalizing sites and reversing some previous Penguin penalties. Since then, Penguin updates have been small refreshes, in the same vein as the Panda update.

Like Panda, Penguin still holds an esteemed reputation as one of the most significant branches of the Google search algorithm, and is responsible for the foundation of many SEO strategies.

The Knowledge Graph

The Knowledge Graph is a distinct section of indexed information within Google. Rather than indexing various sites and evaluating them for relevance to a given query, the Knowledge Graph exists to give users direct, succinct answers to their common questions. You’ve likely seen a box like this pop up when you Google something basic:

(Image Source: Google)

The Knowledge Graph was rolled out as part of an update back in May 2012. When it was released, it only affected a small percentage of queries, as the Knowledge Graph’s scope and breadth were both limited. However, Google has since greatly prioritized the value and prominence of the Knowledge Graph, and thanks to more websites utilizing structured markup, it has access to more information than ever before.

The next official Knowledge Graph expansion came in December 2012, when Google expanded the types of queries that could be answered with the KG and ported it to different major languages around the world. After that, an update in July 2013 radically increased the reach and prominence of the Knowledge Graph, raising the number of queries it showed up for by more than 50 percent. After this update, more than one-fourth of all searches featured some kind of Knowledge Graph display.

Since then, the prominence of rich answers and KG entries has been slowly and gradually increasing, likely due to the gradual nature of incoming knowledge and increasing capabilities of the algorithm. The Hummingbird update and its subsequent partner RankBrain (both of which I’ll explain in coming sections) have also amplified the reach and power of the Knowledge Graph with their semantic analysis capabilities.

Exact Match Domains update

The exact-match domains update was a seemingly small update that affected a noticeable share of queries. Exact-match domains are domains whose names match the wording of a user’s query exactly. In some cases, this is highly valuable, since a user is likely searching for that exact brand; in other cases, it can be used as a deceptive way to poach organic traffic.

Google accordingly reevaluated the way it handled cases of exact-match domains, affecting nearly 1 percent of queries and reducing the presence of exact-match domains by more than 10 percent in search engine results pages.

Smaller Updates

After discussing the powerhouses of Panda and Penguin at length, all other search engine updates seem smaller by comparison. However, there have been some significant additions and modifications in recent years, some of which have opened doors to entirely new search visibility opportunities.

Payday Loan update

The Payday Loan update came around June 2013, and its purpose was to penalize sites with dubious intentions or spammy natures. As the name suggests, the primary target for these were “payday loan” sites and other similar sites that deliberately attempt to take advantage of consumers.

Porn sites were also targeted. The main scouting mechanism for this was evaluating certain types of link schemes in an effort to reduce spam overall. For most legitimate business owners, this update didn’t make much of a difference.

The Payday Loan update also saw future iterations—2.0 and 3.0—in 2014, which targeted the same types of sites.

Hummingbird

Hummingbird was, and continues to be, a beautiful and particularly interesting update. Even though many users never noticed it, it fundamentally changed the way Google search works—and continues to influence it to this day. Released sometime around August 2013 and formally acknowledged later in September, the Hummingbird update was a core algorithm change that improved Google’s ability to evaluate queries, working on the “relevance” side of the equation rather than the “authority” side.

Specifically, the Hummingbird update changed the way Google looks at keywords in user queries. Rather than dissecting queries based on what individual keywords and phrases it contains, the Hummingbird update allows Google to semantically decipher what a user is intending to search for, then find results that serve that intention throughout the web. This may seem like a small difference—and for most queries, these types of results are similar—but now that it exists, heavily keyword-focused strategies have been weakened, and the power of naturally-written content with contextual relevance signals has increased even further.

Hummingbird has also been important in setting the stage for the future of Google developments, as we will soon see. For starters, it has fueled the rise in voice searches performed by users; voice searches tend to be more semantically complex than typed queries, and demand a greater semantic understanding for better results. Google has also modified Hummingbird with an important and game-changing update more recently with RankBrain.

Authorship drops

One of the biggest motivations for getting a Google+ account and using it to develop content was to take advantage of the Authorship concept that developed alongside it. Authorship was a way for you to include your name and face (in other words, your personal brand) in Google’s search results, alongside all the material you’d authored.

For a time, it seemed like a huge SEO advantage; articles written with Authorship through Google+, no matter what site they were intended for, would display more prominently in search results and see higher click-through rates, and clearly Google was investing heavily in this feature.

(Image Source: Dan Stasiewski)

But starting in December 2013, Google started downplaying the feature. Authorship displays went down by more than 15 percent, with no explained motivation for the pull-back.

In June 2014, this update was followed up with an even more massive change—headshots and photos were removed entirely from search results based on Authorship. Then, on August 28, 2014, Authorship was completely removed as a concept from Google search.

In many ways, this was the death knell for Google+; despite an early boost, the platform was seeing declining numbers and stagnant engagement. Though search optimizers loved it, it failed to catch on commercially in a significant way.

Today, the platform still exists, but in a much different form, and it will never offer the same level of SEO benefits that it once promised.

Pigeon

Up until Pigeon rolled out in July 2014, Google hadn’t played much with local search. Local search operated on an algorithm separate from the national one, and Google had toyed with various integrations and features for local businesses, such as Maps and Google My Business, but the fundamentals remained more or less the same for years.

Pigeon introduced a number of different changes to the local algorithm. For starters, it brought local and national search closer together; after Pigeon, national authority signals, like those coming from high-authority backlinks, became more important to rank in local results.

The layout of local results changed (though they have gone through many different layout changes before and since), and the way Google handled location cues (mostly from mobile devices and other GPS-enabled signals) improved drastically. Perhaps most importantly, Pigeon increased the power of third-party review sites like Yelp, increasing the authority of local businesses with a great number of positive reviews and even boosting the rank of third-party review pages themselves.

It was a rare move for Google, as it was seen as a partial response to complaints from Yelp and other providers that their local review pages weren’t getting enough visibility in search engines. In any case, Pigeon was a massive win for local business owners.

Ongoing Panda and Penguin refreshes

I discussed Panda and Penguin at length in their respective sections, but they’re worth calling attention to again. Panda and Penguin weren’t one-time updates that can be forgotten about; they’re ongoing features of Google’s core algorithm, so it’s important to keep them in the back of your mind.

Google is constantly refining how it evaluates and handles both content and links, and these elements remain two of the most important features of any SEO campaign. Though the updates are happening so gradually they’re hard to measure, the search landscape is changing.

Layout and functionality changes

Google has also introduced some significant layout and functionality changes; these haven’t necessarily changed the way Google’s search algorithm functions, but they have changed the average user’s experience with the platform:

  • In-depth articles. In August 2013, Google released an “in-depth” articles update, and even formally announced that it was doing so. Basically, in-depth articles were a new type of news-based search result that could appear for articles that covered a topic, as you might guess, in-depth. Long-form, evergreen content was given a boost in visibility here, driving a trend of favoring long-form content that continues to this day.
  • SSL. In August 2014, Google announced that it would now be giving a ranking boost to sites that featured a layer of SSL encryption (denoted by the HTTPS designation), clearly with the intention of improving the average user’s web experience through increased security. The announcement was blown up as making it “critical” for sites to upgrade to SSL protection, but the actual ranking boost turned out to be quite small.
  • The quality update. The “official” quality update happened in May 2015, and originally it was described as a “phantom update” because people couldn’t tell exactly what was happening (or why). Eventually, Google admitted that this update improved the “quality signals” that it received from sites as an indication of authority, though it never delved into specifics on what these signals actually were.
  • AdWords changes. There have also been several changes to the layout and functionality of ads over the years, though I’ve avoided talking about these too much because they don’t affect organic search that much. Mostly, these are tweaks in visibility and prominence, such as positioning and formatting, but a major shakeup in February 2016 led to some big changes in click-through rates for certain types of queries.

Mobilegeddon

Mobilegeddon is perhaps the most entertainingly named update on this list. In February 2015, Google announced that it would be making a change to its ranking algorithm, favoring sites that were considered to be “mobile friendly” and penalizing those that weren’t mobile friendly on April 21 of that same year. Although Google had slowly been favoring sites with better mobile user experience, this was taken to be the formal, structured update to cement Google’s desire for better mobile sites in search results.

The SEO community went crazy over this, exaggerating the reach and effects of the update as apocalyptic with its unofficial nickname (Mobilegeddon). In reality, most sites by this point were already mobile-friendly, and those that weren’t had plenty of tools at their disposal, such as Google’s helpful and still-functional mobile-friendly testing tool.

Overall, “mobile-friendly” here mostly means that your site content is readable without zooming, all your content loads appropriately on all manner of devices, and all your buttons, links, and features are easily accessible with fingertips instead of a mouse (the snippet below shows the most common technical fix). When the update rolled out, it didn’t seem very impressive; only a small percentage of queries were affected, but subsequent refreshes have given it a bigger impact.
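
On the technical side, the most common fix for “readable without zooming” is a responsive viewport declaration in the page’s <head>, paired with CSS that adapts to the screen width; a minimal sketch:

    <!-- Tells mobile browsers to render at the device's width instead of a zoomed-out desktop view -->
    <meta name="viewport" content="width=device-width, initial-scale=1" />

A site missing this one tag typically renders as a tiny desktop page on phones, which is precisely the experience the mobile-friendly update was designed to push down in the rankings.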

Google also released a Mobile 2.0 update in May 2016, but since this basically reinforced concepts already introduced with Mobile 1.0, the effects of the update were barely noticeable to most business owners and search optimizers.

The Modern Era

Now, we enter the modern era of Google updates. I’ve already mentioned a handful of updates that have come out in the past couple years or so, but now I want to take a look at the game-changers that will hold the biggest impact on how the search engine is likely to develop in the future.

RankBrain

RankBrain made headlines when it first emerged—or rather, when it was formally announced by Google. Google announced the update in October 2015, but by that point, the process had already been rolling for months. What makes RankBrain special is the fact that it doesn’t necessarily deal with authority or relevance evaluations; instead, it’s an enhancement to the Hummingbird update, so it deals with better understanding the semantics of user queries.

But RankBrain is even more interesting than that; rather than being a change to Google’s algorithm or even an addition, it’s a machine learning system that updates itself over time. Hummingbird works by trying to analyze the user intent behind complex phrases and queries, but not all of these are straightforward for automatic algorithms to parse; for example, the phrase “who is the current president of the United States?” and the phrase “who’s the guy who’s running the country right now?” are basically asking the same thing, but the latter is much more vague.

RankBrain is designed to learn the complexities of language phrasing, eventually doing a better job at digging into what users are actually searching for. It’s highly significant because it’s the first time Google has used a machine learning update, and it could be an indication of where the company wants to take its search algorithm in the future.

Quality updates

Google has also released a number of what it calls “quality updates,” which make alterations to what factors and signals indicate what’s determined to be “quality content.” The quality update of May 2015 was one of these, but other updates have presumably rolled out, making less of an impact and going largely unnoticed by search optimizers.

However, Google has recently opened up more about what actually qualifies as “quality content,” publishing and regularly revising a document called the search quality evaluator guidelines. If you haven’t read it yet, I highly recommend you check it out. A lot of it is common-sense stuff, or is familiar if you know the basics of SEO, but there are some hidden gems that are worth scoping out for yourself; I highlighted 10 of them in this infographic.

With regard to spam, Google has released two new updates in the past two years to combat black-hat tactics and practices that negatively affect users’ experiences. In January 2017, Google released an update it had announced five months earlier, called the “intrusive interstitial” penalty. Basically, the update penalizes any site that uses aggressive interstitials or pop-ups that harm a user’s experience.

In addition, Google launched a soft update in March 2017 called “Fred,” though it’s still officially unconfirmed. The specifics are cloudy, but Fred was designed to hit sites with low-value content, or those practicing black-hat link building tactics, penalizing them.

Possum

In September 2016, just before Penguin 4.0, the search community noticed significant ranking volatility for local searches. Though unconfirmed by Google, the informally named “Possum” update attracted significant attention. It appears the update increased the importance of location for the actual searcher, and updated local search entries to include establishments that were just outside city limits. Google also seems to have experimented with location filtering, separating individual locations that point to the same site.

Snippet Changes

Google has also been experimenting with its presentation of “featured snippets,” the selected, standalone search entries that pop up to answer your questions concisely, above organic search results, but below ads. Though the past few years have seen a steady increase in the number of featured snippets, there was a significant drop in October 2017; conversely, Knowledge Graph panels increased, suggesting Google may be rebalancing its use of featured snippets and the Knowledge Graph as a means of presenting information to searchers.

Later, in November 2017, Google increased the length of search snippets for most results, upping the character limit to nearly twice its previous cap of 155.

Micro-updates

Google also seems to be favoring micro-updates, as opposed to the behemoths that made search optimizers fear for their jobs and domains for years. Instead of pushing a massive algorithm change over the course of a weekend, Google is making much smaller tweaks and feeding them out over the course of weeks, or even months. For example, as of mid-April 2017, about half of all page-one organic search results were HTTPS; by the end of 2017, that figure had risen to 75 percent, heavily implying a slow-rollout update favoring secure sites.

In fact, it’s hard to tell when Google is updating its algorithm at all anymore, unless it announces the change directly (which is still rare, since ambiguity remains Google’s best defense against manipulation). These updates are going unnamed, unnoticed, and for the most part, un-worried about. November 2016, December 2016, February 2017, May 2017, September 2017, and November 2017 all saw significant ranking volatility associated with small updates.

The core algorithm

Part of the reason why Google is cutting back on the massive updates and favoring much smaller ones is because its core algorithm is already doing such a good job on its own. It’s a high-quality foundation for the search engine, and it’s clearly doing a great job of giving users exactly what they’re looking for—just think of all the times you use Google search on a daily basis.

Especially now that Panda and Penguin are rolled into the core algorithm, Google’s not in a place to be making major infrastructural changes anymore, unless it someday decides to fundamentally change the way we think about search—and honestly, I wouldn’t put it past them.

App indexing and further developments

It is interesting to see how Google has increased its attention to app indexing and displays. Thanks to the greatly increased popularity and use of mobile devices, apps have become far more relevant, and Google has responded.

Its first step was allowing for the indexation of apps, much like the indexation of websites, to allow apps to show up for relevant user queries. After that, it rolled out a process called app deep linking, which allows developers to link to specific pages of content within apps for users who already have the apps installed.

For example, if you have a travel app, a Google search on your mobile device could theoretically link you to a specific destination’s page within that app.

But Google’s pushing the envelope even further with a process now called app streaming. Now, certain apps are being stored on Google’s servers, so you can access app-based content in a Google search without even having the app installed on your mobile device. This could be a portent of the future of Google’s algorithm development.

Where does it go from here?

With the knowledge of the modern era, and the pattern of behavior we’ve seen from Google in the past, we can look to the future and try to predict how Google will develop itself.

  • The end of big-game updates? The first major takeaway from Google’s slowdown and propensity for smaller, more gradual updates is the fact that game-changing updates may have come to an end. We may never see another Panda or Penguin shake up rankings as heavily as those twin updates did. We may see search evolve, in radical new forms no less, but even these changes will happen so gradually, they’ll be barely noticeable. However, it’s still important to pay attention to new updates as they roll out so you can adapt with the times.
  • Ongoing Knowledge Graph expansion. Google is heavily favoring its Knowledge Graph, as rich answers are rising in prominence steadily, especially over the past few years. They’re getting more detailed, they’re showing up for more queries, and they’re taking up more space. It’s clear that Google wants to be able to answer every question immediately and specifically for users, but the implications this might have for overall search visibility and organic traffic have yet to be determined.
  • Machine learning updates. RankBrain is only one indicator, but it is a big one, and it only makes sense that Google would want more machine learning algorithms to develop its core search function. Machine learning takes less time, costs less labor, and may ultimately produce a better product. However, more machine learning updates would make search optimization less predictable and more volatile in terms of changes, so it’s definitely a mixed bag.
  • The eventual death of the traditional website. Thanks to Google’s push for more app-based features and app prominence, not to mention the rise of voice search and smart speakers, the death of the “traditional website” may be accelerated. It’s unlikely that websites will die out within the next couple of years, but beyond that, consumers could be ready for an overhaul to the average online experience.
  • New features and functions. Of course, Google will probably develop new features, new functions, and accommodations for new technologies in addition to simply improving its core product. These developments are somewhat unpredictable, as Google keeps them tightly under wraps, and it’s hard to predict what technologies are on the horizon until they’re actually here.

These predictions are speculative and ambiguous, but unfortunately that’s the nature of the beast. Historically, it’s been difficult to know what to prepare for, because not even Google engineers know exactly what technologies users will take to and which ones they won’t.

Everything in the search world—from algorithm updates to SEO and user experience design—demands an ongoing process of feedback and adaptation. You have to pay attention, remain flexible, and work actively if you want to stay ahead of the curve.

Google has come a long way in the past 19 years of its existence, and it likely has a long road of development ahead of it. If you attune yourself to its changes, you can take advantage of them, and ride those waves of visibility to benefit your own brand.

Jayson DeMers is the Founder & CEO of AudienceBloom. You can contact him on LinkedIn, Google+, or Twitter.