Google's Biggest Algorithm Updates of 2016 (And What They Mean for You)

Google is always tinkering with its algorithm, to the tune of 500-600 changes per year. Most of these updates are small and incremental, but every so often, Google will roll out a major update that fundamentally changes how it evaluates and ranks pages.

2016 saw a number of such "core" algorithm updates, along with a few more targeted changes. For SEO practitioners, keeping tabs on these shifts is key to understanding Google's current priorities and adapting strategies to stay ahead of the curve.

So what were the biggest Google algorithm updates of 2016, and what did they mean for marketers? Let's dive in and break down each of the major changes, along with some actionable takeaways.

Panda Becomes Part of the Core Algorithm (January 11)

After 28 official updates over 5 years, Google's Panda algorithm was incorporated into the core ranking algorithm in January 2016. Panda, which first launched in 2011, targets low-quality and thin content.

With this move, Panda went from a periodic filter applied to the index every few months to a core ranking signal that evaluates pages on an ongoing basis. In practice, that means low-quality content can now be devalued continuously rather than only when a periodic Panda refresh rolls out.

For marketers, the implications were clear: removing or improving thin, duplicate, and low-value pages should be an ongoing priority. A page-by-page content audit is a good way to identify candidates for pruning or improvement.

Tools like Siteliner can help automate the process of finding thin or duplicate content. The goal should be ensuring every indexable page has a purpose and unique value to offer users.
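
As a rough starting point, here is a minimal Python sketch of such an audit. It assumes a standard XML sitemap at a hypothetical URL, and the 300-word "thin" threshold and duplicate-title check are illustrative heuristics, not Google's actual criteria:

```python
import requests
from bs4 import BeautifulSoup
from collections import defaultdict
from xml.etree import ElementTree

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical site
THIN_WORD_COUNT = 300  # assumed threshold; tune for your content type

def page_urls(sitemap_url):
    """Pull the <loc> entries out of a standard XML sitemap."""
    xml = requests.get(sitemap_url, timeout=10).text
    root = ElementTree.fromstring(xml)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.findall(".//sm:loc", ns)]

titles = defaultdict(list)
for url in page_urls(SITEMAP_URL):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    words = len(soup.get_text(" ", strip=True).split())
    if words < THIN_WORD_COUNT:
        print(f"THIN ({words} words): {url}")
    if soup.title and soup.title.string:
        titles[soup.title.string.strip()].append(url)

# Identical <title> tags are a cheap proxy for duplicate content
for title, urls in titles.items():
    if len(urls) > 1:
        print(f"DUPLICATE TITLE '{title}': {urls}")
```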

AdWords Shakeup: 4 Ads on Top, None on Sidebar (February 19)

In a significant change to AdWords, Google eliminated sidebar ads entirely and added a fourth ad slot above organic results for "highly commercial queries."

The impact for PPC advertisers was greater competition and higher CPCs for the top 4 spots. According to an Adobe study, CPCs for the top 4 ad spots increased 25-30% for retail and 50-55% for travel compared to the old sidebar placements.

For organic search marketers, this update meant contending with an additional ad pushing organic results further down the page. Organic CTR declined an average of 3-5% for commercial searches with a 4-pack of ads, per the Adobe study.

To mitigate the impact, SEOs had to adjust their strategies, such as:

  • Targeting longer-tail keywords less likely to trigger a 4-pack of ads
  • Optimizing title tags and meta descriptions to improve click-through rates (see the sketch after this list)
  • Answering more question-based queries to earn featured snippets above ads
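
For the snippet-optimization point above, one practical approach is to mine Search Console data for pages that earn plenty of impressions but few clicks. Here is a minimal Python sketch, assuming a CSV export of the Search Analytics "Pages" report with Page, Impressions, and CTR columns; the filename and thresholds are hypothetical:

```python
import csv

EXPORT_FILE = "search_console_pages.csv"  # hypothetical export filename
MIN_IMPRESSIONS = 1000                    # assumed cutoffs; tune to taste
MAX_CTR = 0.02

with open(EXPORT_FILE, newline="") as f:
    for row in csv.DictReader(f):
        impressions = int(row["Impressions"].replace(",", ""))
        ctr = float(row["CTR"].rstrip("%")) / 100  # exports show CTR as e.g. "2.5%"
        # High impressions + low CTR = a title/description worth rewriting
        if impressions >= MIN_IMPRESSIONS and ctr <= MAX_CTR:
            print(f"{row['Page']}: {impressions} impressions, {ctr:.1%} CTR")
```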

This update didn't impact rankings directly but forced SEOs to get more strategic to maintain organic visibility and traffic.

Mobile-Friendly Boost (May 12)

More than a year after the original "Mobilegeddon" update, Google announced it would be "increasing the effect" of mobile-friendliness as a ranking signal.

Google had already been tagging sites as mobile-friendly and using that as a ranking factor since 2015. But this update cranked up the volume, further widening the gap between mobile-optimized sites and non-optimized ones.

Post-boost, a BrightEdge study of 1,000 e-commerce keywords found mobile-friendly sites occupied 75% of the top 3 positions. The message was clear: being mobile-friendly was now table stakes.

Key factors for mobile-friendliness include:

  • Responsive design that adapts to screen size
  • Readable text without zooming
  • Tap targets sized appropriately for mobile
  • No Flash or pop-ups

Google's mobile-friendly test tool can diagnose a page's mobile-friendliness and suggest areas for improvement. Marketers can also check Google Search Console for mobile usability error reports.
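
For checking pages in bulk, the mobile-friendly test is also available programmatically via the Search Console URL Testing Tools API. Below is a minimal sketch; it assumes you have a Google API key with that API enabled, and the response field names follow Google's published format:

```python
import requests

API_KEY = "YOUR_API_KEY"  # assumes a key with the URL Testing Tools API enabled
ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
            "urlTestingTools/mobileFriendlyTest:run")

def check_mobile_friendly(url):
    """Run Google's mobile-friendly test for a single URL."""
    resp = requests.post(ENDPOINT, params={"key": API_KEY},
                         json={"url": url}, timeout=30)
    resp.raise_for_status()
    result = resp.json()
    print(url, "->", result.get("mobileFriendliness"))
    for issue in result.get("mobileFriendlyIssues", []):
        print("  issue:", issue.get("rule"))  # e.g. USE_LEGIBLE_FONT_SIZES

check_mobile_friendly("https://example.com/")  # hypothetical page
```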

The mobile-friendly boost turned up the heat on sites dragging their feet on mobile optimization. With mobile searches now accounting for over 50% of total search volume, marketers ignore mobile user experience at their peril.

Quality Update aka "Phantom 2" (June 1)

In early June, many websites saw significant spikes or drops in organic traffic seemingly out of nowhere. Webmasters speculated it was due to a major Google update, which Search Engine Land dubbed "Phantom 2" (in reference to a previous "phantom" update in 2015).

Google eventually confirmed that a "quality" update did occur around this time, but didn't share specifics on the signals involved. However, anecdotal evidence suggests the update favored sites with strong expertise, authoritativeness, and trustworthiness (E-A-T) signals, while devaluing those with shallow or poorly sourced content.

Glenn Gabe shared an example of a site that saw a 50%+ drop in organic traffic post-Phantom 2:

[SEMrush chart showing a steep traffic decline]

Analyzing the site, Gabe found issues like thin, irrelevant content and a lack of E-A-T signals such as author bios and credible sources.

Conversely, Marcus Tober noted that "category leading" websites tended to see gains from the update:

"Phantom is another quality update that rewards quality pages and punishes ‘low quality.‘ What makes it a ‘Quality Update‘ is that the largest winners were high quality content publishers."

So what defines a "high quality" page in Google's eyes? The search giant's own Quality Rater Guidelines emphasize E-A-T signals as a key differentiator between high- and low-quality pages.

Factors that can demonstrate E-A-T include:

  • Author bios with relevant expertise and credentials
  • Citations and references to credible sources
  • Comprehensive, well-researched content
  • Positive reviews and testimonials

While there's no "E-A-T score," focusing on these trust signals can safeguard your site against quality-based algorithm hits.
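
There is no way to query these signals directly, but you can audit your own pages for obvious gaps. The Python sketch below uses rough, illustrative heuristics (common byline selectors, plus a count of outbound links as a proxy for citations); these are auditing assumptions, not actual Google signals:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def eat_snapshot(url):
    """Print rough on-page proxies for E-A-T. The checks are heuristics
    for a manual audit, not ranking signals."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Byline / author bio: look for common class names and rel=author links
    has_byline = bool(
        soup.select_one(".author, .byline, .author-bio, [rel=author]")
    )
    # Outbound citations: links pointing at other domains
    host = urlparse(url).netloc
    outbound = [a["href"] for a in soup.find_all("a", href=True)
                if urlparse(a["href"]).netloc not in ("", host)]
    print(f"{url}: byline={has_byline}, outbound citations={len(outbound)}")

eat_snapshot("https://example.com/article")  # hypothetical page
```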

The Phantom 2 update underscored Google's emphasis on serving users the highest quality, most trustworthy results. For marketers, it was a wake-up call to invest in content from true subject matter experts and to build site authority through credible sources and citations.

Penguin 4.0 (September 23)

After nearly two years of waiting, Google finally released the long-awaited Penguin 4.0 in September 2016. Penguin, first launched in 2012, targets spammy and unnatural links.

The headline change in Penguin 4.0 was that Penguin now runs in real time as part of Google's core algorithm. Previously, Penguin was a periodic filter applied to the index separately, and refreshes could take months to arrive.

Now, Penguin evaluates links continuously as Google recrawls the web, so both positive and negative link signals are reflected in rankings much more quickly.

However, Penguin 4.0 also ushered in a more granular approach: rather than devaluing entire sites, it now devalues spam on a page-by-page basis. That's good news for webmasters, as one bad page is less likely to bring down rankings sitewide.

In practice, Penguin 4.0 made having a clean, natural backlink profile more important than ever. A single unnatural link is unlikely to trigger a penalty, but a pattern of spammy links pointing to a page can get it devalued in short order.

Some tips for staying on the right side of Penguin 4.0:

  • Audit your backlink profile using tools like Ahrefs or Majestic
  • Disavow unnatural links you can't get removed (a sketch for generating a disavow file follows this list)
  • Focus on earning links naturally through quality content vs. buying links
  • Monitor your link profile regularly for new, potentially spammy links
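
Once you have flagged links in an audit, the disavow file itself is just plain text: one URL or domain: entry per line, with # for comments. Here is a minimal sketch that builds one from a hypothetical CSV of flagged links (the input file and column names are assumptions):

```python
import csv
from urllib.parse import urlparse

# Assumed input: a CSV flagged during a manual audit (e.g. exported from
# Ahrefs or Majestic) with "url" and "action" columns, where action is
# either "disavow-domain" or "disavow-url".
FLAGGED_FILE = "flagged_links.csv"  # hypothetical filename

with open(FLAGGED_FILE, newline="") as f, open("disavow.txt", "w") as out:
    out.write("# Disavow file generated from manual link audit\n")
    for row in csv.DictReader(f):
        if row["action"] == "disavow-domain":
            out.write(f"domain:{urlparse(row['url']).netloc}\n")
        elif row["action"] == "disavow-url":
            out.write(row["url"] + "\n")
```

The resulting disavow.txt can then be uploaded through Search Console's disavow tool.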

Penguin 4.0 wasn't a massive change, but it reaffirmed the need for marketers to be vigilant about their link profiles. The faster feedback loop increases the incentive to proactively prune bad links before they can impact rankings.

Possum (September 1)

Dubbed "Possum" by the local search community, this unconfirmed but widely observed update shook up local pack and finder results in September.

The update appeared to tighten Google‘s local filters to weed out spammy and irrelevant results. The biggest change was increased filtering of similar results from the same domain or physical location.

For example, pre-Possum, a search for "Denver dermatologist" might return multiple results from the same practice with slight variations in name or address. Post-Possum, Google became more likely to filter those "duplicate" listings and return only the most relevant one.
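
Nobody outside Google knows exactly how the filter works, but the basic idea (group listings that share a domain or address, then keep only the strongest) can be illustrated with a toy Python sketch; the listings and relevance scores below are made up:

```python
from itertools import groupby

# Toy listings: (name, domain, address, relevance_score)
listings = [
    ("Denver Derm Associates",  "denverderm.com",   "123 Main St", 0.92),
    ("Denver Derm - Dr. Smith", "denverderm.com",   "123 Main St", 0.88),
    ("Denver Derm - Dr. Jones", "denverderm.com",   "123 Main St", 0.85),
    ("Mile High Dermatology",   "milehighderm.com", "456 Oak Ave", 0.81),
]

# Group listings sharing domain + address, keep only the top-scoring one
shared = lambda l: (l[1], l[2])
filtered = [max(group, key=lambda l: l[3])
            for _, group in groupby(sorted(listings, key=shared), shared)]
for name, *_ in sorted(filtered, key=lambda l: -l[3]):
    print(name)  # only one listing per practice survives
```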

The impact was significant – a study by Joy Hawkins found that Possum affected 64% of local SERPs. Businesses that relied on spam tactics to dominate local packs saw rankings drop as Google filtered their listings as "duplicates."

On the flip side, businesses physically located just outside city limits saw a dramatic boost in rankings for searches that included the city name. Almost overnight, the local playing field was leveled.

Other key impacts of Possum:

  • Addresses and affiliation became more important ranking factors
  • The physical location of the searcher influenced results more heavily
  • Keyword variations (e.g. plural vs. singular) produced different results

For local marketers, Possum made having a consistent, non-spammy citation profile more important than ever. It also increased the emphasis on earning positive Google reviews and optimizing Google My Business listings.

Some post-Possum optimization tips:

  • Clean up your citations to ensure NAP (name, address, phone) consistency across the web (see the sketch after this list)
  • Add location-specific content and keywords to location pages
  • Earn reviews from customers in your target markets
  • Optimize GMB listings with categories, photos, hours, etc.
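
On the first point, a simple script can catch NAP inconsistencies before they undermine your citation profile. A minimal sketch, assuming you have already collected citations from a few directories; the data, normalization rules, and directory names are illustrative:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a citation so trivial formatting differences
    (St. vs Street, phone punctuation) don't register as conflicts."""
    digits = re.sub(r"\D", "", phone)[-10:]           # keep last 10 digits
    addr = re.sub(r"\bstreet\b", "st", address.lower())
    addr = re.sub(r"[^\w ]", "", addr)
    return (name.lower().strip(), addr.strip(), digits)

# Hypothetical citations pulled from different directories
citations = {
    "google":      ("Acme Plumbing",     "123 Main Street", "(303) 555-0142"),
    "yelp":        ("Acme Plumbing",     "123 Main St.",    "303-555-0142"),
    "yellowpages": ("Acme Plumbing Inc", "123 Main St",     "3035550142"),
}

baseline = normalize_nap(*citations["google"])
for source, nap in citations.items():
    if normalize_nap(*nap) != baseline:
        print(f"Inconsistent NAP on {source}: {nap}")
```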

Possum was a much-needed cleanup of local results that should benefit both users and businesses playing by the rules. The businesses that win will be those that focus on providing real value and earning legitimate local signals.

Looking Ahead: What's Next for Google's Algorithm?

2016 brought significant changes to Google's core algorithm, as well as more targeted updates in areas like local and mobile search. While the details varied, the overarching theme was consistent: Google is getting better and better at identifying and rewarding the highest quality, most relevant results.

Sites with thin content, unnatural links, and poor user experience are finding it harder to "game" the algorithm. Meanwhile, sites with strong E-A-T signals, mobile optimization, and clean link profiles are winning out.

Google isn't slowing down in its quest to deliver the best possible search experience. Future algorithm updates will likely continue to prioritize factors like:

  • Mobile-first indexing and design
  • Secure browsing (HTTPS everywhere)
  • Page speed and performance
  • Voice search optimization
  • Machine learning and AI

As always, the best way to future-proof your site is to focus on providing real value to users. Create comprehensive, well-researched content that demonstrates your expertise. Design fast, intuitive mobile experiences. Earn links and mentions naturally by being the best answer for your target audience.

Google's ultimate goal is to connect searchers with the best information to answer their query. By making that your goal as well, you'll be aligned with Google's mission and positioned to succeed in any future algorithm updates.