Brief History of Google’s Algorithm Updates


Google is the most used search engine in the world. People use Google Search every day to find answers to their queries. The search engine has its own algorithm that serves results against keywords or key phrases, and it satisfies internet users with a remarkably high success rate. Google has taken several steps to make search results relevant and accurate, so that users keep coming back to the platform.

Over the past decade, Google's algorithm has been updated repeatedly to optimize search results and user experience. Let us dive into these updates to understand what Google is actually looking for in content.

Panda Update February 24, 2011

Google’s Panda update emerged to deal with sites that were intentionally created to rank in the search engines. It mostly focused on on-page factors. In other words, it determined whether a website actually offered useful information about the search query visitors used.

Most sites that provided affiliate backlinks or other spam backlinks were penalized. Sites with very poor or short content were affected by the Panda update as well. The algorithm was intended to battle the following spam SEO techniques:

  • Duplicate content
  • Plagiarized or thin content
  • User-generated spam
  • Keyword stuffing

This update was incorporated into Google's core algorithm in 2016 as a permanent measure against spam sites.

Penguin Update April 24, 2012

The update was intended to check whether backlinks to a site were genuine, or whether they had been bought to manipulate the search engines. In the past, lots of sites paid for links as a shortcut to boost their rankings. Google's Penguin update was meant to discourage buying, exchanging, or artificially generating links. If it found artificial links, Google assigned a negative score to the site concerned, rather than the positive link value it would have previously received. The Penguin update was refreshed several times after its initial release, and Google added it to the core algorithm in 2016.

Below are the main targets of the Penguin algorithm:

  • Spammy or irrelevant links
  • Links with over-optimized anchor text

Penguin revolutionized link building: acquiring low-effort, paid backlinks no longer pays off. Instead, you have to work hard on a sound link building strategy that earns relevant, authentic links from valued sources.

Hummingbird Update August 22, 2013

The impact of the Hummingbird update wasn't immediately clear, as it wasn't directly designed to punish spam practices. In the end, it mostly reinforced the view that SEO copy should be readable, use natural language, and shouldn't be over-optimized for the same few keywords, but should use synonyms instead.

Hummingbird allows a page to rank for a search query even if it doesn't contain the exact words the searcher entered. This is achieved with the help of natural language processing that relies on latent semantic indexing, co-occurring terms, and synonyms.
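
To get a feel for the latent semantic indexing idea mentioned above, here is a minimal Python sketch using scikit-learn. It is purely illustrative and not Google's actual implementation: the corpus, query, and the choice of two "topic" dimensions are all made up for the example. Because terms that co-occur (cheap/affordable, flights/tickets) land close together in the reduced semantic space, the query matches the flight documents even without full keyword overlap.

    # A minimal latent semantic indexing (LSI) sketch -- illustrative only.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "affordable airfare and cheap flights to Europe",
        "cheap flights and low-cost plane tickets",
        "gourmet recipes for Italian pasta dinners",
        "easy pasta recipes and home cooking",
    ]
    query = ["affordable plane tickets"]

    # Build TF-IDF vectors, then project them into a small "semantic" space.
    tfidf = TfidfVectorizer()
    doc_vecs = tfidf.fit_transform(docs)
    svd = TruncatedSVD(n_components=2, random_state=0)
    doc_topics = svd.fit_transform(doc_vecs)
    query_topics = svd.transform(tfidf.transform(query))

    # The flight documents score highest despite limited exact-word overlap.
    for doc, score in zip(docs, cosine_similarity(query_topics, doc_topics)[0]):
        print(f"{score:.2f}  {doc}")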

The main goals of the Hummingbird update were:

  • Penalizing keyword stuffing
  • Ranking higher-quality content

Pigeon Update July 24, 2014

The Pigeon update affected both the results pages and Google Maps. It led to more accurate localization, giving preference to results near the user's location. It was also designed to make local results more relevant and of higher quality, taking organic ranking factors into account.

Pigeon affected searches in which the user's location plays the key part. The update created closer ties between the local algorithm and the core algorithm.

The update was intended to address:

  • Poor on- and off-page SEO
  • Location-based searches

HTTPS/SSL Update 2014

To highlight the importance of security, Google decided to give a ranking boost to sites that properly implemented HTTPS to secure the connection between website and user. At the time, HTTPS was presented as a lightweight ranking signal, but Google had already hinted at the possibility of making encryption more important once webmasters had had time to implement it.
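
One small part of "properly implementing" HTTPS is making sure plain-HTTP requests redirect to the secure version. A quick hypothetical check in Python (the domain below is just a placeholder, and the helper name is made up for this sketch):

    # Quick sanity check that a site redirects HTTP traffic to HTTPS.
    import requests

    def redirects_to_https(domain: str) -> bool:
        # Follow redirects from the plain-HTTP URL and check where we end up.
        resp = requests.get(f"http://{domain}", allow_redirects=True, timeout=10)
        return resp.url.startswith("https://")

    # Expect True for a site with an HTTP-to-HTTPS redirect in place.
    print(redirects_to_https("example.com"))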

Mobile Update April 21, 2015

Google’s Mobile Update (aka Mobilegeddon) ensures that mobile-friendly pages rank at the top of mobile search, while pages not optimized for mobile are filtered out of the SERPs or heavily down-ranked.

The Mobile Update gave mobile-friendly sites a ranking benefit in Google's mobile search results. Despite its theatrical nickname, the mobile update didn't immediately upend most people's rankings. Nevertheless, it was an important shift that signaled the ever-increasing importance of mobile.

The update was intended to down-rank pages that lacked a mobile version or offered poor mobile usability.

RankBrain Update October 26, 2015

RankBrain was an advanced Google algorithm that used machine learning to handle queries. It could make guesses about keywords it didn't know, find words with analogous meanings, and then offer appropriate results. The RankBrain algorithm analyzed past searches to determine the best result, improving over time.

The algorithm was specifically designed to tackle unclear user queries, poor user experience (UX), and shallow content.
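
RankBrain's internals are proprietary, but the general idea of finding "words with analogous meanings" can be demonstrated with pre-trained word embeddings. A small sketch using gensim (the model name is one of gensim's stock downloads, chosen here only for illustration):

    # Finding words with similar meanings via pre-trained word embeddings.
    # Illustrative only -- this is not how RankBrain actually works internally.
    import gensim.downloader as api

    # Downloads a small pre-trained GloVe model on first use.
    vectors = api.load("glove-wiki-gigaword-50")

    # Nearest neighbours in the embedding space behave like automatic synonyms.
    for word, score in vectors.most_similar("cheap", topn=5):
        print(f"{word}: {score:.2f}")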

Possum Update September 1, 2016

The Possum update ensured that local results vary more depending on the user's location: the closer you are to a business's location, the more likely you are to see it among local results. After the update, local results became more diverse, depending more on the searcher's physical position and the phrasing of the query. Some businesses that were not doing well in organic search found it easier to rank locally. Possum also gave a boost to businesses located just outside the physical city area.

The update was developed to filter out competing listings in the target location.

Fred Update March 8, 2017

This update appeared in order to locate and penalize poor content. Fred targets sites that violate Google's webmaster guidelines. The majority of affected sites were blogs with low-quality posts created mostly for the purpose of generating ad revenue.

BERT Update 2019

This update completely revolutionized website rankings when it launched, affecting roughly one in ten search queries. Google itself considered it the biggest update of the decade.

The update was built on cutting-edge computer science techniques: natural language processing and machine and deep learning. BERT stands for Bidirectional Encoder Representations from Transformers.

BERT can identify the full context of a word by analyzing the words that come before and after it. In other words, it uses the context and relations of all the words in a phrase, rather than processing them one by one in order. This means a big improvement in understanding a search query and the intent behind it.
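
A hands-on way to see this bidirectional behavior is a publicly available BERT model via Hugging Face's transformers library (an assumption about tooling on our part; the model and sentences below are just examples). The words on both sides of the masked token steer the prediction:

    # Demonstrating BERT's use of context on both sides of a word.
    # Requires: pip install transformers torch
    from transformers import pipeline

    fill = pipeline("fill-mask", model="bert-base-uncased")

    # The surrounding words disambiguate what [MASK] should be;
    # both sentences likely yield a sense of "bank", chosen from context.
    print(fill("She deposited her money at the [MASK].")[0]["token_str"])
    print(fill("They sat on the grassy [MASK] of the river.")[0]["token_str"])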

Analysis and Future Predictions

Judging by the pattern of Google's updates, it is clear that Google rewards quality content and a constant focus on keeping content fresh. Any spammy shortcut can do huge damage to a site that manipulates the algorithms. Future rankings will be determined by how well white-hat SEO techniques are implemented on a blog or site.
