In order to understand what a Google algorithm is, let’s first see what algorithms are in general. The standard definition is: a set of rules for solving a problem in a finite number of steps.
Google’s algorithms are a complex system used to retrieve data from its search index and instantly deliver the best possible results for a query.
The order of websites that results from a keyword query in Google is not a coincidence. It depends on a series of algorithms put in place by the firm. These are what determine the positioning of websites and blogs on the results page.
The process can be compared to a mathematical treatment applied to web pages: formulas created by Google’s teams that identify which sites hold information that is both relevant and of good quality.
These formulas determine whether or not a page can be easily found by Internet users. Updates are made regularly as the behavior of Internet users on the web changes, resulting in new algorithms being added or existing ones revised. Query results can then change significantly; this is known as the “Google Dance” effect.
Google does not use a single algorithm but several. Their role is to rank websites according to the quality of their content, but also their structure and display speed. Regarding content, they check that a website does not contain text copied from other sources (duplicate content). Likewise, a site or blog must be optimized for mobile devices (smartphones, tablets) and display quickly.
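To give a concrete, deliberately simplified picture of duplicate-content detection, one common textbook technique is shingling: two texts are broken into overlapping word sequences and their overlap is measured. The function names below are illustrative; Google’s actual detection methods are not public.

```python
def shingles(text, k=3):
    """Break text into overlapping k-word sequences ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity between the shingle sets of two texts.

    Returns a value between 0.0 (no shared shingles) and 1.0 (identical).
    """
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A similarity close to 1.0 flags near-identical texts; real systems apply far more robust variants of this idea (such as hashing the shingles) to work at web scale.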
A Google Update occurs when Google makes significant and extensive changes to its algorithm and search engine systems. These updates aim to improve the search experience for users, providing more relevant, useful and reliable content.
Google makes minor changes throughout the year to its various algorithms.
In general, core updates occur several times a year and are subject to confirmation by Google.
The Panda algorithm to penalize low-quality content
Deployment date: February 2011
Google introduced the Panda algorithm and its penalties in an effort to combat websites created for spamming purposes. It is a search filter that penalizes low-quality sites and blogs. The very first deployment of Panda had a major impact, affecting about 12% of SERPs in the United States alone.
Since then, it has been updated regularly and without warning, allowing previously penalized websites to regain a better position once they have made the necessary changes.
The Penguin algorithm to fight against abusive netlinking
Deployment date: April 2012
This algorithm is widely discussed on the internet, and particularly feared by Black Hat webmasters. Penguin’s main purpose is to penalize the natural referencing of sites that do not respect Google’s recommendations and guidelines on buying links, creating links, or using link networks. All contentious links must be removed, a task that requires hard work sometimes stretching over several months or even years. Webmasters fear it because getting out of a penalty involves significant and uncertain work.
On the other hand, Penguin has since been added to Google’s core algorithm, so changes now take effect immediately, making SEO recovery faster.
Hummingbird to better understand user searches
Deployment date: September 2013
Here is one of the most important algorithms Google has created, because of its impact on how searches are carried out. By integrating the Hummingbird algorithm, Google can better understand searches as a whole, considering all the terms of a query together.
The result is better-quality searches, and better answers. Previously, it was difficult to submit a precise search; today, a query can take the form of a complete sentence or a question, and Google can offer relevant results.
With Rankbrain, Google is moving towards the deployment of artificial intelligence
Deployment date: October 2015
Rankbrain has the particularity of being able to understand that several queries formulated in different ways can carry the same meaning.
The answers given to two searches written differently are therefore ultimately very similar. It complements and extends Hummingbird’s capabilities. Google now considers Rankbrain one of the top three most important SEO factors, alongside links and content quality.
Rankbrain works in a rather particular way, mostly offline. Over time, Google feeds it with past search data, which allows the tool to make predictions. Once these predictions have been duly tested, they are integrated into the live search engine.
Mobile Friendly to favor mobile-compatible sites
Deployment date: April 2015
This is another algorithm that has had a huge impact on web page SEO. With the democratization of smartphones and tablets, Google deployed Mobile Friendly.
It checks that a website is also compatible with mobile devices and takes this into account in natural search rankings. These criteria were strengthened with the introduction of the mobile-first index.
Since 2010, Google has launched various algorithms, each with a generally specific role. Here are the most important ones, along with their characteristics.
PageRank, the original popularity score
PageRank is how Google decides which websites to show first when you search for something. It works like a popularity score for websites: pages that receive links from many important pages earn higher scores, and higher scores mean more important and trustworthy websites. Google keeps the exact formula used to calculate PageRank a secret, but it is constantly refined with new ways of analyzing websites.
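The core idea behind PageRank can be sketched as a simple iterative computation over a link graph: each page repeatedly passes a share of its score to the pages it links to. The graph, function name, and parameters below are purely illustrative; Google’s production formula is secret and far more sophisticated.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank-style scores for a small link graph.

    links: dict mapping each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    scores = {page: 1.0 / n for page in pages}  # start with equal scores
    for _ in range(iterations):
        new = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page passes its score, split evenly, to its outlinks.
                share = scores[page] / len(outlinks)
                for target in outlinks:
                    new[target] += damping * share
            else:
                # Dangling page: spread its score evenly over all pages.
                for target in pages:
                    new[target] += damping * scores[page] / n
        scores = new
    return scores

# Hypothetical three-page site: "home" receives the most links.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
```

Here “home”, linked from both other pages, ends up with the highest score, matching the intuition that the most-linked page is the most “popular”.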
Caffeine to index pages faster
Deployment date: June 2010
Caffeine allowed Google to index pages almost instantly. It was a major overhaul of the page indexing system; since its rollout, results can include articles and content that are 50% fresher than before.
Top Heavy to penalize ad-heavy pages
Deployment date: January 2012
Google also decided to penalize websites with an abnormal load of advertisements. Top Heavy identifies the sites concerned and penalizes them at the SEO level. Its deployment, however, had an impact on only about 1% of sites.
Pirate to fight illegal downloads
Deployment date: August 2012
This search filter removes web pages offering illegal downloads, whether of music, films, or series, thus helping to fight copyright infringement. It is updated regularly.
EMD to penalize exact match domains
Deployment date: September 2012
Many websites choose a keyword-stuffed domain name of rather poor quality, hoping to increase their chance of ranking well in Google for the corresponding searches.
Some webmasters have thus not hesitated to create optimized domain names to improve their natural referencing, even when the domain name and the content of the site do not match. In response to this phenomenon, Google deployed the EMD (Exact Match Domain) algorithm to penalize the offending sites.
Payday Loan to clean up spammy results
Deployment date: June 2013
The role of the Payday Loan algorithm is to remove spammy search results, improving the overall quality of result pages. Spam here includes, for example, adult content, counterfeit goods, loan offers, and online gambling sites.
Pigeon to favor local results
Deployment date: July 2014
Thanks to Pigeon, searches favor local results, so users can find solutions close to home. It applies both to the search engine and to Google Maps. The impact is particularly significant for local businesses (retail, restaurants, service providers, etc.).
HTTPS to encourage secure browsing
Deployment date: August 2014
With this update, Google promotes a secure browsing experience for Internet users. It is based on the HTTPS security protocol, which became widespread from 2017; since then, non-compliant websites have been flagged as “not secure” in web browsers.
Phantom, the update renamed “Quality”
Deployment date: May 2015
Its particularity lies in the fact that Google deployed it in secret. Many webmasters, having noticed major changes in the SERPs, turned to Google’s quality teams, who first denied and then, after a few weeks, confirmed that an update had been deployed. Dubbed Phantom by the web community, it was renamed “Quality” by Google. Like Panda, its main mission is to improve the quality of website publications and to fight spam by penalizing sites with mediocre content.
An algorithm to penalize satellite (doorway) pages
Deployment: May 2015
Some sites use so-called satellite pages with low-quality content. With this algorithm, Google identifies the sites concerned and penalizes them, detecting duplicate or low-value texts.
Intrusive Interstitials to penalize invasive pop-ups
Deployment: January 2017
Google pays even more attention to site quality with this algorithm, which detects interstitial banners and pop-ups. The initiative improves the reading experience of web content, acting page by page.
Google’s Fred Algorithm
Deployment: March 2017
If it does exist, its precise purpose remains a real mystery. It is thought to be a sophisticated combination of Panda and Penguin, possibly fighting low-quality sites with unoptimized content or sites created for the sole purpose of monetization.
Once a query has been launched in the search engine, the sites displayed in the different blocks are the result of numerous criteria defined by the search engine. Here are the eight basic levers that give a site the best chance of obtaining good SEO:
The content of the site
This is a priority: the Internet user must be able to find the precise answer to their query. It is therefore essential that the site’s content be high quality and relevant, enriched with multimedia content.
Placement of editorial links
Links must be judiciously placed in the content of the pages. Integrate a reasonable number of them, and make sure the links make sense both for Internet users and for Google. Avoid internal or external linking with no added value.
A good user experience
If the user spends time on the website, there is a good chance it meets their needs. Conversely, with a high bounce rate, the webmaster risks seeing the site’s positioning deteriorate: a quick visit to a site can mean that the user was not satisfied with the content.
A wise choice of keywords
Using the right keywords improves the click-through rate as well as the positioning and relevance of traffic. Hence the importance of choosing the right keywords and working on semantics, using all the terms that best frame those keywords and themes.
A quality domain name
Google ensures that the domain name is consistent with the content of the site. It is an essential quality criterion, which can give a better positioning to the pages.
A responsive site adapted to mobile devices
Web pages must load quickly, and the site must be optimized for tablets and smartphones. For Google, these are the elements that guarantee a good user experience.
A site accessible to Google’s crawl
The site must be accessible via HTML links and contain text, so that Google can reach it (the crawl). It must also be fast and well structured to allow Google to discover its most important pages.
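To illustrate why HTML links matter for the crawl: a crawler discovers new pages by parsing the links on each page it visits. The sketch below uses Python’s standard html.parser module; the page markup is a made-up example, not real crawler code.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags, as a crawler would."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page: two internal links a crawler could follow next.
page = '<html><body><a href="/about">About</a> <a href="/blog">Blog</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
```

Pages reachable only through JavaScript widgets or forms, with no plain HTML link pointing to them, risk never being discovered this way.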
Keep in mind that Google rankings can be heavily personalized: depending on location, time, type of device, and search history, the sites displayed may vary.
Behind a search result on the Google engine, many factors and algorithms are continuously at work. It is important to keep this in mind when creating a blog or website. While the list of algorithms above is not exhaustive, it reveals the main reasons that lead Google to rank certain websites well or to penalize them.
Understanding the factors the search engine looks for is of major importance in transforming Google into a tool for visibility and the creation of targeted traffic.