The evolution of Google’s Search algorithm – from 2003 to 2014

We have seen that some of the factors that influence your website’s ranking on search engines are public. Google provides information that helps webmasters optimize their websites, and it also announces some of the changes and developments it constantly makes to its algorithm.

An understanding of Google’s algorithm forms the backbone of any online marketing activity. That is why it is important to see how the algorithm has evolved in recent years.

For the period from 2003 to 2007, I will highlight the main changes to the algorithm, such as the penalization of unnatural techniques for optimizing web pages.

In 2003, the Google updates known as “Dance” and “Florida” hit websites that used keyword stuffing, the unnatural repetition of words inside a web page.

2004 brought the new “Austin” update, dedicated to penalizing Black Hat SEO techniques such as hiding keywords in a web page by using a minimal font size, hiding them by making them the same color as the page background, or repeating them many times in the text without disturbing the visitor’s reading flow.

The “NoFollow” update in 2005, as the name suggests, introduced the rel="nofollow" attribute, which tells the search engine not to follow a link or pass ranking credit to it (for example, a link in a comment on your blog post).
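As an illustration (the HTML snippet and class name below are my own, not from Google’s documentation), a minimal Python sketch using only the standard library can list the links in a page and flag which ones carry the nofollow attribute:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect (href, is_nofollow) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = attrs.get("rel", "") or ""
        # rel can hold several space-separated tokens, e.g. "nofollow noopener"
        self.links.append((attrs.get("href"), "nofollow" in rel.split()))

html = '''<p>See <a href="https://example.com/docs">the docs</a> and
<a href="https://spam.example" rel="nofollow">this comment link</a>.</p>'''

auditor = LinkAuditor()
auditor.feed(html)
for href, nofollow in auditor.links:
    print(href, "-> nofollow" if nofollow else "-> followed")
```

Blog platforms typically add rel="nofollow" to comment links automatically, which is why comment spam stopped passing ranking value after this update.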

A major revolution in Google’s search results in 2005 was the update that introduced Local Maps: depending on the query and the searcher’s location, you get local SEO results.

Also interesting are the algorithmic changes that took place between 2005 and 2006, which were meant to clean up inbound links and penalize purchased and reciprocal links.

Since 2007, in addition to algorithm updates, Google has made a number of changes to the structure of its search on the user side, which also modified the SERPs in a big way. In 2007 came Universal Search, which adds “news”, “images” and “video” options to the search results, and in 2008 the search box gained real-time suggestions, shown while the user is typing the query.

Staying with the algorithms, the first great revolution came with “Caffeine” in 2009, which sped up indexing and made positioning happen in near real time.

Once you can talk about SEO in real time, many new horizons open up, especially for advertising events or contests.

From the end of 2010, factors associated with the use of social networks came into play: the Social Signals of 2010, with which Google started to use feedback on social media to influence rankings, and the famous Google +1 button, which anticipated the arrival of Google Plus and its impact on search engine ranking.

In addition to social networks, Google increasingly refined its search results to reward quality content. For this reason, with the arrival of the “Panda” algorithm in 2011, sites with low-quality content started being penalized.

A new algorithmic update called “Freshness”, again in 2011, rewarded sites that updated their content frequently.

In 2012, the big news was the new “Penguin” algorithmic update, which penalized backlinks of dubious origin and keyword stuffing even more heavily. With the introduction of Google Search Plus, search results started to be determined by the degree of authority and social connections within the Google Plus social network.

This integration is tied hand-in-glove with the future of Google Search, which is increasingly linked to the so-called “semantic web” – made of real content rather than computer tricks. This is demonstrated by the “Knowledge Graph” update, which adds extra information to the results, such as linked images, the author’s date of birth, video links and so on.

In this way, Google wants to integrate and understand the real dynamics that are born and evolve between people in everyday life. It has been relentless in penalizing and excluding content generated by artificial or fictitious sources.

The evolution of these concepts was fully integrated by the Hummingbird algorithm.

In September 2013, Hummingbird brought a new revolution to the Search Engine Optimization industry. Hummingbird was unique in the sense that, while it still used components from previous algorithms (such as the analysis of backlinks from Penguin, and of duplicate or low-quality content from Panda), it was designed as a completely new algorithm that interprets the available data and provides a more precise response to user requests.

I will be covering the impact and operation of this specific algorithm in detail in my subsequent posts. To stay up to date on Google’s continuous algorithm updates, see: http://moz.com/google-algorithm-change

Hummingbird led to the customization of results based on the individual user’s search history. Google no longer performs one global search over its entire database; it extrapolates results depending on the topic and the intentions of the person searching, personalizing the results for each user.

It therefore becomes very important to understand how a person performs a search on Google, and the first key is to understand whether the search is carried out while signed in to a Google account or not.

The Google Account is a unified login system that gives you access to all free Google products (Gmail, Google Hangouts and so on). Web history is private, but Google uses it to customize searches so that the user gets a better response to each query.

Google’s goal is indeed to anticipate user needs and to provide a response based on the user’s search history. It means that if we search Google for an ambiguous word such as “Apple” and then click on a web page about the fruit, the next time Google will place web pages about the fruit, rather than the technology company, in the top positions of “our” SERP!

Google does this based both on searches carried out while signed in to a Google account and on the location from which they are made, detected from the IP address you use to access the web.

An IP address (Internet Protocol address) is a numeric label that uniquely identifies a computer or any other device on a network, and it can also reveal the device’s geographical origin.
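As a side note, Python’s standard ipaddress module can parse and classify such labels (actual geolocation requires a separate lookup database; the public address below is only illustrative):

```python
import ipaddress

# A public (globally routable) IPv4 address -- only addresses like this
# can be mapped to a geographic origin by a geolocation database.
public = ipaddress.ip_address("93.184.216.34")
print(public.version, public.is_global)

# A private address (e.g. inside a home network) carries no public
# geographic information of its own.
private = ipaddress.ip_address("192.168.1.10")
print(private.is_private)
```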

At this point the question might be: how can we verify the true position of our website in the SERPs, if the results depend on the location, the web history and the searches previously carried out?

The solution is very simple: just add the parameter &pws=0 to the URL of the Google results page, and Google will display the results without personalization.
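For instance, a small helper (a sketch using only Python’s standard library; the function name is my own) can append the parameter to any results URL:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def depersonalize(url):
    """Add pws=0 to a Google results URL to disable personalized results."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["pws"] = "0"  # replaces any existing pws value
    return urlunparse(parts._replace(query=urlencode(query)))

print(depersonalize("https://www.google.com/search?q=seo+tools"))
# https://www.google.com/search?q=seo+tools&pws=0
```

Rebuilding the query string this way is safer than naive string concatenation, since it handles URLs that already contain a pws parameter.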

The year 2014 witnessed further updates to Panda, Penguin and Hummingbird. Authorship mark-up disappeared from the search results, and Google penalized websites that used too much ad space. Another notable update was Pirate 2.0, aimed at combating P2P websites that propagate software and digital media piracy.

In all these years, the aim of every algorithm update has been to provide the user with the best possible search results. Google is not perfect either; it has stumbled at times, but it has been smart enough to recover and has become increasingly good at identifying web spam. It is therefore important to remember that unethical online marketing methods should not be followed.
