How to Survive Google Updates
Every year, the search giant updates its search algorithm up to 600 times, mostly with minor changes. However, it also releases regular major algorithmic updates which significantly affect search results and keep the quality of those results improving.
In essence, a search algorithm is like a recipe that Google uses to put web pages in order and decide which ones to rank highest.
Hummingbird
So named because it is ‘precise and fast’, Hummingbird was introduced in August 2013. It was a dramatic algorithmic rewrite, the equivalent of a new engine. It weighs more than 200 ‘ingredients’, including how important the links to a page are judged to be and the quality of the page itself. It also improves voice search, paying closer attention to the exact meaning of each word in a query. In practice, this means content needs to be high-quality and relevant: keywords must be placed naturally rather than stuffed, and legitimate links still lend SEO credibility.
Panda
First launched in February 2011 and named after Navneet Panda, the Google engineer behind the technology, Panda is a change to the search giant’s results-ranking algorithm. It aims to push lower-quality sites, especially so-called content farms, further down the rankings while returning higher-quality sites nearer the top. It affects the ranking of a whole site (or a particular section of it), not just individual pages.
It was launched to combat crammed links, keyword stuffing, duplicated content and blacklisted sites.
A ‘slow rollout’ of Panda 4.2 began in July 2015, the latest in a long series of refreshes since its launch.
Penguin
Google released its Penguin update in April 2012 to catch websites spamming its results, specifically those buying links or obtaining them through link networks designed mainly to boost Google rankings. When a new Penguin update is released, sites that have acted to get rid of poor links may regain their rankings; a website can then be resubmitted for consideration.
Google itself estimates that Penguin affects just over 3% of searches made in English.
Pigeon
Released in July 2014, this update is aimed at providing better, more relevant local search results. It increases the ranking of local listings, changes how they appear in search results, and gives preference to local directory sites. Many businesses with a strong presence in their area have found it helpful.
Payday Loan
Released in June 2013, this algorithm is aimed at cleaning up search results for historically ‘spammy’ queries, such as searches for payday loans, pornography and other heavily spammed terms. It targets websites using spam techniques to boost their rankings for these queries.
Pirate
This filter was introduced in the summer of 2012 to stop websites with multiple copyright-infringement reports from ranking well in Google listings. It is updated regularly, allowing websites that have made the right improvements to escape the penalty. It was followed in February 2017 by a ‘landmark agreement’ between Google and Microsoft’s Bing to reduce ‘the visibility of infringing content’ in UK search results; the search giant said this agreement would not change its Pirate filter.
At Page1, we’re well versed in all these updates and more, and we provide a multidisciplinary service to help you meet your SEO targets.