There’s a New Stranger in Town, and His Name is Fred
If you’ve been paying attention to search engine result pages (SERPs) in the past few weeks, you’ll have noticed that a large number of websites have suffered severe blows to their Google rankings. Perhaps you’ve even been one of the unfortunate webmasters to experience these changes to your own website. You may have lost a lot of organic traffic, or lost rankings for your main key terms. If a single page, or a segment of pages, on your site was impacted around 9 to 19 March, you have most likely been hit by Google’s latest update cycle, known as the Fred algorithmic update.
Well, I can tell you why, and his name is Fred. Or at least that’s what we’re calling it until Google says otherwise.
If you are one of the people who has had their ear firmly planted to the ground, then you already knew this and are just looking for new information on the mysterious update. However, maybe you are one of the many more who haven’t been paying much attention since the introduction of Penguin in 2012. If that’s the case, you might have found your way to our site in a desperate panic as you watched your web traffic plummet. Well, let me just say that this time, the search bar has worked in your favor.
First off, this can all get pretty technical, so I’ll give a quick rundown of what’s happened in the SEO space leading up to Fred’s arrival.
Since the beginning of google.com as a search engine, Google has continually added and developed ways to monitor and improve its search algorithm, in the form of updates. While there have been dozens of updates since the search engine’s inception in 1998, I’m going to focus only on those that directly affect SEO results.
Stanford’s original Google research paper is the go-to guide. Give the full page a read; it’s the best 20 minutes you will spend.
DYK that after 18 years we’re still using PageRank (and 100s of other signals) in ranking?
— Gary Illyes ᕕ( ᐛ )ᕗ (@methode) February 9, 2017
After a while, it became apparent that the complexity and granularity of the web would require similarly complex updates. Before these came along, low-quality or even manipulative websites could surface far too easily in search results. Google’s aim with its search engine has always been to bring high-quality content and websites to its users. Google decided the best way to do this was to rank sites based on many different factors; these factors determine a site’s Google rankings. Eventually, Google developed algorithm updates to make significant changes to how these rankings were defined, and released them periodically. While there were a lot of significant updates to the algorithm, the first major shift came when Google started releasing updates specifically designed to penalise websites that didn’t adhere to its quality guidelines. The first major update of this nature was called Google Panda.
Google’s other updates
When Google names an update, it’s a pretty good indication that it’s going to have significant ramifications. Released in February 2011, Google Panda was the first major update that acted as a penalty within the search algorithm. While quality is not easily defined, that’s where Google started when it came to algorithm enforcement. Panda aimed to lower the rankings of low-quality (or ‘thin’) sites and return better sites to the top. For Panda, this was based less on whether a site was ‘good’ and more on whether its content was relevant to the user’s search query. Panda determined this by focusing on things like plagiarism; pages with little or no content; or content that is difficult to understand without context (adding elaborative pages like an FAQ can help with this).
Panda works by calculating a ratio between a site’s inbound links and its reference queries (search queries for the site’s brand). This ratio is used to create a site-wide modification factor. That factor is then used to create a further modification factor for each page, based on the search query. If the quality of a page fails to meet a certain threshold, the factor is applied and the page is dropped lower in the Google rankings.
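The exact maths behind Panda has never been published, so the mechanism described above can only be illustrated. Here is a minimal sketch in Python, where every function name, formula, and threshold is hypothetical and meant only to show how a ratio-derived, site-wide factor could demote individual low-quality pages:

```python
def site_modification_factor(inbound_links: int, brand_queries: int) -> float:
    """Site-wide factor from the link-to-brand-query ratio (hypothetical formula).

    Many inbound links but few searches for the brand itself suggests
    the links were not earned naturally, producing a small factor.
    """
    ratio = brand_queries / max(inbound_links, 1)
    return min(ratio, 1.0)


def page_rank_adjustment(site_factor: float, page_quality: float,
                         quality_threshold: float = 0.5) -> float:
    """Apply the site-wide demotion only to pages below the quality bar."""
    if page_quality < quality_threshold:
        return site_factor  # page score multiplied down in the rankings
    return 1.0  # quality page: ranking unchanged


# 1,000 inbound links but only 100 brand searches yields a harsh factor
factor = site_modification_factor(inbound_links=1000, brand_queries=100)
thin_page = page_rank_adjustment(factor, page_quality=0.3)   # demoted
good_page = page_rank_adjustment(factor, page_quality=0.9)   # untouched
```

This also illustrates why Panda’s penalty feels site-wide: one factor, computed once for the whole site, touches every page that falls below the quality threshold.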
Because of the nature of its enforcement, Panda recognises the problem as the overall approach of the content creators, so it penalises the whole site. As a result, after a website has been flagged by Panda, to get out of its bear hug of death, the webmasters must rethink their approach to the site as a whole.
For the first two years, Panda updates were rolled out about once a month, more than 24 in total. In 2013, Google stated that future updates would be integrated into the main algorithm, making them continuous and less noticeable.
With the introduction of Panda, Google saw a significant improvement in its rankings, and higher-ranked sites had much higher-quality content. However, people soon found other ways to earn the ire of Google without necessarily coming into conflict with the factors that Panda enforces.
Google’s Fred algorithmic update
After figuring out how to appease Panda, webmasters would soon work out ways to get unnaturally higher in the rankings by targeting other areas of the search algorithm, specifically optimisation. This was done by incorporating over-optimisation techniques, also called black-hat techniques. Black-hat techniques include methods that game the system, like duplicate content, keyword stuffing, cloaking, and doorway pages.
Such practices resulted in web pages that, while not thin enough in content to be affected by Panda, created other problems that interfered with a consistent and useful browsing experience. These could include having ads plastered all over the page, or the excessive use of keywords to link to other sites and build revenue. In short, the use of black-hat techniques produced the web-page equivalent of spam, called web spam.
In response, Google implemented a spam blocker. This was the second major penalty update to the search algorithm, which they named Penguin. Penguin looks at the number of black-hat techniques present on a web page and flags the page if the abuse is too extensive. Unlike Panda, which sees the issue as related to content, Penguin sees the problem as the choices of the site runners. As a result, a Penguin flag affects only that particular page; unless the issues are rampant across the entire site, the rest should be okay.
Google’s John Mueller said that a Penguin penalty is based on the ratio of good to bad techniques. If there are still bad techniques present, but there are far more good than bad, the site can still get past the effects of Penguin.
Penguin has had 7 updates since its launch in April 2012, with the latest build being Penguin 4.0.
What do we know so far?
With Panda and Penguin now covering pretty much all the bases when it comes to dodgy SEO, Google’s algorithm seemed to be in a pretty good spot. For a while, only occasional small updates to the two algorithm enforcers were rolled out (aside from a scary ghost at one point) and everyone seemed to be following the rules. Google’s internet was a veritable garden of prosperity.
Fred search update impact
That brings us to now. On March 8th, 2017, webmasters experienced catastrophic fluctuations in their Google rankings; those affected were dubbed ‘Losers’. Notably, there were a lot of reasonably high-profile Losers, like thefactsite.com, entrancegeek.com, and lookingformaps.com. This led people to believe that a major update had been rolled out. Not much is known about the update, nor has Google officially confirmed that one was even released. However, they also haven’t denied it. With such drastic effects on the rankings, webmasters were panicking, and Google decided to at least give the update a name.
We were lucky to get even that. There are more than 600 algorithm updates every year, and Google announces only a tiny fraction of them. The update got its name only because Barry Schwartz, a dedicated industry reporter of over 13 years, reached out to Google. In response, Google’s Gary Illyes tweeted back: “sure! From now on every update, unless otherwise stated, shall be called Fred”. What was meant to be a joke has now stuck as the update’s ‘official’ name. Clearly, this isn’t something that was supposed to be as big a deal as it has become.
Fred appears to be a bit like a phenomenon that was ominously referred to as ‘Google Phantom’, the aforementioned ghost. Around May 9th, there were a lot of reports of significant changes to the algorithm, with many sites reporting major traffic loss. However, the change was never detailed in any ‘official’ capacity. What was likely meant to be an unremarkable update to Panda (the effects of Phantom were very similar to the content-focused targets of that penalty) became a moment that the internet decided to name, due to its significant impact. The name ‘Phantom’ was widely accepted, owing to the apparitional nature of the update.
Like Phantom, Fred has come out of seemingly nowhere. Apparently, Google was never fond of the ‘Phantom’ name. Perhaps this time, they have decided to come out and name the sudden update as a way to control the message better?
The interesting thing about Fred is that it shares characteristics with Penguin, much as Phantom does with Panda. The losers of Fred’s wrath have been websites containing a lot of web-spammy pages that exist for ad revenue only. These sites carry lots of ads, especially banner ads, often run through the controversial AdSense programme. The difference is that Fred has had a much more severe effect than Penguin, with some websites in the top 10 of the rankings being pushed all the way out of the top 100.
In the past, there have been a couple of ways to get out of the penalties that Panda and Penguin enforce.
One method is exclusive to Penguin, presumably due to its single-page targeting rather than a site-wide effect. Google has maintained a feedback form for Penguin, which allows people to do two things. One lets users report web spam that they feel should have been flagged but hasn’t, and still appears high in the rankings. The other allows site runners to submit their website for reconsideration through Google Webmaster Tools. The site can then be released from Penguin’s icy flipper, assuming enough black-hat techniques have been addressed.
Recovery from Fred algorithm update
The only other way to get a website back to the high rankings is simple: webmasters and content creators must address the issues that got them penalized in the first place. When a new update comes out, it overrides the previous one. This means that, as long as everything adheres to good SEO practice when a new update is rolled out, a previously penalized website is in the clear. Sometimes the effects of other updates persist; there were some unfortunate sites hit with what users called Phantenguin (Phantom + Penguin), though that was likely due to the unusual circumstances of the phenomenon.
Here are some questions to answer if you have been hit.
- Does your site provide thin content?
- Is it heavy on ads?
- Does it use intrusive popups that make it unusable on mobile devices?
- Are there too many dynamic pages that add no value? (For example, eCommerce colour and size pages, or WordPress category pages without unique content.)
- Is your site a part of a larger link network?
- Are your robots.txt and .htaccess still allowing crawling? (If robots.txt is not accessible, Google will abandon the crawl.)
- Is your site slow? (User behavior and interaction affect your overall positions.)
- Have you moved your site recently or updated the design?
- Have you done a basic site audit recently?
- Is your site still fully indexed? (Go to Google and type site:yoursite.com in the search bar.)
- Have you submitted your site to Webmaster Tools? (Do you have Bing Webmaster Tools? Check the issues there too.)
- Does Webmaster Tools show any specific errors, or an increase in errors, since the update date?
- Is there depth to your content for head queries with large key search volume?
- Is your site heavy on JavaScript, or does it rely on it too much for content rendering?
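Two of the checks above take only minutes. For the robots.txt question, a file that still allows crawling looks something like this (the sitemap URL is a placeholder):

```text
# robots.txt: an empty Disallow line grants all crawlers full access
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

For the indexing question, searching site:yoursite.com on Google lists the pages Google still has indexed; a sudden drop in that count after the update date is a red flag.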
Quick update on 3rd May: a user has suggested some additional points for recovery.
- Disavow your bad link profile. (Can’t emphasise this enough.)
- Contact webmasters that had these links and ask them to delete the links or add a no-follow tag.
- Observe your access logs to see whether Google is abandoning deeper pages, or hasn’t visited them in over 3 months. Improve access to these pages and improve internal linking.
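The disavow step uses Google’s Disavow Links tool in Search Console: you upload a plain-text file listing the links or domains Google should ignore when assessing your link profile. A minimal example (the URLs and domains are placeholders):

```text
# Lines beginning with # are comments and are ignored.
# Disavow a single spammy page:
http://spam.example.com/paid-links.html
# Disavow every link from an entire domain:
domain:spammy-directory.example.net
```

Contacting the linking webmasters first, as suggested above, is still worthwhile; disavowing is the fallback for links you can’t get removed.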
Now Fred is in charge, and Google hasn’t released any official word at all, much less information about a recovery method. It would seem the best bet is to do what has worked in the past: rethinking and reworking all the SEO elements across your site. Unfortunately, good optimization is a complicated and involved process. If it were easy, no one would ever be penalized.