
Apr 03

Google’s Fred Algo Update – Causes, Mitigation & Recovery For Webmasters.

There’s a New Stranger in Town, and His Name is Fred

If you’ve been paying attention to search engine result pages (SERPs) in the past few weeks, you’ll have noticed that a large number of websites have suffered severe blows to their Google rankings. Perhaps you’ve even been one of the unfortunate webmasters to experience these changes on your own site: you may have lost a lot of organic traffic or lost rankings for your main key terms. If even a single page or section of your site was impacted in or around 9 to 19 March, you have been hit by Google’s update cycle, most prominently the Fred algorithmic update.

Well, I can tell you why, and his name is Fred. Or at least that’s what we’re calling it until Google says otherwise.

If you are one of the people who has had their ear firmly planted to the ground, then you already knew this and are just looking for any new information on the mysterious new update. However, maybe you are one of the many more who haven’t been paying much attention since the introduction of Penguin in 2012. If this is the case, then you might have found your way to our site in a desperate panic as you watched your web traffic plummet. Well, let me just say that this time, the search bar has worked in your favour.

First off, this can all get pretty technical, so I’ll give a quick rundown of what’s happened in the SEO space leading up to Fred’s arrival.

Since the beginning of google.com as a search engine, Google has continually added and developed ways to monitor and improve its search algorithm; these changes are called updates. While there have been dozens of updates since the search engine’s inception, I’m going to focus only on those that directly affect SEO results.

Stanford’s original Google page is the go-to guide. Give the full page a read; it’s the best 20 minutes you will spend.

 

After a while, it became apparent that the complexity and granularity of web browsing would require similarly complex updates. Before these came along, websites of a low-quality or even manipulative nature could rank far too easily in an internet search. Google’s aim with its search engine has always been to bring high-quality content and websites to its users. Google figured the best way to do this was to rank sites based on many different factors; these are the Google rankings. Eventually, Google developed algorithm updates to make significant changes to how these rankings were defined and released them periodically. While there were many significant updates to the algorithm, the first major change came when Google started to release updates specifically designed to penalise websites that didn’t adhere to algorithm-friendly practices. The first major update of this nature was called Google Panda.

Google’s other updates

When Google names an update, it’s a pretty good indication that it’s going to have significant ramifications. Released in February 2011, Google Panda was the first major update that acted as a penalty within the search algorithm. While the concept of quality is not easily defined, that is where Google started when it came to algorithm enforcement. Panda aimed to lower the rankings of low-quality (or ‘thin’) sites and return better sites to the top. For Panda, this was based less on whether a site was ‘good’ and more on whether the content was relevant to the user’s search query. Panda determined this by focusing on things like plagiarism, pages with little or no content, and content that is difficult to understand without context (adding elaborative pages like an FAQ can help with this).

Panda works by creating a ratio between a site’s inbound links and its reference queries (search queries for the site’s brand). This ratio is used to create a site-wide modification factor, which in turn feeds a per-page modification factor based on the search query. If the quality of a page fails to meet a certain threshold, the factor is applied and the page drops lower in the Google rankings.
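
To make that two-step mechanism more concrete, here is a minimal, purely illustrative Python sketch. The function names, the formula and the 0.5 quality threshold are my own assumptions for demonstration; Google has never published the real implementation.

    # A purely illustrative sketch of the two-step Panda-style modification
    # described above. The function names, the formula and the 0.5 quality
    # threshold are assumptions for demonstration only.

    def site_modification_factor(inbound_links: int, reference_queries: int) -> float:
        """Ratio of brand searches to inbound links: lots of links but few
        people searching for the brand suggests artificial link building."""
        if inbound_links == 0:
            return 1.0
        return min(1.0, reference_queries / inbound_links)

    def adjusted_page_score(base_score: float, page_quality: float,
                            site_factor: float, quality_threshold: float = 0.5) -> float:
        """Apply the site-wide factor only to pages whose quality falls below
        the threshold; better pages keep their original score."""
        if page_quality >= quality_threshold:
            return base_score
        return base_score * site_factor

    # Example: 10,000 inbound links but only 500 brand searches.
    factor = site_modification_factor(10_000, 500)                # 0.05
    print(adjusted_page_score(base_score=1.0, page_quality=0.3,
                              site_factor=factor))                # 0.05

The point of the sketch is the shape of the mechanism, not the numbers: one site-wide signal quietly scales down every weak page on the domain.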

 

Because of the nature of its enforcement, Panda recognises the problem as the overall approach of the content creators, so it penalises the whole site. As a result, after a website has been flagged by Panda, to get out of its bear hug of death, the webmasters must rethink their approach to the site as a whole.

For the first two years, Panda updates were rolled out roughly once a month, amounting to more than 24 updates. In 2013, Google stated that future updates would be integrated into the algorithm, making them continuous and less noticeable.

With the introduction of Panda, Google saw a significant improvement in the quality of its rankings, and higher-ranked sites had much higher-quality content. However, people soon found other ways to earn the ire of Google without necessarily coming into conflict with the factors that Panda enforces.

Google’s Fred algorithmic update

After figuring out how to appease Panda, webmasters would soon work out ways to get unnaturally higher in the rankings by targeting other areas of the search algorithm, specifically optimisation. This was done by incorporating over-optimisation techniques, also called black-hat techniques. Black-hat techniques include methods that game the system, like duplicate content, keyword stuffing, cloaking, and doorway pages.
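
As a toy illustration of one of those techniques, keyword stuffing, here is a short Python sketch that flags a page whose most frequent term dominates the copy. The tokenisation and the 10% threshold are my own assumptions for demonstration, not a published Google rule; a real analysis would also strip stop words.

    # Toy keyword-stuffing check: flag a page where one term dominates the copy.
    # The tokenisation and the 10% threshold are illustrative assumptions.
    import re
    from collections import Counter

    def keyword_density(text: str) -> tuple[str, float]:
        """Return the most frequent word and its share of all words on the page."""
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return "", 0.0
        word, count = Counter(words).most_common(1)[0]
        return word, count / len(words)

    page_copy = "cheap shoes cheap shoes buy cheap shoes best cheap shoes online"
    term, density = keyword_density(page_copy)
    if density > 0.10:
        print(f"Possible keyword stuffing: '{term}' makes up {density:.0%} of the text")

Real ranking systems are far more sophisticated, but the sketch shows why mechanically repeated keywords are so easy to detect.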

Such practices resulted in web pages that, while not necessarily thin enough in content to be affected by Panda, created other problems that interfered with a consistent and useful browsing experience. These could include having ads plastered all over the page, or the excessive use of keywords to link to individual sites and build revenue. In a word, the use of black-hat techniques produced the web-page equivalent of spam, called web spam.

In response, Google implemented a spam blocker. This change was the second major search-algorithm penalty update, which they named Penguin. Penguin looks at the number of black-hat techniques present on a webpage and flags the page if they are excessive. Unlike Panda, which sees the issue as related to content, Penguin sees the problem as the choices of the people running the site. As a result, a flag from Penguin only affects the particular page at fault; unless the issues are rampant across the entire site, the rest should be okay.

Google’s John Mueller said that a Penguin penalty is based on the ratio of good to bad techniques. If there are still bad techniques present, but there are far more good than bad, the site can still get past the effects of Penguin.

Penguin has had 7 updates since its launch in April 2012, with the latest build being Penguin 4.0.

What do we know so far?

With Panda and Penguin now pretty much covering all bases when it comes to dodgy SEO, it seemed that Google’s algorithm was in a pretty good spot. For a while, only occasional small updates to the two algorithm enforcers were rolled out (aside from a scary ghost at one point), and everyone seemed to be following the rules. Google’s internet was a veritable garden of prosperity.

Fred search update impact

That brings us to the now. On March 8th, 2017, webmasters experienced catastrophic fluctuations in their Google rankings; those affected were dubbed ‘Losers’. Notably, there were a lot of reasonably high-profile Losers, like thefactsite.com, entrancegeek.com and lookingformaps.com. This led people to believe that a major update had been rolled out. Not much is known about the update, nor has Google officially confirmed that one was even released. However, they also haven’t denied it. With such drastic effects on the rankings, webmasters were panicking, and Google decided to at least give the update a name.

We were lucky to get even that. There are more than 600 algorithm updates every year, and Google announces only a tiny fraction of them. The update only got its name because Barry Schwartz, who has been collecting information on the search industry for over 13 years, reached out to Google. In response, Google’s Gary Illyes tweeted back: “sure! From now on every update, unless otherwise stated, shall be called Fred”. What was meant to be a joke has now stuck as the update’s ‘official’ name. Clearly, this isn’t something that was supposed to be as big a deal as it has become.

Fred appears to be a bit like a phenomenon that was ominously referred to as ‘Google Phantom’ – the aforementioned ghost. Around the period of May 9th, there were a lot of reports of significant changes to the algorithm, with many sites reporting major traffic loss. However, the change was never detailed in any ‘official’ capacity. What was likely meant to be an unremarkable update to Panda – the effects of Phantom were very similar to the content-focused targets of that penalty – became a moment the internet decided to name because of its significant impact. The name ‘Phantom’ was widely accepted due to the ghostly nature of the update.

Like Phantom, Fred has come out of seemingly nowhere. Apparently, Google was never fond of the ‘Phantom’ name. Perhaps this time, they have decided to come out and name the sudden update as a way to control the message better?

The interesting thing about Fred is that it shares characteristics with Penguin, rather like Phantom does with Panda. The losers to Fred’s wrath have been websites that contain a lot of web-spammy pages existing only to generate ad revenue. These sites have lots of ads, especially banner ads, often run through the controversial AdSense programme. The difference is that Fred has had a much more severe effect than Penguin, with some websites in the top 10 of the rankings being pushed all the way out of the top 100.

In the past, there have been a couple of ways to get out of the penalties that Panda and Penguin enforce.

One method is exclusive to Penguin, presumably due to its page-level targeting rather than a site-wide effect. Google maintains a feedback form for Penguin, which allows people to do two things. The first lets users report webspam that they feel should have been flagged but hasn’t, and still appears high in the rankings. The second lets site owners submit their website for reconsideration through Google Webmaster Tools. The site can then be released from Penguin’s icy flipper, assuming enough of the black-hat techniques have been addressed.

Recovery from Fred algorithm update

The only other way to get a website back up the rankings is simple: webmasters and content creators must address the issues that got them penalised in the first place. When a new update comes out, it overrides the previous one. This means that, as long as everything adheres to good SEO practices when a new update is rolled out, a previously penalised website is in the clear. Sometimes the effects of other updates persist – there were some unfortunate sites hit with what users called Phantenguin (Phantom + Penguin) – however, that was likely due to the unusual circumstances of that phenomenon.

Here are some questions to answer if you have been hit.

  1. Does your site provide thin content?
  2. Is it heavy on ads?
  3. Does it serve intrusive popups that make it unusable on mobile devices?
  4. Are there too many dynamic pages that add no value? (For example, eCommerce colour and size variation pages, or WordPress category pages without unique content.)
  5. Is your site a part of a larger link network?
  6. Are your robots.txt and .htaccess still allowing crawling? (If robots.txt is not accessible, Google will abandon the crawl; see the quick check sketched after this list.)
  7. Is your site slow? (User behaviour and interaction affect the overall positions.)
  8. Have you moved your site recently or updated the design?
  9. Have you done a basic site audit recently?
  10. Is your site still fully indexed? (Go to Google and search for site:yoursite.com.)
  11. Have you submitted your site to Webmaster Tools? (Have you got Bing Webmaster Tools? Check the issues reported there too.)
  12. Do Webmaster Tools show any specific errors, or an increase in errors since the update date?
  13. Is there depth to your content for head queries with large search volume?
  14. Is your site heavy on JavaScript, or does it rely on it too much for content rendering?
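
For question 6, here is a minimal Python sketch that checks whether your robots.txt still lets Googlebot crawl a few key URLs. The domain and paths are placeholders; substitute your own.

    # Quick robots.txt sanity check for question 6 above.
    # "https://www.example.com" and the sample paths are placeholders.
    from urllib.robotparser import RobotFileParser

    SITE = "https://www.example.com"
    PATHS = ["/", "/blog/", "/products/blue-widget"]

    parser = RobotFileParser(f"{SITE}/robots.txt")
    parser.read()  # fetches and parses robots.txt

    for path in PATHS:
        allowed = parser.can_fetch("Googlebot", f"{SITE}{path}")
        print(f"{path}: {'crawlable' if allowed else 'BLOCKED for Googlebot'}")

If any key URL comes back blocked, fixing the offending disallow rule is the cheapest recovery step of all.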

Quick update on 3rd May: a user has suggested some additional points for recovery.

  1. DISAVOW your link profile. (Can’t emphasise this enough.)
  2. Contact the webmasters hosting those links and ask them to delete them or add a nofollow tag.
  3. Review your access logs to find deep pages that Google is abandoning or hasn’t visited in over three months, then improve access to those pages and strengthen the internal linking (a rough sketch follows below).
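
For point 3, here is a rough Python sketch of that log analysis: it scans a combined-format access log for Googlebot requests and flags URLs that haven’t been fetched in the last 90 days. The log path, log format and URL list are placeholder assumptions; adjust them for your own server.

    # Rough sketch of the access-log check from point 3 above. Assumes a
    # standard "combined" log format; the file path and URL list are
    # placeholders for your own setup.
    import re
    from datetime import datetime, timedelta

    LOG_FILE = "access.log"                        # placeholder path
    KNOWN_URLS = {"/", "/blog/", "/old-guide/"}    # pages Google should be crawling

    # e.g. 66.249.66.1 - - [24/Apr/2017:06:12:01 +0000] "GET /blog/ HTTP/1.1" 200 ...
    LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] "GET (\S+)')

    last_seen = {}
    with open(LOG_FILE) as handle:
        for line in handle:
            if "Googlebot" not in line:
                continue
            match = LINE_RE.search(line)
            if not match:
                continue
            when = datetime.strptime(match.group(1), "%d/%b/%Y")
            url = match.group(2)
            if url not in last_seen or when > last_seen[url]:
                last_seen[url] = when

    cutoff = datetime.now() - timedelta(days=90)
    for url in sorted(KNOWN_URLS):
        seen = last_seen.get(url)
        if seen is None or seen < cutoff:
            print(f"Googlebot may be abandoning {url} (last seen: {seen or 'never'})")

Any URL that shows up here is a candidate for stronger internal links from pages Google still visits regularly.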

Now Fred is taking charge, and Google hasn’t released any official word at all, much less information about a recovery method. It would seem the best bet is to do what has worked in the past. That means rethinking and reworking all SEO elements across your site. Unfortunately, good optimisation is a complicated and involved process. If it was easy, no one would ever be penalised.

 

13 Comments

  1. Jerry
    April 4, 2017 at 8:08 am · Reply

    Fred’s update is penalising low-content, ad-heavy sites. It’s just an extension of the original penalty that was put on such sites. It’s Google’s way of coming down heavily on parameter-based sites that generate a large number of pages and waste the crawler’s time.

    • Weboptimizers
      April 4, 2017 at 8:13 am · Reply

      Do you have any data on the parameter-based sites? From what we have observed, it has the biggest impact on low-quality, rehashed-content, ad-heavy or proverbial “no value” sites. Google’s webmaster console will deal with parameters in a much more comprehensive way and would be able to detect this quickly.

  2. Carlson
    April 4, 2017 at 8:10 am · Reply

    Fred’s penalty also has some other big losers that gamed the system with rehashed and reposted content. I won’t be surprised if Reddit or other aggregators also went through some loss in traffic.

  3. J Nordic
    April 4, 2017 at 8:11 am · Reply

    It’s not called Fred. @Methode just said that because he calls everything with no name ‘Fred’.

  4. Sue Corrigan
    April 4, 2017 at 8:18 am · Reply

    Nothing says “loss of business” these days more than being blacklisted or going through such motions with Google’s algo update.

    We had a very good, high-ranking site following all the best practices that has been hit by this update. We don’t publish any low-value content, we have a great and meaningful content strategy, and we audit each and every aspect of our website every month.

    We have been trying to ascertain the cause of the drop, and nothing appears out of the ordinary, even in our link profile.

    I would love to know if there is a way to find out the possible causes that may have impacted our site.

    • Weboptimizers
      April 4, 2017 at 8:19 am · Reply

      Hi Sue,

      We will be able to help or do an audit on your site to find out what happened.
      Could you please reach out to us on the phone number provided in the header.

  5. Jason
    April 4, 2017 at 1:30 pm · Reply

    Low-value, low-content sites were targeted.
    Remember, it’s not a single algo update; a few things ran on that date.

    There were a number of updates that ran around that date and penalised sites that had a range of issues.

    Focus on good quality content with engagement.

  6. Gregory
    April 6, 2017 at 6:46 am · Reply

    Yes sites with low value and low content.

    Remember, this is Google RankBrain kicking in, and I would be surprised if all the commercial tools that keep track of this data go out of business soon. There are tools that track when there are massive fluctuations.

    The truth is most of this data is superficial. Google has a huge data set, and any manipulation at a lower level is just a small sample that yields biased results.

    I saw an illuminating comment on HN a while ago:

    ____________

    Yeah exactly – both that square roots are good for smoothing but more so that it’s a trial-and-error thing. When I started out after dropping out of grad school I was working on search algorithms and a bunch of my coworkers had a background in information retrieval and I did not, and I was kind of thinking about it in the wrong way. I was thinking about it like it was a math puzzle and if I just thought really hard it would all make sense.

    So one day I was stumped in how to make this signal useful and Amit suggested, hey why don’t you take the square root. I was like, why a square root, it doesn’t make much sense, nothing is getting squared ever, and he went up to the whiteboard and just drew a square-root-ish-wiggle-arc-shape and said look you just want something that looks like this. Square root, log, whatever.

    I was like, oh… am I allowed to write code that doesn’t make any sense? I thought I wasn’t supposed to do that. And he was just like, well, just don’t worry about it, you are overthinking it, you can take all the square roots you want, multiply by 2 if it helps, add 5, whatever, just make things work and we can make it make sense later.

    At that point I realised that real world software engineering was much different than research had been, and also that this was going to be way more fun.

    ——-

    Tons of people write articles on devaluing links, and Fred hit low-value, low-content stuff. No one talks about the elephant in the room.

    The best way to recover is to look at link diversity, but only after you have a decent site.

  7. David Blaine
    April 9, 2017 at 12:40 pm · Reply

    I’d add that SEO is not just a ranking exercise – it’s a conversion exercise. The Fred update, and all the updates that come henceforth, will only get stronger: Google is going to utilise more and more data from Google Analytics to introduce more measures and learn about usability.

    On top of all of the factors that you need to consider for ranking, you should remember that the first ~60 characters of your title tag and the first ~150 characters of your meta description are what drive people to click through to your site.

    With SEO and with all these updates the probability of you showing at the top all the time will diminish over time. Remember you are not the only one performing A/B tests. Google is more than likely performing their own version of it.

  8. Jon
    April 10, 2017 at 8:02 am · Reply

    I read a very insightful comment a while ago on a site that said that A/B is hype if you are not dealing with a large dataset. Having personally done a large number of A/B tests throughout a good few years of my career I believe that the juice is in large numbers not smaller ones. If you have 100 people visit your site you cannot make much inference from these numbers.

    If you have the information that a user wants, they will most likely end up coming to your site; use that, and they will leave trace marks of having found the site useful through dwell time and page of exit. The only time I’ve seen A/B tests truly help was when they accidentally fixed some cross-browser issue or moved a button within reach of a user.
    Most of the A/B website optimization industry is an elaborate scam, foisted on people who don’t know any better and are looking for a magic bullet.

    Having said that, dwell time and what a user does with the information are critical for Google in determining whether or not to punish you.

  9. Ashish
    April 26, 2017 at 8:59 am · Reply

    After going through the following, we noticed some improvement in our rankings, more specifically around 24 April:

    – Removing low-quality content
    – Rewriting some components of the homepage
    – Improving internal searches for head queries
    – Improving the top real estate on all our pages
    – Removing all forms of ads, even those served through third parties, not just AdSense
    – Removing the signup popups

    It seems that there has been a massive jump in all our rankings.
    I cannot pinpoint a specific event that might have caused this, but Google re-crawled our entire site 6-7 times on the 24th.

    Thank you for all the pointers and let this comment serve as a guide for all others reading this.

  10. David
    May 4, 2017 at 8:47 am · Reply

    Thank you for this wonderful list of questions to ask for Fred recovery. We were one of the sites impacted around the Fred rollout, although our traffic loss only accounted for approximately 9%. We believe that we were not hit by the so-called Fred update itself, because our site is very clean and we are very savvy on SEO.

    We have observed some substantial jumps in our rankings by doing the following since the update:

    – DISAVOWING our link profile (can’t emphasise this enough)
    – Contacting the webmasters that had these links and asking them to delete the links or add a nofollow tag
    – Observing the access logs, figuring out which deep pages Google is abandoning or hasn’t visited in over three months, and improving access to those pages and the internal linking
