Table Of Contents
- 1 Why did Google Deindex your site?
- 2 Did You Accidentally Click NoIndex?
- 3 Do You Have Crazy Links?
- 4 Do You Have Low-Quality Text?
- 5 Are You Giving the Appearance of Cloaking?
- 6 Have You Followed Guidelines for Structured Data?
- 7 Is Your Page Malignant?
- 8 Are You Using Doorway Pages?
- 9 Are You Relying on Free Hosting?
In many ways, it is a nightmare, one that many website owners have lived while awake. You check your traffic one day to find that organic traffic has fallen to almost zero. You try searching for your website, and there are no relevant results. What happened? In many cases, a site with a sudden crash in traffic and ranking has been deindexed by Google.
Although this can be devastating, there is a path back to your former standing. Finding out why you were deindexed and fixing the critical issue is key.
Why did Google Deindex your site?
There are two ways for a website to be deindexed: through a manual action or through bots picking up red flags in your code. If you were deindexed manually, you will have a notification in Google Search Console telling you why.
However, in most cases, websites are deindexed due to something the web crawlers found. These infractions are more challenging to identify because there will be no notification or explanation. It is up to you to figure out what is wrong and fix it.
To many webmasters, this can feel like finding the proverbial needle in a haystack. The good news is that Google has a short list of issues that lead to deindexing, most of which are easily fixed. Combing your site for these common issues is the first step to getting your standing back on track. A mobile-first site is important, but don’t make it a completely different experience from your desktop site: serving markedly different content to different audiences can raise flags of its own.
Did You Accidentally Click NoIndex?
Many people temporarily turn off indexing so they can work on their website and then forget to turn it back on. There is also a possibility that you accidentally deindexed yourself. Deindexing is done with the noindex directive, set in a robots meta tag or an X-Robots-Tag HTTP header.
This method may seem hopelessly simplistic, but it should be the first place you look when your site is deindexed. It is an incredibly easy issue to fix.
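To check quickly, you can scan a page’s HTML for the tag with a short script. The sketch below is minimal; a full check should also cover the `X-Robots-Tag` HTTP header and your robots.txt rules:

```python
import re

def find_noindex(html: str) -> bool:
    """Return True if the page contains a robots meta tag with 'noindex'.

    Minimal sketch: only handles tags where name= appears before content=.
    """
    # Matches e.g. <meta name="robots" content="noindex, nofollow">
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        re.IGNORECASE,
    )
    return bool(pattern.search(html))

blocked = '<head><meta name="robots" content="noindex, nofollow"></head>'
allowed = '<head><meta name="robots" content="index, follow"></head>'
print(find_noindex(blocked))  # True
print(find_noindex(allowed))  # False
```

If this flag is set and you want the page indexed, remove the tag (or your CMS’s “discourage search engines” setting) and request reindexing in Search Console.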
Do You Have Crazy Links?
Most people know that links, both incoming and outgoing, are essential to search engine optimisation. However, certain kinds of links can raise red flags with Google’s crawlers and lead to being deindexed. Red-flag links include the following:
- Adding links too quickly. Google sees this as evidence of link-sharing, link-buying, and other discouraged actions.
- Hiding links. Linking to other sites using white text on a white background (or similar) will get you blacklisted. Search your code for hidden links if you aren’t sure whether these exist.
- Participating in link farms. Google knows about these and will find members eventually.
- Low-quality inbound links. If sites identified as low quality have linked to you, your standing, unfortunately, will be diminished. These can be found by doing an audit on your backlink profile.
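Hidden links in particular can be flagged programmatically. The sketch below searches inline styles for white-coloured anchor text, one common hiding trick. It is a starting heuristic only, since links can also be hidden via external CSS, `display:none`, tiny font sizes, or off-screen positioning:

```python
import re

# Flags anchors whose inline style sets white text, a common hiding trick.
HIDDEN_STYLE = re.compile(
    r'<a\b[^>]*style=["\'][^"\']*color:\s*(?:#fff(?:fff)?|white)[^"\']*["\'][^>]*>',
    re.IGNORECASE,
)

def flag_hidden_links(html: str) -> list:
    """Return the opening tags of suspicious, possibly hidden links."""
    return HIDDEN_STYLE.findall(html)

# spam.example is a placeholder domain for the demonstration
page = (
    '<a href="/about">About</a>'
    '<a href="http://spam.example" style="color:#ffffff">buy links</a>'
)
print(len(flag_hidden_links(page)))  # 1
```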
Do You Have Low-Quality Text?
Google’s crawlers are getting increasingly sophisticated at determining when text is low quality. If you have spammy comments, duplicate wording, or similar issues, your site may be deindexed.
However, there is a vast grey area in which you can be deindexed even while trying to produce high-quality content. Spelling and grammar errors can be a red flag, so you should ensure that your content is proofread thoroughly. Additionally, put your content through a plagiarism checker to ensure you haven’t inadvertently used the same wording as other sites. In an age of aggressive SEO, there is a decent chance of accidentally wording your text similarly to your competitors.
Low-quality text should be removed immediately. Make sure your writing adds value and is worth its page rank. Even if you manage to get some low-quality work past the bots, your visitors are sure to notice.
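As a rough illustration of how duplicate wording can be measured for your own pre-publication checks, here is a minimal Python sketch comparing two texts by the overlap of their word trigrams (Jaccard similarity). This is a simple heuristic, not Google’s actual algorithm:

```python
def shingles(text: str, n: int = 3) -> set:
    """Word n-grams ('shingles') of a text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "our guide to fixing a deindexed site step by step"
copied = "our guide to fixing a deindexed site step by step today"
print(similarity(original, copied) > 0.5)  # True: near-duplicate wording
```

A high score between your draft and an existing page is a cue to rewrite before publishing.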
Are You Giving the Appearance of Cloaking?
The practice known as cloaking is a huge red flag for Google. In this black-hat technique, websites give users and search engines two different sets of pages and URLs, allowing them to optimise for search without providing relevant content to visitors.
Although this is usually done on purpose, there are a few ways to make your website appear cloaked accidentally. If you have material behind a paywall, for example, it will often look like cloaking to bots. Google gives explicit instructions for handling this in a search engine friendly manner using JSON-LD.
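For paywalled content, Google’s guidance uses schema.org markup with `isAccessibleForFree` so crawlers know the restricted content is a paywall rather than cloaking. A minimal example of the shape of that markup (the headline and the CSS class name are placeholders; check Google’s current documentation for the exact requirements):

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example paywalled article",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".paywalled-section"
  }
}
```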
The other leading cause is that your website has been hacked and is redirecting visitors to another site with spam. A simple website scan will help you diagnose and treat this problem.
Have You Followed Guidelines for Structured Data?
Google has strict guidelines for structured data. Failing to follow these rules can be a red flag, especially if you have one of the following issues:
- Your structured data reduces the user experience
- It is misleading or unrelated to other content
- The content it refers to is not visible to users, such as an image hidden behind another image
- Google’s instructions have not been followed to the letter
If you want to use structured data, see Google’s instructions and do not deviate from them even in the smallest ways.
Is Your Page Malignant?
It is time to be honest with yourself. Have you been using misleading techniques to lure in more traffic? Sneaky redirects, keyword stuffing, and even phishing can all seem attractive to someone who needs more views. However, this is as bad for your website’s future as it is for your users.
If you are having trouble getting the views you desire, look into legitimate ways to optimise. High-quality SEO may take a while to perfect, but it generally won’t lead to being deindexed.
Are You Using Doorway Pages?
Once considered a legitimate technique, doorway pages will now get you blacklisted. These pages are ones designed only to increase rank for given keywords, without performing any other function on the website. They usually link to another, more useful page.
In another variation, some pages are built to rank for very vague and general search terms despite having more specific content. These “umbrella pages” link to other parts of the site as well.
Google’s main issue with these pages is that they ruin the all-important user experience by adding to the number of clicks a person must make to find the information they are seeking. Critics note that Google is not above using this tactic itself, which has led to some protest.
Are You Relying on Free Hosting?
Free hosting may seem like such a great deal. After all, there is nothing cheaper than zero. Many new website owners are working with a tight or nonexistent budget, making these services seem quite attractive. However, they may come at a severe cost.
Free hosting services often inject code that fills pages with spammy ads and other features that are not user-friendly. As a result, unwitting webmasters using a free host may be deindexed due to code they did not write and were not even aware of.
If this is the root issue behind your drop in rank, there is only one way to rectify it: get a legitimate (paid!) web host. Google has threatened to take action against free hosting sites in general, so it may only be a matter of time before many website owners are wondering why their organic traffic tanked.
Being deindexed can be a blow to your website, your income, and even your ego. However, most cases are due to one of the issues listed here. Finding and fixing your problem, then resubmitting your site to Google, is the first step to rebuilding your reputation. Within a few days, you will be back on the path to a higher search standing and the traffic that comes with it.