How To Perform An SEO Audit To Improve Your Site’s Organic Rankings

An SEO audit is the process of finding opportunities to improve a site’s organic rankings.

You can perform an SEO audit when you launch a site, or run one regularly to improve your overall rankings.

Everyone approaches an SEO audit differently. There is no universally accepted way to do one, since Google’s search algorithm is not transparent.

SEO audits tend to cover technical aspects, so an SEO audit guide needs to dwell on the practical details of improving a site.

Start by reading our comprehensive guide to SEO to ensure that you have the basics right.

Both your On-page SEO and your technical SEO need to be on point for an SEO audit to be effective.

Let’s dive into the specifics of an SEO audit.

Mobile-First Indexing and Mobile Friendliness

Google has indicated in its numerous studies about mobile devices that mobile is the future.

In fact, Google has started using Googlebot Smartphone as the indexing crawler for most websites.

You can verify this by logging into your search console and heading to Settings.

With mobile being the priority for Google, it makes sense to check your mobile friendliness and usability.

 Pro Tip

Head to Search Console > Mobile Usability to check any mobile usability issues.

Google will flag an issue if it encounters any of the following:

  • Uses incompatible plugins
  • Viewport not set
  • Viewport not set to “device-width”
  • Content wider than screen
  • Text too small to read
  • Clickable elements too close together

Since March 2018, Google has been rolling out mobile-first indexing.

You can also use the Google Mobile-Friendly test to check your mobile usability.
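
If you want to spot-check the viewport issues from the list above across a handful of URLs yourself, here is a minimal Python sketch. It assumes the requests library is installed, and the URL list is purely illustrative:

```python
import re
import requests

# Hypothetical list of URLs to spot-check for a viewport meta tag
URLS = [
    "https://mojodojo.io/",
    "https://mojodojo.io/seo/",
]

for url in URLS:
    html = requests.get(url, timeout=10).text
    # Look for a viewport meta tag that uses device-width
    match = re.search(r'<meta[^>]+name=["\']viewport["\'][^>]*>', html, re.I)
    if not match:
        print(f"{url}: viewport meta tag not set")
    elif "device-width" not in match.group(0):
        print(f"{url}: viewport is set but not to device-width")
    else:
        print(f"{url}: viewport OK")
```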

 Pro Tip

In addition to mobile friendliness, you should also focus on mobile loading speed as a central part of your SEO audit:

  • Use the same meta directives on all versions of your site (see the sketch after this list).
  • Don’t lazy load your primary content if users are likely to interact with it above the fold.
  • Don’t block any resources from Google, including scripts or CSS files.
  • Double-check all your structured data and make sure that all versions of your site have the same structured data.
  • Use high-quality images and optimize them before uploading.
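
For the first point, a quick way to compare the meta robots directives served to a desktop browser and to Googlebot Smartphone is a small script like the one below. This is a rough sketch: the user agents, the page URL and the regex-based parsing are illustrative assumptions, and requests must be installed:

```python
import re
import requests

# Hypothetical user agents: a desktop browser and Googlebot Smartphone
DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 "
             "(KHTML, like Gecko) Chrome/99.0 Mobile Safari/537.36 "
             "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

def robots_directives(url, user_agent):
    """Return the content of the robots meta tag as served to this user agent."""
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    tag = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    if not tag:
        return "(no robots meta tag)"
    content = re.search(r'content=["\']([^"\']+)', tag.group(0), re.I)
    return content.group(1).lower() if content else "(empty)"

url = "https://mojodojo.io/seo/"  # hypothetical page to compare
desktop, mobile = robots_directives(url, DESKTOP_UA), robots_directives(url, MOBILE_UA)
if desktop != mobile:
    print(f"Directive mismatch on {url}: desktop={desktop!r} vs mobile={mobile!r}")
else:
    print(f"{url}: directives match ({desktop})")
```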

Check Robots.txt

Consider robots.txt a file that you should audit often.

Robots.txt is a polite request asking Google and other search engines to crawl or not crawl parts of your site.

Crawlers may or may not choose to follow these requests.

It is still recommended that you have a Robots.txt in place because major search engines do respect the file.

The Robots.txt is usually found in the root directory of the website.

You can view any website’s robots.txt by going to its root URL followed by /robots.txt.

In our case the file is at

https://mojodojo.io/robots.txt

Robots.txt can help maximize your crawl budget. This means that Google will get to important pages faster and more often. You don’t want Google’s crawlers to be stuck in a loop on search pages or wasting time on irrelevant pages on your site.

A small error in robots.txt can make your site disappear from search engines.

Google will remove you from its index if you block your entire site, and it is not very difficult to do so by accident.

So exercise caution when editing your robots.txt file. Remember that different crawlers interpret robots.txt syntax differently. Every SEO audit should include another look at your robots.txt.
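
A simple way to guard against accidentally blocking important pages is to test them against your live robots.txt with Python’s standard library. A minimal sketch, with a purely illustrative list of URLs:

```python
from urllib import robotparser

# Parse the live robots.txt file
rp = robotparser.RobotFileParser()
rp.set_url("https://mojodojo.io/robots.txt")
rp.read()

# Hypothetical list of URLs that must stay crawlable
important_urls = [
    "https://mojodojo.io/",
    "https://mojodojo.io/seo/",
]

for url in important_urls:
    if not rp.can_fetch("Googlebot", url):
        print(f"Blocked by robots.txt for Googlebot: {url}")
```

Keep in mind the standard library parser does not cover every Googlebot-specific nuance, so treat it as a sanity check rather than a full audit.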

Check SSL and Mixed Content

It’s 2022 and SSL certificates are free to install.

Let’s Encrypt caters to almost all types of web servers.

In fact, web servers like Caddy have built-in SSL deployment.

So it goes without saying that you should have an SSL certificate installed on your server.

This means your website should load over HTTPS.

However, you may have certain parts of the page, such as images, loading from an HTTP URL.

This will result in mixed content or insecure URLs.

To adhere to the HTTPS standard, you need all resources loaded over HTTPS.

This happens because developers sometimes hardcode images or parts of the web page in the code itself.
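
For a quick spot-check before a full crawl, you can scan a page’s HTML for hardcoded http:// references. A rough sketch, assuming requests is installed and using an illustrative URL:

```python
import re
import requests

url = "https://mojodojo.io/"  # hypothetical page to check
html = requests.get(url, timeout=10).text

# Find src/href attributes that still point at plain HTTP
insecure = re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html, re.I)
for resource in sorted(set(insecure)):
    print(f"Insecure reference: {resource}")
```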

The best way to find and audit these errors is to use Screaming Frog.

Once you have run the crawl, simply look at the Mixed Content issues in the sidebar of the Screaming Frog crawl.

Look at all the URLs that have the issue and troubleshoot to find the offender.

Also look at the HTTP URLs section of the crawl and fix all URLs using the HTTP version of the site.

WWW vs Non-WWW Versions

Google crawls the web aggressively. This means it may even have crawled that PDF that you did not want it to crawl. Or perhaps the subdomain…

Google even treats subdomains as individual entities.

It even treats the non-HTTPS version of your site as a separate entity.

Each URL is like a primary key in the database.

This means your website will normally have four versions (entities): the HTTP and HTTPS variants of both the WWW and non-WWW hostnames.

Each URL will have its own score, so even though they all serve the same content, your authority will be diluted.

Many websites, including those of SEO agencies, have problems with subdomains resolving incorrectly.

The best way to validate this is to simply type each version of your site into the address bar.

If your browser redirects all versions to the same final version of the site, you are done.

Alternatively, you can use a site like Httpstatus.io to check.

You can even use the network tab of your dev tools to see the status code.

You want to use a 301 redirect, indicating to Google that the redirect is permanent.

Redirect rules are often written with regex, and regex is easy to get wrong. It is thus wise to check not only your homepage but an internal page too.

So ideally, you would also check an internal page like so

  • http://mojodojo.io/seo/
  • https://mojodojo.io/seo/
  • http://www.mojodojo.io/seo/
  • https://www.mojodojo.io/seo/

Make sure they all redirect to the same version of the URL.
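
You can also script this check. The sketch below follows each variant with the requests library and reports the redirect chain; the URLs mirror the list above, and requests must be installed:

```python
import requests

# The four protocol/host variants of an internal page
variants = [
    "http://mojodojo.io/seo/",
    "https://mojodojo.io/seo/",
    "http://www.mojodojo.io/seo/",
    "https://www.mojodojo.io/seo/",
]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.headers.get("Location")) for r in resp.history]
    print(f"{url} -> {resp.url} via {hops or 'no redirect'}")
    # Every hop should be a 301 (permanent), not a 302/307 (temporary)
    for status, _ in hops:
        if status != 301:
            print(f"  Warning: non-permanent redirect ({status}) in the chain")
```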

Audit indexability report

The indexability report is available in your search console.

Simply head to Search Console > Your domain > Pages

From this report, Google tells you whether your website has issues such as:

  • Pages not found / 404 errors
  • Pages with redirects
  • Duplicate without user-selected canonical
  • Soft 404s
  • Duplicate, Google chose different canonical than user
  • Crawled but currently not indexed

You should fix all the issues on this page including any soft 404s.
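
Soft 404s can be hard to spot by hand because the server returns a 200 status code. Below is a crude heuristic sketch, assuming requests is installed and using placeholder URLs pulled from the report:

```python
import requests

# Phrases that often appear on "not found" pages served with a 200 status
SIGNS = ["page not found", "nothing was found", "no results"]

def looks_like_soft_404(url):
    resp = requests.get(url, timeout=10)
    body = resp.text.lower()
    return resp.status_code == 200 and any(sign in body for sign in SIGNS)

# Hypothetical URLs taken from the indexability report
for url in ["https://mojodojo.io/old-page/", "https://mojodojo.io/seo/"]:
    if looks_like_soft_404(url):
        print(f"Possible soft 404: {url}")
```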

This is perhaps the most time consuming part of any SEO audit.

If you have specifically marked certain parts of your website as “noindex”, you can then ignore those parts.

Google indexability problems are generally indicative of other issues, such as server or CMS misconfiguration.

Once you have fixed these issues, it is best to let Google validate your fixes.

You can do this by clicking through on the issue and asking Google to validate the fix.

Audit On-page SEO

On-page SEO is key to making sure that you have optimized your site for ranking on search engines.

 Pro Tip

The basics of On-Page SEO are:

  • Target one page for one primary keyword
  • Do keyword research before building your service pages
  • Focus on writing great content
  • Optimize your meta titles/snippets/URLs/site architecture (a quick length-checker sketch follows this list)
  • Ensure you have internal links and that they are relevant
  • Look at your anchor texts and be precise
  • Fix duplicate content
  • Check your technical SEO: robots.txt, sitemap.xml, crawlability and the indexability report
  • Have a good content strategy
  • Improve your text readability
  • Optimize your media and images to load faster
  • Update content often, as content recency matters
  • Get an SSL certificate
  • Improve your page speed
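
For the meta titles and snippets point, a small script can flag pages whose title or meta description is missing or likely to be truncated in the SERPs. A minimal sketch with illustrative URLs and rough length thresholds (requests must be installed, and the regex parsing is intentionally simple):

```python
import re
import requests

# Rough, commonly cited display limits; treat them as guidelines, not rules
MAX_TITLE = 60
MAX_DESCRIPTION = 160

def check_snippet(url):
    html = requests.get(url, timeout=10).text
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)', html, re.I)
    title_text = title.group(1).strip() if title else ""
    desc_text = desc.group(1).strip() if desc else ""
    if not title_text or len(title_text) > MAX_TITLE:
        print(f"{url}: title missing or too long ({len(title_text)} chars)")
    if not desc_text or len(desc_text) > MAX_DESCRIPTION:
        print(f"{url}: meta description missing or too long ({len(desc_text)} chars)")

# Hypothetical pages to audit
for url in ["https://mojodojo.io/", "https://mojodojo.io/seo/"]:
    check_snippet(url)
```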

Improve Content

The best way to find content to improve is to look at declining content.

Content may decline because it is seasonal or implicitly time-sensitive.

An SEO audit should focus on a long term plan to address declining content.

 Pro Tip

You can quickly audit declining content from Google search console.

  • In the search console, head to the Performance report
  • Set the date range to the last 6 months and compare it to the previous 6 months (Search Console only stores 16 months of data).
  • Click the pages tab to see all pages
  • Sort the table by clicks.

Look for content that is declining.
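
You can also pull the same comparison programmatically through the Search Console API, which is handy for large sites. The sketch below is an assumption-heavy outline: it presumes the google-api-python-client and google-auth libraries are installed, a service account key file with read access to the property, and illustrative date ranges:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical key file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
sc = build("searchconsole", "v1", credentials=creds)
SITE = "https://mojodojo.io/"  # the verified property

def clicks_by_page(start, end):
    """Return clicks per page for the given date range."""
    body = {"startDate": start, "endDate": end, "dimensions": ["page"], "rowLimit": 5000}
    rows = sc.searchanalytics().query(siteUrl=SITE, body=body).execute().get("rows", [])
    return {row["keys"][0]: row["clicks"] for row in rows}

recent = clicks_by_page("2022-01-01", "2022-06-30")
previous = clicks_by_page("2021-07-01", "2021-12-31")

# Pages that lost the most clicks compared to the previous window
losses = [(page, previous[page] - clicks) for page, clicks in recent.items() if page in previous]
for page, lost in sorted(losses, key=lambda item: item[1], reverse=True)[:20]:
    if lost > 0:
        print(f"{page}: lost {lost} clicks vs the previous period")
```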

Declining content usually means that competing pages have overtaken you in the rankings.

Start by doing a simple Google search and see why other pages are performing better.

Sometimes, Google also changes how it interprets the intent behind a particular search term.

It reclassifies certain keywords or topics from commercial intent to informational intent or vice versa.

In that case, it might be a good idea to amend the page to match the new intent.

Remember, both the content and the history of the URL are instrumental in rankings.

Don’t simply discard an existing URL with a rich history in favor of a new one.

Improve and update your content.

Check Page speed

The best way to check the page speed of each page on your site is the Google Analytics page speed report.

You can get this report by browsing to Google Analytics > Behavior > Site Speed > Page Timings

This report is not available in GA4.

You can also connect Screaming Frog to the PageSpeed Insights API to get a score for each URL.

Alternatively, you can check the response time report in Screaming Frog.

Use this report to find the pages that load the slowest.

Use PageSpeed Insights or Lighthouse to find out why they load slowly and fix the issues.
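
You can also hit the PageSpeed Insights API directly from a script once you have a list of slow pages. A minimal sketch, assuming requests is installed and using an illustrative URL (an API key is recommended for anything beyond occasional use):

```python
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://mojodojo.io/seo/", "strategy": "mobile"}  # hypothetical page

data = requests.get(API, params=params, timeout=60).json()
lighthouse = data["lighthouseResult"]
print("Performance score:", lighthouse["categories"]["performance"]["score"])

# A few lab metrics worth looking at first
for audit_id in ("largest-contentful-paint", "total-blocking-time", "cumulative-layout-shift"):
    audit = lighthouse["audits"][audit_id]
    print(audit["title"], "->", audit.get("displayValue"))
```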

 Pro Tip

You can also use Chrome DevTools to audit unused resources.

  1. Open your webpage in Chrome
  2. Hit F12 to open DevTools
  3. Press Ctrl + Shift + P
  4. Type “coverage” and select Show Coverage
  5. Hit the round grey record button
  6. The page will draw up the usage visualisation for all the resources used (CSS/JS)
  7. Click on a resource and the editor will highlight in blue and red what is and isn’t used

Work on optimising the most used CSS / JS files and trimming the portions that go unused.

Check Core Web Vitals

Core web vitals are metrics introduced by Google to check how the page performs for user experience.

Core Web Vitals measure a page’s load time, its visual stability (how much its elements shift around) and its interactivity.

You can find the Core Web Vitals report in Google Search Console.

You can also look at the page experience report in search console.

A further breakdown of what’s being measured is available by clicking on the link.

 Pro Tip

Core Web Vitals measure the following (a sketch for pulling real-user field data via the CrUX API follows this list):

  • Largest Contentful Paint (LCP) – measures how fast the page loads. The ideal value is 2.5 seconds or less.
  • First Input Delay (FID) – measures interactivity. The ideal value is 100 milliseconds or less.
  • Cumulative Layout Shift (CLS) – measures visual stability. The ideal CLS is 0.1 or less. If you have dynamic JavaScript or Google Ads running on the site, CLS can easily go out of whack.
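
If you want to pull these numbers outside of Search Console, the Chrome UX Report (CrUX) API exposes real-user field data per origin. A minimal sketch, assuming requests is installed and using a placeholder API key:

```python
import requests

ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
payload = {"origin": "https://mojodojo.io", "formFactor": "PHONE"}

# YOUR_API_KEY is a placeholder for a real CrUX API key
resp = requests.post(ENDPOINT, params={"key": "YOUR_API_KEY"}, json=payload, timeout=30)
metrics = resp.json()["record"]["metrics"]

for name in ("largest_contentful_paint", "first_input_delay", "cumulative_layout_shift"):
    p75 = metrics[name]["percentiles"]["p75"]
    print(f"{name}: 75th percentile = {p75}")
```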

Audit duplicate content

You can end up with duplicate content in a number of ways, the most common being different URLs serving the same content.

This usually happens with CMSes like WordPress, Shopify, Magento and even Prestashop.

You can use the noindex directives on specific pages or even entire taxonomies.

The best way to detect duplicate or near duplicate pages is to use Screaming Frog to conduct an audit.

After you have crawled the site, simply head to the sidebar to find duplicate pages.

Screaming Frog detects exact duplicates and near duplicates.

It detects exact duplicates by comparing the MD5 hashes of all pages with each other.

For near duplicates, it uses the MinHash algorithm with a 90% similarity threshold.

Both of these are extremely useful to detect content issues with your site.
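
If you want a quick, scriptable version of the exact-duplicate check, you can hash page bodies yourself. This is a crude sketch compared to a full crawl: the URL list is illustrative, requests must be installed, and differences in templating or tracking parameters will change the hashes:

```python
import hashlib
from collections import defaultdict

import requests

# Hypothetical URLs to compare; in practice, feed in your crawled URL list
urls = [
    "https://mojodojo.io/seo/",
    "https://mojodojo.io/seo/?utm_source=newsletter",
    "https://mojodojo.io/",
]

pages_by_hash = defaultdict(list)
for url in urls:
    body = requests.get(url, timeout=10).content
    pages_by_hash[hashlib.md5(body).hexdigest()].append(url)

for digest, group in pages_by_hash.items():
    if len(group) > 1:
        print(f"Exact duplicates ({digest}): {group}")
```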

If you run an ecommerce website, be careful with both near and exact duplicates. You may have duplicates detected for

  • Product pages with attributes
  • Category pages accessible via different URLs
  • Product pages accessible via different page types, such as single, grouped, complex or bundled products

Audit Redirects

Broken links or redirects lead to poor user experience.

With that said, the best way to transfer an existing page’s authority to a new page is a 301 redirect.

 Pro Tip

Use redirects if:

  • You have moved the domain or the page and want to pass the authority
  • People can access your website through several different versions of the URL (see WWW vs non-WWW above)
  • You are merging websites
  • You removed a page and want visitors to go to a more relevant page instead of a 404

You can use permanent redirects if the page has moved permanently. These include 301, 308, meta refresh, HTTP refresh and JavaScript location redirects.

Use temporary redirects when the redirect may be removed in the future. These include 302, 303, 307, meta refresh and HTTP refresh.

That said, I would recommend avoiding both meta refresh and HTTP refresh.

Check Canonicalization

When you have a single page accessible from multiple URLs, Google sees them as duplicates. In the absence of canonical instructions, Google will choose a single URL as the canonical for all of them.

Google will also see different URLs with similar content as duplicates. In those cases too, Google will either respect your canonical directive or choose one itself.

Canonicals help deal with crawl budgets.

If you have multiple versions of the same page accessible from different URLs, you want Google to crawl one version regularly.

You would also want Google to ignore the other versions to help Google find the most relevant content faster and more often.

You can use a rel=canonical tag in your header to tell Google which URL is the canonical page.

Keep in mind that near-identical regional or language variants of a page can also be treated as duplicates and should be canonicalized deliberately.

Use the canonical section of the Screaming Frog audit to check for issues with canonicals.

Self-referencing is the practice of pointing to the page itself as the canonical version.

It is especially useful because, while Google does a great job of finding canonicals on its own, URL parameters unique to your domain can be confusing.

It is always best practice to self reference canonicals.

Google’s John Mueller recently stated: “It’s a great practice to have a self-referencing canonical but it’s not critical.”
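
Outside of Screaming Frog, you can spot-check canonicals with a few lines of Python. A rough sketch with illustrative URLs (requests must be installed, and the regex assumes rel appears before href, so treat it as a starting point):

```python
import re
import requests

# Hypothetical URLs to spot-check for a self-referencing canonical
urls = ["https://mojodojo.io/seo/", "https://mojodojo.io/"]

for url in urls:
    html = requests.get(url, timeout=10).text
    match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
    canonical = match.group(1) if match else None
    if canonical is None:
        print(f"{url}: no canonical tag found")
    elif canonical.rstrip("/") != url.rstrip("/"):
        print(f"{url}: canonical points elsewhere -> {canonical}")
    else:
        print(f"{url}: self-referencing canonical OK")
```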

Audit Internal Links

Consider Wikipedia’s rankings on Google.

Wikipedia ranks for most search terms on Google because of the large volume of content and great internal links.

Internal links are a great way for you to tell Google what you think the page is about.

The words that you use to create the link are called the anchor text.

Make sure you use the right anchor text to link to relevant content.

You can do an internal link audit via Screaming Frog.

Choose a URL in Screaming Frog and then choose the Inlinks tab at the bottom to look at all incoming internal links.

Make sure you look at the anchor text section of the Inlinks to see what anchor text you are using to link your content.

With anchor text, be specific.

Google Penalties

Sometimes human reviewers at Google decide that your website is not compliant with Google’s webmaster guidelines.

They may also decide that the quality of a top-ranked page is low and that you are using manipulative methods to push it higher in search results.

In such cases, they may decide to levy a penalty on your site.

You are unlikely to have a manual penalty unless you engage in black hat SEO or spamming the SERPs.

You can check whether you have been hit with a manual penalty by going to Search Console and looking at the Manual Actions tab.
