Regular SEO audits are crucial for a healthy website that ranks high on the SERPs. That said, an SEO audit does require resources. Without a well-structured audit process, you risk wasting time on tasks that don't move the needle.
That’s why we’ve come up with a detailed guide on performing an SEO audit quickly and easily. Save it, share it with colleagues, and enjoy your SEO workflow.
Let’s get down to business.
1. Cover indexing issues
Your website may be full of great content and perfectly interlinked, but none of it matters if search engines cannot see and index it. An unindexed website does not appear in SERPs, so users cannot find it in organic results. That means no organic traffic, no user interactions, and no conversions — hardly the outcome any business is aiming for.
To check if Google (and other search engines) can access and index your site, go to Google Search Console. This free set of webmaster tools lets you check and fix site issues, measure traffic and performance, and easily deal with most of the problems a website may face.
Note: If you haven’t connected your website to GSC yet, it’s high time you did. Google provides detailed setup instructions.
In Search Console, go to Index > Coverage report. Pay attention to the Error and Valid with warning sections.
Scroll down the Coverage report to the Details section to see the exact errors and warnings detected on your website.
To see the pages affected by an error, click on the issue line.
Now look at the errors and warnings and see how to fix them.
Errors in the indexing process mean that Google tried to index the page but failed for some reason. The first thing to check is whether you actually want that URL indexed at all. If not, simply tell Google to stop trying — exclude the page from your sitemap and mark it with a noindex tag.
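To spot pages that already carry a noindex directive, you can scan their HTML for a robots meta tag. The following Python sketch does this with the standard library's HTML parser; the sample page markup is a hypothetical illustration, not output from any real site:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags a page as noindexed if its HTML contains
    <meta name="robots" content="...noindex...">."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

# Hypothetical page source for illustration
page = '<html><head><meta name="robots" content="noindex, nofollow"></head><body></body></html>'
detector = NoindexDetector()
detector.feed(page)
print(detector.noindex)  # True — the page asks search engines not to index it
```

Note that a page can also be noindexed via an `X-Robots-Tag` HTTP response header, which a meta-tag scan like this will not catch.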
If you’re sure that the page needs to be indexed, then fix the issue:
- Submitted URL marked ‘noindex’. Remove the noindex meta tag from the page’s HTML code, or remove the X-Robots-Tag: noindex header from the HTTP response.
- Submitted URL blocked by robots.txt. Update your robots.txt file so it no longer disallows the URL you want indexed.
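After editing robots.txt, you can sanity-check the rules locally before Google recrawls, using Python's standard `urllib.robotparser`. The robots.txt content and URLs below are hypothetical examples:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the /private/ directory for all crawlers
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard (*) rule here
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False — blocked
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True — crawlable
```

If a URL you submitted for indexing comes back `False` here, the Disallow rule blocking it is the one to remove or adjust.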
This article was written by Jamil Ali Ahmed and originally published on The Official Cloudways Blog.