Basic Guide To SEO Audits

20th April 2018 | SEO

When I started working in SEO, I had never done an SEO audit. I understood the principles of what an SEO audit was, but I never got the opportunity to do one myself. My first SEO audit was a long and tedious process, as I had to get to grips with all the tools available to me. Now, I am in no way insinuating it is an easy task; regardless of how often you have done it before, it is not easy. But if you know what to check for on a website and, more importantly, which tools are available to you, your SEO audit will be a lot easier.

While search engine algorithms are ever changing, the basics of SEO have generally remained the same. Google provides plenty of free tools that are an excellent starting point and, more often than not, are sufficient for a decent SEO audit.

Crawl, Read and Index

First things first, you need to make sure Google can crawl, read and index your pages. Google makes this really easy for you. In Google Webmaster Tools > Site Configuration > Sitemaps you will see a list of the sitemaps you have submitted for your site. If you don't see any, you may not have submitted a sitemap yet.

Here in Sitemaps you will be able to see whether your website has any indexing problems: compare the number of "URLs submitted" to "URLs in web index". Ideally these numbers should match, but often they won't, as Google only checks the pages listed in the sitemap and some of those may be blocked by the robots.txt. Another common reason is duplicate URLs in the sitemap, as the index will only ever show unique URLs.
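If you want to sanity-check a sitemap outside Webmaster Tools, counting total versus unique URLs takes only a few lines. Here is a minimal Python sketch (the sitemap_url_counts helper is my own illustration, not part of any Google tool):

```python
import xml.etree.ElementTree as ET

# Standard XML namespace used by sitemap.xml files
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_url_counts(sitemap_xml: str):
    """Return (total, unique) counts of <loc> entries in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
    return len(locs), len(set(locs))
```

If the two numbers differ, the sitemap itself contains duplicate URLs, which would explain part of the gap between "URLs submitted" and "URLs in web index".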

Another thing to check is whether the pages are getting crawled, by looking at Google's cached copy of the website. The easiest way to do this is to type "cache:" followed by the URL (for example, cache:example.com) into Google. At the top, the cached page will show you when Google last crawled it. You want the search engines to crawl a website frequently, so keep it updated.

While you are in the cached version of the website, check whether the content is visible to the search engines: click on "Text-only version", which disables JavaScript, CSS and cookies. Compare the text version (how the search engines see the page) to the user version. If the website displays different content to the search engines than it does to users, this is called "cloaking" and can result in penalties for the website.


Robots.txt

A robots.txt file can help you block search engines from crawling certain pages or folders of a website. Used incorrectly, a robots.txt can cause all sorts of problems. If you look at the cache of a page and there is no cached copy, or it hasn't been crawled for a long time, check the robots.txt to see whether the page has been blocked.

Again, Webmaster Tools makes it really easy to see which pages or folders are being blocked. Select your website and go to Site Configuration > Crawler Access. It should be fairly easy to see whether the robots.txt is blocking any pages or folders that shouldn't be hidden from the search engines.
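To check programmatically which URLs a robots.txt blocks, Python's standard library already ships a parser. A quick sketch, assuming you have fetched the robots.txt content yourself (the blocked_paths wrapper is my own):

```python
from urllib.robotparser import RobotFileParser

def blocked_paths(robots_txt: str, paths, user_agent: str = "Googlebot"):
    """Return the subset of paths that the robots.txt disallows for user_agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [p for p in paths if not parser.can_fetch(user_agent, p)]
```

For example, with a robots.txt of "User-agent: *" / "Disallow: /private/", the path /private/data would be reported as blocked while /page would not.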


Canonicalisation

A canonicalisation issue arises when multiple URLs serve the same content, which is a form of duplicate content. For example, the following URLs may all resolve to the same homepage:

http://example.com
http://www.example.com
https://www.example.com
http://www.example.com/index.html

All of these pages will most likely show the same content, and external websites can end up linking to any of them.

This is a big deal, as it dilutes the value of the website: links that would otherwise all point to one URL are instead spread across several different URLs, which weakens the rankings. Checking your canonicalisation is crucial; otherwise you might end up with identical pages competing with each other.

Identifying this problem is also pretty easy: copy a unique sentence from the content and paste it into Google inside quotes. Ideally only one page should show up, the one you took the sentence from. If you see more results, however, then you might have duplicate content issues. In that case, check the URLs and see whether the duplication is caused by a canonicalisation issue.
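When you do find several URL variants serving the same page, it helps to agree on a single canonical form. The normalisation policy below (prefer https, strip www, drop index files and trailing slashes) is just one hypothetical convention, sketched in Python:

```python
from urllib.parse import urlparse, urlunparse

def normalise(url: str) -> str:
    """Collapse common URL variants to one hypothetical canonical form."""
    parts = urlparse(url.lower())
    host = parts.netloc.removeprefix("www.")  # assumption: non-www is canonical
    path = parts.path
    # Treat /index.html and /index.php as the directory itself
    if path.endswith(("/index.html", "/index.php")):
        path = path[: path.rfind("/") + 1]
    # Drop trailing slashes except for the root
    if path != "/" and path.endswith("/"):
        path = path.rstrip("/")
    if not path:
        path = "/"
    return urlunparse(("https", host, path, "", "", ""))
```

With this policy, http://www.example.com/index.html and https://example.com/ both normalise to the same string, so the variants can be spotted as one page.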

Check Redirects/Crawl Errors

Webmaster Tools can be helpful here too, by showing what errors Google encounters when crawling the website. In Diagnostics > Crawl Errors you will find a list of the problems Google finds while crawling your site. Expect to see 404s, nofollow issues, redirects, timeouts, etc. You will also see if an external link is pointing to the wrong URL.

Check the response of each redirected URL with an HTTP headers check.
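Once you have those headers, a small helper can walk a redirect chain for you. A sketch over a simulated map of URL to (status, Location) pairs; in practice you would gather these with a headers check such as curl -I, and the redirect_chain function here is purely illustrative:

```python
def redirect_chain(url, responses, max_hops=10):
    """Follow 301/302 hops recorded in `responses` and return (chain, final_status).

    `responses` maps each URL to a (status_code, location) pair, as collected
    from an HTTP headers check; unknown URLs are assumed to return 200.
    """
    chain = [url]
    while len(chain) <= max_hops:
        status, location = responses.get(chain[-1], (200, None))
        if status in (301, 302) and location:
            chain.append(location)
        else:
            return chain, status
    raise RuntimeError("redirect chain too long - possible loop")
```

Long chains, and 302s where a permanent 301 was intended, are the usual things to flag.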

Title Tags/H1 Tags

Should there be any issues with duplicate, missing or short title tags, H1 heading tags or meta descriptions, then Google Webmaster Tools can quickly help you. Simply go to Diagnostics > HTML Suggestions and you will find a list of the problems that Google has found.

If there are duplicate title tags, that might lead to "keyword cannibalisation", which means that more than one page on your site is competing to rank for the same keyword. While you are checking for duplicate title tags, make sure the keyword or search term comes before the brand name.
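Spotting duplicate titles across a crawl is straightforward once you have a mapping of URLs to title tags. A minimal sketch (the duplicate_titles helper and its case-insensitive comparison are my own assumptions, not part of Webmaster Tools):

```python
from collections import defaultdict

def duplicate_titles(pages: dict):
    """Group pages by title (case-insensitive) and keep only titles used more than once."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}
```

Any title that maps to more than one URL is a keyword-cannibalisation candidate worth reviewing.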

Author: Peter Wootton

Top Ranking SEO Expert & Consultant - Ranked Top On Google For "SEO Manchester". Specialist in Technical SEO.
