Perform a technical SEO sanity audit

Contributors

@belovinovgmail-com


Business Benefits

Find and fix SEO issues to improve discoverability.


Run your website through a crawler to find errors. Prioritize errors and warnings in your list of needed fixes.

Use a website crawler like Screaming Frog to emulate what a web crawler like Googlebot would see, then analyze the results to decide what to fix first. Issues generally fall into one of three categories: errors, warnings, and notices, with errors being the most severe. Examples (a triage sketch follows the list):

  • Errors: Important pages that are non-indexable or have crawling issues.
  • Warnings: Pages with a low word count or temporary redirects that should be made permanent.
  • Notices: Pages with low internal link count or pages with multiple H1 tags.
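
If you export the crawl to CSV, you can triage it in bulk. Below is a minimal Python sketch that buckets a Screaming Frog "Internal" export into the three categories above. The file name, the column names (Address, Status Code, Indexability, Word Count, H1-2), and the 300-word threshold are assumptions based on a typical export and may differ between versions.

```python
import csv

def triage(path="internal_all.csv"):
    """Bucket crawled URLs into errors, warnings, and notices."""
    buckets = {"errors": [], "warnings": [], "notices": []}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get("Address", "")
            status = (row.get("Status Code") or "").strip()
            words = int((row.get("Word Count") or "0").replace(",", ""))
            if status.startswith(("4", "5")):
                buckets["errors"].append(url)       # broken pages
            elif status == "302":
                buckets["warnings"].append(url)     # temporary redirects
            elif row.get("Indexability", "").lower() == "non-indexable":
                buckets["errors"].append(url)       # crawlable but not indexable
            elif words < 300:
                buckets["warnings"].append(url)     # thin content
            elif (row.get("H1-2") or "").strip():
                buckets["notices"].append(url)      # multiple H1 tags
    for level, urls in buckets.items():
        print(f"{level}: {len(urls)} pages")
    return buckets

triage()
```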

Use the search operator site:domain.com to see how many pages are currently in Google’s index.

This will show which pages are in Google’s index and give you an idea of the breadth of pages Google and other search engines know about. You can also use the Coverage section under the Index report in Google Search Console to see the pages and their given status, for example, error, valid, and excluded.

Add the -inurl:https operator to look for insecure pages in Google’s index.

Use a search operator to see which pages are insecure in Google’s index: site:www.verblio.com -inurl:https

Make sure these redirect to secure pages. Some content management systems, like WordPress, will intermittently serve HTTP pages. Work with a developer to remove those and to eliminate redirect chains completely; chains slow down crawlers and are bad for your site health.
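
To spot chains and lingering HTTP hops yourself, a small script can follow each redirect manually instead of letting the HTTP client collapse them. A rough sketch using Python's requests library; the starting URL is a placeholder.

```python
from urllib.parse import urljoin
import requests

def trace_redirects(url, max_hops=10):
    """Follow redirects one hop at a time and record each (status, URL)."""
    hops = []
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        hops.append((resp.status_code, url))
        if resp.status_code not in (301, 302, 303, 307, 308):
            break
        url = urljoin(url, resp.headers["Location"])  # Location may be relative
    return hops

for code, hop in trace_redirects("http://www.example.com/old-page"):
    flag = "  <- insecure" if hop.startswith("http://") else ""
    print(f"{code}  {hop}{flag}")
```

More than one 3xx line in the output means a chain that should be collapsed into a single redirect.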

Review your sitemap to ensure that it is up to date, contains only indexable pages that return a 200 status code, and is submitted to Google Search Console.

XML sitemaps are the most popular format and should be generated automatically, not uploaded or updated by hand on a regular cadence. An XML sitemap should contain only indexable pages that return a 200 status code, and should never exceed 50,000 URLs or 50 MB in size. If you have more than 50,000 pages, create a sitemap index with child sitemaps under the parent index. The sitemap or sitemap index should live at the root of the site and be referenced in the robots.txt file as well.

Sitemap root: www.domain.com/sitemap.xml

Robots.txt sitemap callout: Sitemap: https://www.domain.com/sitemap.xml
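
To verify the status codes in bulk, the sketch below fetches a single sitemap and checks that every <loc> URL returns a 200. It assumes a plain sitemap rather than a sitemap index, and it only checks status codes; indexability still needs to be confirmed with your crawler. The sitemap URL is the placeholder from the examples above.

```python
import xml.etree.ElementTree as ET
import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url):
    """Flag any sitemap URL that does not return a clean 200."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
    bad = 0
    for url in urls:
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            bad += 1
            print(f"{status}  {url}")
    print(f"checked {len(urls)} URLs, {bad} without a 200 response")

check_sitemap("https://www.domain.com/sitemap.xml")
```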

Find 404 errors in Google Search Console using the Crawl Stats report.

Audit the total number of requests grouped by response code, crawled file type, crawl purpose, and Googlebot type. Always check the detailed information on host status. Review the URL examples to see where on your site requests occurred, and check the summary for properties with multiple hosts and for domain property support.
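
If you also have raw server logs, you can reproduce the response-code grouping outside of Search Console. A minimal sketch, assuming a combined-format access log at access.log and identifying Googlebot naively by user-agent substring (verify crawler IPs before acting on the numbers):

```python
import re
from collections import Counter

# Matches the request and status fields of a combined-format log line,
# e.g. '"GET /pricing HTTP/1.1" 404'.
pattern = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')
by_status = Counter()
examples = {}

with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        if "Googlebot" not in line:   # naive bot filter
            continue
        m = pattern.search(line)
        if m:
            by_status[m["status"]] += 1
            examples.setdefault(m["status"], m["path"])  # one sample URL each

for status, count in by_status.most_common():
    print(f"{status}: {count:>6}  e.g. {examples[status]}")
```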

Set up a 404 alerting system and create a custom 404 page for your site.

Use Google Analytics to set up a custom 404 report. Visit any 404 page on your site and note its page title, then create a filter in Google Analytics that shows the previous page path users visited before hitting a page with that 404 title.
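
As a complement to the Google Analytics report, you can proactively re-check pages that matter to you and alert as soon as one starts returning a 404. A minimal sketch; the URLs are placeholders, and in practice you would swap print() for an email or chat webhook and run the script on a schedule.

```python
import requests

# Placeholder list: your homepage, money pages, top blog posts, etc.
KEY_PAGES = [
    "https://www.example.com/",
    "https://www.example.com/pricing",
    "https://www.example.com/blog/",
]

def check_for_404s(urls):
    """Return (url, problem) pairs for pages that fail or return 404."""
    broken = []
    for url in urls:
        try:
            status = requests.get(url, timeout=10).status_code
        except requests.RequestException as exc:
            broken.append((url, f"request failed: {exc}"))
            continue
        if status == 404:
            broken.append((url, "returned 404"))
    return broken

for url, problem in check_for_404s(KEY_PAGES):
    print(f"ALERT  {url}: {problem}")  # replace with email/Slack notification
```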

Make sure you have a custom 404 page on your site showing things like top pages and a search box, so that people can find what they are looking for. Never redirect your 404 pages.

Perform a backlink audit using Ahrefs, Semrush, or Majestic.

A backlink audit covers anchor text analysis, competitor analysis, and checks for spammy links. Any of the mentioned tools will help you gauge a site's general authority, such as domain rating or domain authority, and the traffic it gets.

Majestic has a Trust Flow metric that is helpful because it aggregates multiple signals into one number. Take the links you have identified as clearly spammy and disavow them with Google's disavow tool.
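
The disavow tool accepts a plain-text file with one URL or domain: entry per line and # comments. A small sketch that writes such a file from a reviewed list; the domains and URLs below are placeholders for your own audit output.

```python
# Placeholders: replace with the domains/URLs flagged in your audit.
spammy_domains = ["link-farm.example", "cheap-seo.example"]
spammy_urls = ["https://blog-network.example/spun-article.html"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Disavow file generated from backlink audit\n")
    for domain in sorted(set(spammy_domains)):
        f.write(f"domain:{domain}\n")  # disavows every link from the domain
    for url in sorted(set(spammy_urls)):
        f.write(f"{url}\n")            # disavows a single linking page
```

Upload the resulting disavow.txt through the disavow tool in Google Search Console; only disavow links you are confident are spam.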

Test your site speed with a tool like Google PageSpeed Insights or WebPageTest.

The faster, the better; site load time should not exceed three seconds. Speed is affected by large file sizes, JavaScript issues, website responsiveness, and too many redirects. Core Web Vitals are an important aspect of site speed and user experience, combining largest contentful paint (LCP), cumulative layout shift (CLS), and first input delay (FID).
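
You can also pull lab metrics programmatically from the PageSpeed Insights v5 API (FID is a field metric and, when Chrome UX data exists, appears under loadingExperience in the same response). A minimal sketch; the Lighthouse audit IDs below are standard ones, but confirm them against the live JSON for your page.

```python
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def lab_metrics(page_url, strategy="mobile"):
    """Print key Lighthouse lab metrics for a page via the PSI v5 API."""
    params = {"url": page_url, "strategy": strategy}  # add "key" for heavy use
    data = requests.get(PSI, params=params, timeout=60).json()
    audits = data["lighthouseResult"]["audits"]
    for audit_id in ("largest-contentful-paint",
                     "cumulative-layout-shift",
                     "first-contentful-paint"):
        audit = audits[audit_id]
        print(f"{audit['title']}: {audit['displayValue']}")

lab_metrics("https://www.example.com/")
```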

Give your technical SEO audit notes to web developers, or to someone with the skill set to fix the issues.

Hand the errors, the most severe issues, to developers first so they can fix any pages with indexing or crawling issues. Annotate when something was fixed (Google Analytics annotations work well for this) to keep a log of what has been done, which makes analyzing the ROI easier.

