I have used a lot of different search engine optimization (SEO) audit tools over the past two decades, ranging from enterprise solutions like BrightEdge and Conductor to lifetime deals like iSpionage, Serpstat, and Screpy. Hands down, the best tool I have used, and continue to use, is the audit tool within the Semrush platform.
The following image alone shows how many things it checks for, far more than any competing solution. I have also listed these items alphabetically below the image so you can more easily compare them against whatever you are using today.
I find it helpful to run this audit on the first day of each month, typically at night to avoid impacting client traffic. Comparing the results against the previous month's audit makes it easy to spot major changes and address anything that stands out. Throughout the month, I tackle errors, warnings, and notices as time allows, then reset the process for the following month. This approach has helped me steadily reduce the overall issue count while keeping errors close to zero.
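If you export each month's audit results to CSV, that month-over-month comparison can be scripted. Here is a minimal Python sketch of the diff; the file names and the "Issue"/"Count" column headers are my assumptions, so adjust them to match your actual export.

```python
import csv

def load_issue_counts(path):
    """Read an exported audit CSV into {issue_name: count}."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Issue"]: int(row["Count"]) for row in csv.DictReader(f)}

# Hypothetical file names for two monthly exports.
previous = load_issue_counts("audit-2024-05-01.csv")
current = load_issue_counts("audit-2024-06-01.csv")

# Print every issue whose count changed so regressions stand out.
for issue in sorted(set(previous) | set(current)):
    before, after = previous.get(issue, 0), current.get(issue, 0)
    if before != after:
        print(f"{issue}: {before} -> {after} ({after - before:+d})")
```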
GENERAL
Pages crawled
Site Health
Total errors
Total issues
Total notices
Total warnings
ERRORS
4xx errors
5xx errors
Broken canonical URLs
Broken internal images
Broken internal JavaScript and CSS files
Broken internal links
Certificate Expiration
Certificate registered to incorrect name
DNS resolution issue
Duplicate content
Duplicate meta descriptions
Duplicate title tags
Hreflang conflicts with incorrect hreflang tags
Hreflang conflicts within page source code
Incorrect pages found in sitemap.xml
Insecure encryption algorithms
Invalid robots.txt format
Invalid sitemap.xml format
Invalid structured data items
Issues with hreflang values
Issues with incorrect hreflang links
Issues with mixed content
Large HTML page size
Malformed links
Meta refresh redirects
Missing canonical tags in AMP pages
Multiple canonical URLs
Neither canonical URL nor 301 redirect from HTTP homepage
Non-secure pages
Old security protocol version
Pages not crawled
Redirect chains and loops
Slow page (HTML) load speed
Too large sitemap.xml
Viewport not configured
Viewport not set
We couldn’t open the page’s URL
www resolve issues
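Several of these error checks are easy to spot-check yourself between audits. As an illustration only (Semrush's crawler does far more than this), here is a short Python sketch that flags 4xx/5xx responses and redirect chains for a handful of URLs; the URL list and hop threshold are placeholders.

```python
import requests

URLS = ["https://example.com/", "https://example.com/old-page"]  # placeholders
MAX_HOPS = 3  # flag redirect chains longer than this

for url in URLS:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    if resp.status_code >= 400:  # covers both 4xx and 5xx errors
        print(f"{url}: returned {resp.status_code}")
    if len(resp.history) > MAX_HOPS:  # history holds each intermediate redirect
        chain = " -> ".join(r.url for r in resp.history) + f" -> {resp.url}"
        print(f"{url}: redirect chain of {len(resp.history)} hops: {chain}")
```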
WARNINGS
Blocked internal resources in robots.txt
Broken external images
Broken external links
Broken internal links
Doctype not declared
Duplicate content in h1 and title
Encoding not declared
Frames used
HTTPS encryption not used
HTTP URLs in sitemap.xml for HTTPS site
Links lead to HTTP pages for HTTPS site
Long title element
Low text to HTML ratio
Low word count
Missing ALT attributes
Missing h1
Missing hreflang and lang attributes
Missing meta description
No SNI support
No SSL support
Nofollow attributes in outgoing internal links
Short title element
Sitemap.xml not found
Sitemap.xml not specified in robots.txt
Temporary redirects
Too large JavaScript and CSS total size
Too long JavaScript and CSS files
Too long title URLs
Too many JavaScript and CSS files
Too many on-page links
Too many URL parameters
Uncached JavaScript and CSS files
Uncompressed JavaScript and CSS files
Uncompressed pages
Underscores in URL
Unminified JavaScript and CSS files
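A few of the on-page warnings above are just as easy to verify by hand. This sketch checks a single page for missing ALT attributes, a missing h1, and a missing meta description; the URL is a placeholder, and it assumes the requests and beautifulsoup4 packages are installed.

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Images without an alt attribute (or with an empty one).
missing_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]
if missing_alt:
    print(f"Missing ALT attributes on {len(missing_alt)} image(s)")
if soup.find("h1") is None:
    print("Missing h1")
if soup.find("meta", attrs={"name": "description"}) is None:
    print("Missing meta description")
```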
NOTICES
Blocked by X-Robots-Tag: noindex HTTP header
Blocked external resources in robots.txt
Blocked from crawling
Broken external JavaScript and CSS files
External pages or resources with 403 HTTP status code
Hreflang language mismatch issues
Links with non-descriptive anchor text
Links with no anchor text
Multiple h1 tags
No HSTS support
Nofollow attributes in outgoing external links
Orphaned pages (Google Analytics)
Orphaned sitemap pages
Page Crawl Depth more than 3 clicks
Pages with only one internal link
Permanent redirects
Resources formatted as page links
Robots.txt not found
URLs longer than 200 characters
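And for completeness, two of the notice-level checks, overly long URLs and multiple h1 tags, sketched in the same style. Again, the URLs are placeholders, and this only illustrates what the audit flags rather than reproducing it.

```python
import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/some/deep/path"]  # placeholders

for url in urls:
    if len(url) > 200:  # the notice threshold listed above
        print(f"URL longer than 200 characters ({len(url)}): {url}")
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    h1_count = len(soup.find_all("h1"))
    if h1_count > 1:
        print(f"{url}: multiple h1 tags ({h1_count})")
```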