Once you run your Site Audit, you can open your Project and see your results in the Overview report. You’ll see an estimate of your website’s on-page health, pointing out all the issues you should be aware of. From here you can jump into reports analyzing your site's robots.txt, crawlability, HTTPS implementation, international SEO, performance (page load speeds), and internal linking.
Your Total Score (a percentage from 0% to 100%) takes into account the number of errors and warnings found during the crawl relative to the number of checks performed.
A higher Total Score is a good thing, reflecting a lower density of problems on your website. Errors have a greater impact on your Total Score than warnings, but both should be addressed to improve your site’s health.
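The relationship between issues, checks, and the score can be sketched as a simple ratio. This is only an illustrative sketch: SEMrush does not publish its exact formula, so the weighting below (errors penalized twice as heavily as warnings) is an assumption, not the product's actual calculation.

```python
# Illustrative only -- SEMrush's real scoring formula and weights are not public.
# The sketch assumes errors count more than warnings, measured against the
# number of checks performed, as the text above describes.

def illustrative_health_score(errors, warnings, checks,
                              error_weight=2.0, warning_weight=1.0):
    """Return a 0-100 score; fewer issues per check means a higher score."""
    if checks == 0:
        return 0.0
    penalty = (error_weight * errors + warning_weight * warnings) / (error_weight * checks)
    return round(max(0.0, 1.0 - penalty) * 100, 1)

# A site with 12 errors and 40 warnings across 500 checks scores well,
# because the density of problems is low.
print(illustrative_health_score(errors=12, warnings=40, checks=500))  # prints 93.6
```

Whatever the exact weights, the practical takeaway is the same: fixing errors moves the score more than fixing warnings.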
Alongside the Total Score is a breakdown of your site’s crawled pages showing exactly how many pages are healthy, are broken, have issues, are redirects, or blocked our crawler.
To the right, you will notice the total number of Errors, Warnings, and Notices on the site.
Errors will be shown in red and are the most severe issues on your site.
Warnings will be shown in orange and represent issues of medium severity.
Notices are shown in blue and are less severe than errors or warnings. They contain information that may be useful when fixing a site, but they don’t impact your overall site health score.
Below the errors, warnings, and notices you will see the Top issues from your audit, ranked by priority level and the number of pages affected by each issue. Click the button beside an issue to see the full list in the Issues tab.
To learn about the issues our crawler checks for, please refer to the manual article: What Issues Can Site Audit Identify?
Thematic reports dive into specific aspects of a website where common issues are found.
Crawlability refers to how easily search engine bots can crawl your website and find the information they’re looking for.
HTTPS will help you diagnose any issues related to a website’s transition from HTTP to HTTPS.
International SEO covers the best practices for global websites and checks a domain’s use of the hreflang tag for common misuses and errors.
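For context on what the hreflang check inspects, a typical setup looks like the following. The domain and paths are placeholders; each language or region version of a page should list the full set of alternates, including a self-reference, plus an x-default fallback, and common errors include missing return links and invalid language or country codes.

```html
<!-- Placeholder URLs for illustration only. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
<link rel="alternate" hreflang="de-de" href="https://example.com/de-de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```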
Performance evaluates a site’s page load speed, file minification and other issues related to a website’s speed and performance.
Internal linking looks at how well a site links between its pages. If you have any “orphaned” pages or pages with high click depth, they are often hard for your visitors to find. This report will tell you what those pages are so you can address the issue and improve your internal linking strategy.
For more information on reading each of these thematic reports, read this page of the user manual: Site Audit Thematic Reports.
Re-running Site Audit
After you have reviewed your issues and made adjustments on your site to fix them, you can re-run the Site Audit (1) or export your results as (2) a PDF report, or (3) an XLS or CSV file.
When you re-run your audit, you will receive a new Total Score, and the number of issues found will change if significant changes were made to your site. While the audit is re-running, you can stop it mid-crawl and either save or discard the partial results.
If you stop and save results in the middle of a crawl, the audit will present the results of its unfinished crawl as your up-to-date audit. This way, you avoid overspending your crawl budget.
If you discard the results, Site Audit will not crawl or audit your pages, and your crawl budget will be saved.
When you export to PDF, you’ll have the option to email, schedule, or brand the report with a logo. You can also take it a step further and create a custom PDF report (using the My Reports feature) around your Site Audit, incorporating additional research reports and blocks of text to fully personalize and mold the report into an effective document.
As your audit updates over time, you can follow any increases or decreases in your site’s health from this window. If you are making the recommended changes to your website, you can easily monitor how much your site improves.
A robots.txt file instructs bots (such as search engine crawlers) which content on your website they may crawl. This widget will tell you if SEMrush noticed any changes to your robots.txt file, as well as any issues related to the file that could impact the crawlability of your website.
Configuring a robots.txt file is a technical job, and you should follow Google’s guidelines as closely as possible. Click on the blue link to open a filtered Issues report showing the problems with your file and how to fix them. Click on the pop-out icon to open your site’s robots.txt.
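For reference, a minimal robots.txt using the standard directives might look like this. The domain and paths are placeholders for illustration, not recommendations for your site:

```text
# Placeholder example -- adapt paths to your own site.
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing: a disallowed URL can still appear in search results if other pages link to it.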