Why did I get a “page is not accessible” note for some of my pages?
This message will appear if the On Page SEO Checker crawler was blocked or unable to crawl your page. To fix this issue, refer to the robots.txt file on your website and make sure that it allows user agents to crawl its pages.
The name of our robot to whitelist is SemrushBot-SI. To allow our crawler through your firewall, whitelist the following IP addresses:
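If the block is coming from robots.txt rather than a firewall rule, a minimal robots.txt entry along these lines should permit the crawler (the paths shown are placeholders; adjust them to your site):

```
# Allow the Semrush On Page SEO Checker crawler
User-agent: SemrushBot-SI
Allow: /

# Example: other bots can still be restricted separately
User-agent: *
Disallow: /private/
```

Note that robots.txt directives are matched by the most specific `User-agent` group, so an existing `User-agent: *` block with `Disallow` rules will not apply to SemrushBot-SI once it has its own group.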
To specify the Port, use one of the following options:
Port 80: HTTP
Port 443: HTTPS
Additionally, whitelist the Site Audit IP, which is used to crawl pages, at the following IP address: 18.104.22.168