You have verified your website in Google Search Console, and you want to run a website audit with the Ahrefs Site Audit Tool, but your audit keeps failing.
It might have worked the first time, but now every time you try to audit your website, you get the same error. I had this issue with my own website; each time I tried an audit, I got this message: ‘This website can’t be crawled.’
Let’s take a look at some troubleshooting to fix this issue.
There are a few steps we can take to avoid this issue and make sure the Ahrefs Site Audit Tool can successfully audit your website.
Update your Robots.txt file
You might need to update your Robots.txt file. This is the file that tells website crawlers, such as Googlebot, what can be crawled on your website.
Check your Robots.txt file to see if you are allowing website crawlers access to your website.
A normal Robots.txt file looks like this:
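As a sketch, a simple, permissive Robots.txt (the exact rules will vary from site to site) might read:

```
User-agent: *
Disallow:
```

An empty Disallow: line means nothing is blocked, so any crawler can access the whole site.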
Add the Ahrefs Crawler to your Robots.txt file
1 – Go to your Robots.txt file
If you have a WordPress website, you can find this through the Yoast SEO plugin: go to Yoast SEO, click on Tools, then click on File editor. You will now see your Robots.txt file at the top.
If you don’t have WordPress, you might need to ask your web developer for help. If you use a website builder, like Wix or Squarespace, you can find the Robots.txt file in the admin section of your website.
A Robots.txt file should look something like this: (WordPress website)
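As a sketch, a typical WordPress Robots.txt (Yoast’s default output may differ slightly) looks something like:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```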
The * on the User-agent: line means that any crawler has access to this website – to be sure, we can add the Ahrefs Crawler explicitly.

The Disallow: section can list pages and links that you don’t want a crawler to look at. For example, if you have a hidden page, or pages that you do not want to appear on a search engine, you can add them with Disallow: – one per line.
2 – Add the Ahrefs Crawler
The Ahrefs Crawler identifies itself with the user-agent AhrefsBot. To allow it, add a new section to your Robots.txt that names this user-agent.
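A sketch of the section to add, using the AhrefsBot user-agent from the bot details further down this page:

```
User-agent: AhrefsBot
Allow: /
```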
3 – Save changes and try your Audit again
Save the changes to your Robots.txt file and refresh the Ahrefs Site Audit Tool – now try your audit again.
Still no luck? Try the next solution.
Test that your Robots.txt file can be crawled by Ahrefs
Ahrefs have a useful tool that you can use to check if they can crawl your website.
Go to ahrefs.com/robot/site-audit and enter your full website URL. This will tell you if your website can be crawled.
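You can also sanity-check your Robots.txt rules yourself with Python’s standard-library parser. The rules below are an illustrative example, not your real file; to test your live file, point set_url() at https://yoursite.com/robots.txt and call read() instead of parse().

```python
from urllib import robotparser

# Parse some example robots.txt rules and check whether AhrefsBot
# is allowed to fetch a given URL.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: AhrefsBot",
    "Allow: /",
    "",
    "User-agent: *",
    "Disallow: /wp-admin/",
])

# AhrefsBot matches its own group above, so the homepage is crawlable
print(rp.can_fetch("AhrefsBot", "https://example.com/"))  # True
```

If can_fetch() returns False for your homepage, your Robots.txt is what is blocking the audit.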
Still getting the ‘This website can’t be crawled.’ error?
Try the next solution.
Whitelist the Ahrefs IP Address Range
Ahrefs uses an ‘IP address range’ – this means that they crawl your website from a number of different IP addresses.
In some cases, your hosting provider may have blocked these IP addresses, meaning that the Ahrefs Crawler can’t audit your website.
You will need to send a support ticket, or contact your hosting provider directly, and send them the IP range that Ahrefs use.
Ahrefs IP Address Range:
These may change over time, so please refer to this link for the most up-to-date information – https://help.ahrefs.com/en/articles/78658-what-is-the-list-of-your-ip-ranges
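If your hosting provider asks what the whitelist rule should look like, an Apache 2.4 allow-rule is sketched below. The range shown is a reserved documentation placeholder, not a real Ahrefs range – substitute the current ranges from the link above:

```
# Placeholder range (203.0.113.0/24 is reserved for documentation).
# Replace with the current Ahrefs IP ranges before use.
<RequireAny>
    Require ip 203.0.113.0/24
</RequireAny>
```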
Error 406 Not Acceptable: Firewall
If you keep getting an error message that says ‘Error 406 Not Acceptable: Firewall’, it could be an issue with the configuration of your web server, the firewall managed by your hosting provider, or the protection settings of your CDN.
There is nothing Ahrefs can do to fix this; you will have to contact your hosting provider.
Here is a template you can use to email your hosting provider:
I own the domain <your full website address here> and I would like to use the Ahrefs Site Audit Tool. It seems that your firewall is blocking the Ahrefs Bot from crawling my website. Can you please unblock the crawler?
Please see more details about the crawler here: https://ahrefs.com/robot
Thank you very much.
Bot Type: Good (Identifies itself, has an official moniker)
Obeys Robots.txt: Yes
Obeys Crawl Delay: Yes
User-Agent String: Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)
Reverse DNS suffix: ahrefs.com
Details updated as of April 2021.
Did that fix your issue? I hope so. There are no other troubleshooting solutions from Ahrefs at the moment, but whitelisting the IP address range seems to be the most common fix.
Let me know how you get on, and share this troubleshooting guide with your friends and colleagues ⬇