You have verified your website in Google Search Console, and you want to run a website audit with the Ahrefs Site Audit Tool, but your audit keeps failing.

It might have worked the first time, but now when you try to audit your website, you get the same error:

I had this issue with my own website; each time I tried an audit, I got this message:

Ahrefs can't crawl my website

Let’s take a look at some troubleshooting steps to fix this issue.


There are a few steps we can work through to avoid this issue, so that the Ahrefs Site Audit Tool can successfully audit your website.

Update your Robots.txt file

You might need to update your Robots.txt file. This is the file that tells website crawlers, such as Googlebot, what they can crawl on your website.

Check your Robots.txt file to see if you are allowing website crawlers access to your website.

A normal Robots.txt file looks like this:
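For example, a minimal Robots.txt file that allows every crawler looks something like this (the sitemap URL is a placeholder for your own):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```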

Add the Ahrefs Crawler to your Robots.txt file

1 – Go to your Robots.txt file

If you have a WordPress website, you can find this through the Yoast Plugin. Go to Yoast SEO, click on Tools, then click on File editor. You will now see your Robots.txt file at the top.

If you don’t have WordPress, you might need to ask your web developer for help. If you use a website builder, like Wix or Squarespace, you can find the Robots.txt file in the admin section of your website.

A Robots.txt file should look something like this: (WordPress website)
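Something along these lines, which is the default-style robots.txt that WordPress sites commonly serve (the exact contents of yours may differ):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```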

The * after User-agent: means the rules that follow apply to every crawler, so any crawler has access to this website – to be sure, we can add the Ahrefs Crawler explicitly.

The Disallow: section can list pages and links that you don’t want a crawler to look at. For example, if you have a hidden page, or pages that you do not want to appear in a search engine, you can add them with Disallow: – one per line.
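If you want to sanity-check what your Disallow rules actually block, you can test them offline with Python’s standard-library robots.txt parser – a quick sketch, using an illustrative robots.txt and placeholder URLs:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules (substitute your real file's contents)
robots_txt = """User-agent: *
Disallow: /hidden-page/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The * rule applies to every crawler, so the homepage is allowed...
print(rp.can_fetch("AhrefsSiteAudit", "https://www.example.com/"))              # True
# ...but the disallowed paths are blocked for all crawlers
print(rp.can_fetch("AhrefsSiteAudit", "https://www.example.com/hidden-page/"))  # False
```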

2 – Add the Ahrefs Crawler

This is the Ahrefs Crawler:
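The directives below name the crawler by its user-agent token, AhrefsSiteAudit (the same token listed in the bot details later in this post), and explicitly allow it everywhere:

```
User-agent: AhrefsSiteAudit
Allow: /
```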

To add it to your Robots.txt, add a new line, and add the code as above:
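After adding it, a simple Robots.txt file would look something like this:

```
User-agent: *
Disallow:

User-agent: AhrefsSiteAudit
Allow: /
```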

3 – Save changes and try your Audit again

Save the changes to your Robots.txt file and refresh the Ahrefs Site Audit Tool – now try your audit again.

Still no luck? Try the next solution.

Test whether your website can be crawled by Ahrefs

Ahrefs have a useful tool that you can use to check if they can crawl your website.

Go to the Ahrefs Robots.txt Tester and enter your full website URL. This will tell you if your website can be crawled.

Still getting the ‘This website can’t be crawled.’ error?


Try the next solution.

Whitelist the Ahrefs IP Address Range

Ahrefs uses an ‘IP address range’ – this means that they crawl your website from several different IP addresses.

In some cases, your hosting provider may have blocked these IP addresses, meaning that the Ahrefs Crawler can’t audit your website.

You will need to send a support ticket, or contact your hosting provider and send them the IP range that Ahrefs use.

Ahrefs IP Address Range:

These might be subject to change, so please refer to Ahrefs’ official documentation for the most up-to-date information.

Error 406 Not Acceptable: Firewall

If you keep getting an error message that says ‘Error 406 Not Acceptable: Firewall’, it could be an issue with the configuration of your webserver, the firewall managed by your hosting provider, or the protection of your CDN.

There is nothing that Ahrefs can do to fix this; you will have to contact your hosting provider.

Here is a template you can use to email your hosting provider:
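Something along these lines works well – replace the bracketed placeholders with your own details:

```
Subject: Please whitelist Ahrefs crawler IP addresses

Hi [support team],

I am running an SEO audit of my website, [your-domain.com], with the
Ahrefs Site Audit Tool, but the crawl is blocked with the error
'Error 406 Not Acceptable: Firewall'.

Could you please whitelist the IP address range used by the Ahrefs
crawler? The current list is published in Ahrefs' official
documentation.

Thank you,
[Your name]
```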

Ahrefs Crawler Bots

Ahrefs use 2 bots to crawl websites: their marketing bot and their SEO bot. The SEO bot is used for their Site Audit Tool, which checks the health of a website.

The marketing bot, meanwhile, is used more frequently to build an overview of backlinks and domain authority, and powers the Ahrefs keyword and link intersect tools.

Below are details for both bots, but you really only need to update your robots.txt file for the ‘AhrefsSiteAudit’ bot.

AhrefsSiteAudit Bot Details

AhrefsSiteAudit Bot
Version: 6.1
Bot Type: Good (Identifies itself, has an official moniker)
Category: SEO
Obeys Robots.txt: Yes by default (website owners can request to disobey robots.txt on their sites)
Obeys Crawl Delay: Yes by default (website owners can request to disobey crawl delay on their sites)
User-Agent String: Mozilla/5.0 (compatible; AhrefsSiteAudit/6.1; +
Reverse DNS suffix:

AhrefsBot Details

Version: 7.0
Bot Type: Good (Identifies itself, has an official moniker)
Category: Marketing
Obeys Robots.txt: Yes
Obeys Crawl Delay: Yes
User-Agent String: Mozilla/5.0 (compatible; AhrefsBot/7.0; +
Reverse DNS suffix:

Details updated as of June 2021.

Did that fix your issue? I hope so. There are no other troubleshooting solutions from Ahrefs at the moment, but it seems that whitelisting the IP address range is the most common fix.

Let me know how you get on, and share this troubleshooting guide with your friends and colleagues.
