You have verified your website in Google Search Console, and you want to run a website audit with the Ahrefs Site Audit Tool, but your audit keeps failing.
It might have worked the first time, but now, every time you try to audit your website, you get the same error.
I had this issue with my own website; each time I tried an audit, I got the message 'This website can't be crawled.'
Let’s take a look at some troubleshooting to fix this issue.
Troubleshooting
There are a few steps we can work through to avoid this issue and make sure the Ahrefs Site Audit Tool can successfully audit your website.
Update your Robots.txt file
You might need to update your Robots.txt file. This is the file that tells website crawlers, such as Googlebot, which parts of your website they are allowed to crawl.
Check your Robots.txt file to see if you are allowing website crawlers access to your website.
A normal Robots.txt file looks like this:
User-agent: *
Disallow:
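If you are not sure what your live Robots.txt currently says, you can fetch and read it yourself. Here is a minimal sketch in Python, using only the standard library, with example.com standing in for your own domain:

# Fetch and print a site's robots.txt so you can see exactly what crawlers are told.
# "https://example.com" is a placeholder, swap in your own domain.
from urllib.request import urlopen

with urlopen("https://example.com/robots.txt") as response:
    print(response.read().decode("utf-8"))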
Add the Ahrefs Crawler to your Robots.txt file
1 – Go to your Robots.txt file
If you have a WordPress website, you can find this through the Yoast Plugin. Go to Yoast SEO, click on Tools, then click on File editor. You will now see your Robots.txt file at the top.
If you don’t have WordPress, you might need to ask your web developer for help. If you use a website builder, like Wix or Squarespace, you can find the Robots.txt file in the admin section of your website.
On a WordPress website, a Robots.txt file will usually look something like this:
User-agent: *
Disallow: /wp-admin/
The * on the User-agent: line means that the rules below apply to every crawler. To be sure Ahrefs is covered, we can add its crawler explicitly.
The Disallow: lines list the pages and paths that you don't want a crawler to look at. For example, if you have a hidden page, or pages that you do not want to appear in a search engine, you can add them with Disallow:, one row at a time.
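If you want to see how a crawler would actually interpret those rules, you can run a sample file through Python's built-in robots.txt parser. This is just an illustrative sketch, and the paths in it are made up:

# Parse a made-up robots.txt and check which URLs a crawler may fetch.
from urllib.robotparser import RobotFileParser

sample_rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /hidden-page/",
]

parser = RobotFileParser()
parser.parse(sample_rules)

# can_fetch(user_agent, url) tells you whether that crawler may request that URL.
print(parser.can_fetch("Googlebot", "https://example.com/blog/"))         # True, not disallowed
print(parser.can_fetch("Googlebot", "https://example.com/hidden-page/"))  # False, listed under Disallow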
2 – Add the Ahrefs Crawler
This is the Ahrefs Crawler:
User-agent: AhrefsSiteAudit
Allow: /
To add it to your Robots.txt file, start a new line and paste in the code above, so the file looks like this:
User-agent: *
Disallow: /wp-admin/
User-agent: AhrefsSiteAudit
Allow: /
3 – Save changes and try your Audit again
Save the changes to your Robots.txt file and refresh the Ahrefs Site Audit Tool – now try your audit again.
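Before re-running the audit, you can also do a quick local check that the new rule is live and that the AhrefsSiteAudit user agent is now allowed. A small sketch, again using Python's standard library and example.com as a placeholder for your domain:

# Read the live robots.txt and confirm the Ahrefs Site Audit crawler may fetch the homepage.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder, use your own domain
parser.read()  # downloads and parses the live file

if parser.can_fetch("AhrefsSiteAudit", "https://example.com/"):
    print("AhrefsSiteAudit is allowed, try the audit again")
else:
    print("AhrefsSiteAudit is still blocked by robots.txt, re-check your changes")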
Still no luck? Try the next solution.
Test that your website can be crawled by Ahrefs
Ahrefs has a useful tool that you can use to check whether its crawler can reach your website.
Go to ahrefs.com/robot/site-audit and enter your full website URL. This will tell you if your website can be crawled.
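The Ahrefs checker is the authoritative test, but you can also probe your own server by sending a request that identifies itself with the AhrefsSiteAudit user agent token and looking at the status code. This is only a rough sketch: the real crawler sends a longer user-agent string, and a firewall may treat a request from your own machine differently from one coming from Ahrefs' IP addresses.

# Send a request identifying as AhrefsSiteAudit and report the HTTP status.
# A 403 or 406 here often means the server or a security plugin is blocking the user agent.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

request = Request(
    "https://example.com/",  # placeholder, use your own domain
    headers={"User-Agent": "AhrefsSiteAudit"},
)

try:
    with urlopen(request) as response:
        print("Status:", response.status)  # 200 means the server served the page
except HTTPError as error:
    print("Blocked or error, status:", error.code)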
Still getting the ‘This website can’t be crawled.’ error?
Try the next solution.
Whitelist the Ahrefs IP Address Range
Ahrefs uses an 'IP address range', which means its crawler visits your website from a set of different IP addresses rather than a single one.
In some cases, your hosting provider may have blocked these IP addresses, meaning that the Ahrefs Crawler can't audit your website.
You will need to open a support ticket, or contact your hosting provider directly, and send them the IP ranges that Ahrefs uses so they can whitelist them.
Ahrefs IP Address Range:
54.36.148.0/24
54.36.149.0/24
195.154.122.0/24
195.154.123.0/24
195.154.126.0/24
195.154.127.0/24
These might be subject to change, so please refer to this link for the most up-to-date information: https://help.ahrefs.com/en/articles/78658-what-is-the-list-of-your-ip-ranges
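If your host sends you a log of blocked requests, you can check whether a given IP actually belongs to these ranges with Python's ipaddress module. A small sketch using the ranges listed above (remember they may change):

# Check whether an IP address from your server logs belongs to the Ahrefs ranges listed above.
import ipaddress

AHREFS_RANGES = [
    "54.36.148.0/24",
    "54.36.149.0/24",
    "195.154.122.0/24",
    "195.154.123.0/24",
    "195.154.126.0/24",
    "195.154.127.0/24",
]

def is_ahrefs_ip(ip: str) -> bool:
    address = ipaddress.ip_address(ip)
    return any(address in ipaddress.ip_network(cidr) for cidr in AHREFS_RANGES)

print(is_ahrefs_ip("54.36.148.10"))  # True, inside 54.36.148.0/24
print(is_ahrefs_ip("203.0.113.7"))   # False, not an Ahrefs range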
Did that fix your issue? I hope so. There are no other troubleshooting solutions from Ahrefs at the moment, but whitelisting the IP address range seems to be the most common fix.
Let me know how you get on, and share this troubleshooting guide with your friends and colleagues ⬇
4 comments
My website is still not crawlable. Can you tell me the issue?
Hello, on your website patracorp.com, you have not included the Ahrefs crawler in your Robots.txt file (see 'Add the Ahrefs Crawler to your Robots.txt file' above). If that still does not work, it could be that your hosting provider has blocked the Ahrefs IP addresses, and you will need to contact them (see 'Whitelist the Ahrefs IP Address Range').
I hope that helps.
Lots of thanks for sharing this informative post. I faced the same problem, and it is now solved by following your guidelines.
Thank you Mahadi, I am glad you found it useful 👍