AhrefsSiteAudit
Verify AhrefsSiteAudit IP Address
Verify whether an IP address truly belongs to Ahrefs using official verification methods. Enter both the IP address and the User-Agent from your logs for the most accurate bot verification.
AhrefsSiteAudit is a crawler used by Ahrefs to power its Site Audit tool, which lets users analyze websites for technical and on-page SEO issues. It performs structured crawls to detect problems such as broken links, redirect chains, crawlability issues, performance bottlenecks, and on-page SEO errors. The crawler is user-initiated: it runs only when an Ahrefs user configures or launches a site audit, and crawl activity depends on the audit's scope and settings rather than continuous discovery. Its purpose is diagnostic, not search indexing or competitive intelligence. RobotSense.io verifies the AhrefsSiteAudit bot using Ahrefs' official validation methods, ensuring only genuine AhrefsSiteAudit traffic is identified.
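The verification flow described above can be sketched in Python: confirm the User-Agent identifies AhrefsSiteAudit, then confirm the source IP falls inside Ahrefs' published IP ranges. The ranges below are placeholders from the reserved documentation blocks, not Ahrefs' actual ranges; always fetch the current list from Ahrefs' official documentation.

```python
import ipaddress

# Placeholder ranges for illustration only -- NOT Ahrefs' real ranges.
# Substitute the ranges published by Ahrefs before using this check.
AHREFS_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3 documentation block
    ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2 documentation block
]

def is_ahrefs_site_audit(ip: str, user_agent: str) -> bool:
    """Return True only if both the User-Agent and the source IP check out."""
    if "AhrefsSiteAudit" not in user_agent:
        return False
    try:
        addr = ipaddress.ip_address(ip)
    except ValueError:
        return False  # malformed IP in the log line
    return any(addr in net for net in AHREFS_RANGES)

ua = "Mozilla/5.0 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)"
# A request claiming to be AhrefsSiteAudit from an unlisted IP is rejected:
print(is_ahrefs_site_audit("192.0.2.10", ua))   # False: IP not in the ranges
print(is_ahrefs_site_audit("203.0.113.5", ua))  # True against the placeholder list
```

Checking both signals matters because the User-Agent string alone is trivially spoofed; only the IP check ties the request back to infrastructure Ahrefs actually controls.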
User Agent Examples
Mozilla/5.0 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)
Mozilla/5.0 (Linux; Android 13) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.5359.128 Mobile Safari/537.36 (compatible; AhrefsSiteAudit/6.1; +http://ahrefs.com/robot/site-audit)

Robots.txt Configuration for AhrefsSiteAudit
AhrefsSiteAudit
Use this identifier in your robots.txt User-agent directive to target AhrefsSiteAudit.
Recommended Configuration
Our recommended robots.txt configuration for AhrefsSiteAudit:
User-agent: AhrefsSiteAudit
Allow: /

Completely Block AhrefsSiteAudit
Prevent this bot from crawling your entire site:
User-agent: AhrefsSiteAudit
Disallow: /

Completely Allow AhrefsSiteAudit
Allow this bot to crawl your entire site:
User-agent: AhrefsSiteAudit
Allow: /

Block Specific Paths
Block this bot from specific directories or pages:
User-agent: AhrefsSiteAudit
Disallow: /private/
Disallow: /admin/
Disallow: /api/

Allow Only Specific Paths
Block everything but allow specific directories:
User-agent: AhrefsSiteAudit
Disallow: /
Allow: /public/
Allow: /blog/

Set Crawl Delay
Limit how frequently AhrefsSiteAudit can request pages by setting a minimum delay between requests, in seconds:
User-agent: AhrefsSiteAudit
Allow: /
Crawl-delay: 10

Note: This bot officially honors the Crawl-delay directive.
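As a quick sanity check on a snippet like the one above, Python's standard urllib.robotparser can parse the rules and report the crawl delay that applies to AhrefsSiteAudit. This is a sketch using the stdlib parser, not Ahrefs' own robots.txt implementation, so treat the results as a rough approximation of how the directives will be read.

```python
from urllib.robotparser import RobotFileParser

# The same rules as the "Set Crawl Delay" example above.
robots_txt = """\
User-agent: AhrefsSiteAudit
Allow: /
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
rp.modified()  # mark the rules as loaded so queries are answered

print(rp.can_fetch("AhrefsSiteAudit", "/blog/post"))  # True: everything is allowed
print(rp.crawl_delay("AhrefsSiteAudit"))              # 10
```

The same pattern works for the blocking examples: swap in the Disallow rules and query can_fetch() for the paths you care about before deploying the file.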