SiteAuditBot
What is SiteAuditBot?
SiteAuditBot crawls websites to check for over 130 SEO and technical issues, examining crawlability, markup, internal linking, performance, HTTPS, and international SEO factors for optimization insights. You can see how often SiteAuditBot visits your website by setting up Dark Visitors Agent Analytics.
Agent Type | SEO Crawler |
Expected Behavior
SEO crawlers analyze websites to gather search optimization data like keyword rankings, backlink profiles, site structure, and page performance. Most of them also build up proprietary databases that power SEO analytics tools and competitive intelligence platforms. Crawl frequency varies based on factors like site authority, backlink popularity, ranking performance, and whether the site is actively monitored by the service's customers. These crawlers typically perform comprehensive site scans, following internal links to map site architecture and assess optimization opportunities.
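To illustrate the link-following behavior described above, here is a minimal sketch (not Semrush's actual crawler) of extracting same-site links from a page using only Python's standard library. The example URL and HTML are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class InternalLinkParser(HTMLParser):
    """Collects links that stay on the same host as the page being parsed."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.internal_links = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base_url, href)
        # Keep only links on the same host; external links are skipped
        if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
            self.internal_links.add(absolute)

# Hypothetical page fragment
page = """
<a href="/pricing">Pricing</a>
<a href="https://example.com/blog">Blog</a>
<a href="https://other.example.org/">External</a>
"""

parser = InternalLinkParser("https://example.com/")
parser.feed(page)
print(sorted(parser.internal_links))
# → ['https://example.com/blog', 'https://example.com/pricing']
```

A real SEO crawler would repeat this step for each discovered URL, building a graph of the site's internal link structure.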
Detail
Operated By | Semrush |
Last Updated | 13 hours ago |
Insights
[Charts: Top Website Robots.txts · Country of Origin · Global Traffic (the percentage of all internet traffic coming from SEO Crawlers) · Top Visited Website Categories]
Robots.txt
Should I Block SiteAuditBot?
Probably not, especially if you benefit from an SEO service yourself. However, you might choose to block them if you're concerned about things like server resource usage.
How Do I Block SiteAuditBot?
You can block SiteAuditBot or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
User Agent String | Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25 (compatible; SiteAuditBot/0.97; +http://www.semrush.com/bot.html) |
# In your robots.txt ...
User-agent: SiteAuditBot # https://darkvisitors.com/agents/siteauditbot
Disallow: /
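To check whether requests really come from SiteAuditBot (and whether it respects your rules after you add them), you can look for its user agent token in your server's access logs. A minimal sketch, assuming Apache/Nginx combined log format; the sample log lines are hypothetical:

```python
import re

# Hypothetical access-log lines in combined log format
log_lines = [
    '203.0.113.5 - - [01/Jan/2025:12:00:00 +0000] "GET /pricing HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 ... (compatible; SiteAuditBot/0.97; +http://www.semrush.com/bot.html)"',
    '198.51.100.7 - - [01/Jan/2025:12:00:01 +0000] "GET / HTTP/1.1" 200 1024 "-" '
    '"Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]

# Match the bot's user agent token anywhere in the log line
bot_hits = [line for line in log_lines if re.search(r"SiteAuditBot", line)]
print(f"{len(bot_hits)} request(s) from SiteAuditBot")
# → 1 request(s) from SiteAuditBot
```

If requests to paths you disallowed keep appearing after the robots.txt change, the crawler is not honoring your rules and you may need to block it at the server or firewall level instead.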
⚠️ Manual Robots.txt Editing Is Not Scalable
New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.