Sucuri

What is Sucuri?

About

Sucuri crawls websites to detect malware, security vulnerabilities, and blacklist status as part of its cloud-based website security platform and malware removal services. You can see how often Sucuri visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

Security scanners do not follow a predictable schedule when visiting websites. Their scans can be one-time, occasional, or recurring depending on the purpose of the scanner and the organization's security practices. The frequency and depth of their scans can vary based on factors like the visibility of the site on the public internet, past scan results, and inclusion in external threat intelligence feeds.

Type

Security Scanner
Scans websites to find vulnerabilities

Detail

Operated By Sucuri
Last Updated 14 hours ago

Insights

Top Website Robots.txts

0% of top websites are blocking Sucuri

Country of Origin

United States
Sucuri normally visits from the United States

Global Traffic

The percentage of all internet traffic coming from Security Scanners

Top Visited Website Categories

Health
Home and Garden
Shopping
Arts and Entertainment
Business and Industrial

Robots.txt

Should I Block Sucuri?

Probably not. Security scanners can be beneficial, especially if they're configured to report issues back to you.

How Do I Block Sucuri?

You can block Sucuri or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
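One way to check compliance yourself is to scan your server's access logs for requests whose user agent mentions Sucuri. The sketch below assumes combined (Apache/Nginx-style) log lines; the matching token and sample paths are illustrative, not part of any documented format for Sucuri's requests.

```python
import re

# Combined log format: the request line is the first quoted field,
# and the user agent is the last quoted field.
LOG_LINE = re.compile(
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def sucuri_requests(log_lines):
    """Yield (path, user_agent) for requests whose user agent mentions Sucuri."""
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and "sucuri" in m.group("ua").lower():
            yield m.group("path"), m.group("ua")
```

If paths you disallowed in robots.txt keep showing up here, the agent is not honoring your rules.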

User Agent String

Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2) Gecko/20100115 Firefox/3.6 MSIE 7.0 Sucuri Integrity Monitor/2.4

# In your robots.txt ...

User-agent: Sucuri # https://darkvisitors.com/agents/sucuri
Disallow: /

How Do I Block All Security Scanners?

Serve a continuously updating robots.txt that blocks new security scanners automatically.

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
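Serving a continuously updating robots.txt amounts to fetching it from an upstream source and caching it for a while instead of reading a static file. A minimal sketch of that cache layer, assuming an illustrative upstream URL and TTL (not a documented Dark Visitors endpoint):

```python
import time
import urllib.request

class RobotsTxtCache:
    """Return an upstream robots.txt body, re-fetching it after a TTL expires."""

    def __init__(self, fetch, ttl_seconds=3600):
        self.fetch = fetch            # callable returning the latest robots.txt text
        self.ttl = ttl_seconds
        self._body = None
        self._fetched_at = 0.0

    def get(self):
        now = time.time()
        if self._body is None or now - self._fetched_at >= self.ttl:
            self._body = self.fetch()  # refresh from upstream
            self._fetched_at = now
        return self._body

def fetch_from_upstream(url="https://example.com/robots.txt"):  # illustrative URL
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")
```

Your web framework's /robots.txt handler would call cache.get(); in production you would also want to fall back to the last cached body if a refresh fails.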
