ICC-Crawler

What is ICC-Crawler?

About

ICC-Crawler is a research crawler operated by NICT (Japan's National Institute of Information and Communications Technology) that automatically collects web pages from the Internet for academic research. You can see how often ICC-Crawler visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

It's generally unclear how AI data scrapers choose which websites to crawl and how often to crawl them. They might choose to visit websites with a higher information density more frequently, depending on the type of AI model they're training. For example, it would make sense for an agent gathering training data for an LLM (Large Language Model) to favor sites with a lot of regularly updated text content.

Type

AI Data Scraper
Downloads web content to train AI models

Detail

Operated By NICT
Last Updated 14 hours ago

Insights

Top Website Robots.txts

3%
3% of top websites are blocking ICC-Crawler

Country of Origin

Japan
ICC-Crawler normally visits from Japan

Global Traffic

The percentage of all internet traffic coming from AI Data Scrapers

How Do I Get These Insights for My Website?
Use the WordPress plugin, Node.js package, or API to get started in seconds.

Robots.txt

Should I Block ICC-Crawler?

It's up to you. AI data scrapers usually download publicly available internet content, which is freely accessible by default. However, you might want to block them if you're concerned about attribution or how your creative work could be used in the resulting AI model.

How Do I Block ICC-Crawler?

You can block ICC-Crawler or limit its access by setting user agent token rules in your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it's actually following them.
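If you'd rather check your server logs directly, the idea can be sketched in a few lines of Python. This is a minimal illustration, not the Dark Visitors product: it assumes a common Nginx/Apache "combined" log format, and the sample log lines and paths are invented for demonstration.

```python
import re

# Hypothetical example: find requests made by ICC-Crawler in an access log
# written in the common Nginx/Apache "combined" format, so you can see
# whether it is still fetching paths your robots.txt disallows.
# The log format and sample lines below are assumptions; adjust for your server.
LOG_LINE = re.compile(
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def icc_crawler_hits(lines, disallowed_prefix="/"):
    """Return the paths ICC-Crawler fetched under a disallowed prefix."""
    hits = []
    for line in lines:
        m = LOG_LINE.search(line)
        if m and "ICC-Crawler" in m.group("ua") and m.group("path").startswith(disallowed_prefix):
            hits.append(m.group("path"))
    return hits

sample = [
    '1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET /article HTTP/1.1" 200 512 "-" "ICC-Crawler/3.0 (Mozilla-compatible; ; https://ucri.nict.go.jp/en/icccrawler.html)"',
    '5.6.7.8 - - [01/Jan/2025:00:00:01 +0000] "GET /article HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(icc_crawler_hits(sample))  # only the ICC-Crawler request matches
```

Matching on the "ICC-Crawler" token rather than the full user agent string keeps the check working across crawler version bumps.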

How Do I Block All AI Data Scrapers?
Serve a continuously updating robots.txt that blocks new AI data scrapers automatically.
User Agent String

ICC-Crawler/3.0 (Mozilla-compatible; ; https://ucri.nict.go.jp/en/icccrawler.html)

# In your robots.txt ...

User-agent: ICC-Crawler # https://darkvisitors.com/agents/icc-crawler
Disallow: /
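You can sanity-check rules like these before deploying them using Python's standard-library `urllib.robotparser`, which applies the same user-agent token matching that well-behaved crawlers use. The example URL here is illustrative.

```python
from urllib import robotparser

# Verify that a "User-agent: ICC-Crawler / Disallow: /" rule blocks
# ICC-Crawler while leaving other user agents unaffected.
rp = robotparser.RobotFileParser()
rp.modified()  # mark the rules as "read" so can_fetch() evaluates them
rp.parse([
    "User-agent: ICC-Crawler",
    "Disallow: /",
])

print(rp.can_fetch("ICC-Crawler/3.0", "https://example.com/any/page"))  # False
print(rp.can_fetch("Mozilla/5.0", "https://example.com/any/page"))      # True
```

Note the `rp.modified()` call: without it, a parser that has only had `parse()` called on it reports every URL as blocked, because it assumes the robots.txt hasn't been read yet.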

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.
