What Is bnf.fr_bot?
bnf.fr_bot is the official web crawler of the Bibliothèque nationale de France (BNF), systematically collecting and archiving digital content from French websites to preserve France's national documentary heritage. You can see how often bnf.fr_bot visits your website by setting up Dark Visitors Agent Analytics.
Agent Type

Archiver
Expected Behavior
Archivers crawl websites to create historical snapshots for preservation purposes. They typically visit on a regular cadence to build a chronological record of how content changes over time. Crawl frequency varies based on site popularity and content update patterns. Unlike search crawlers, archivers aim to capture and store complete page states rather than extract information for indexing.
Detail
| Operated By | Bibliothèque nationale de France |
| Last Updated | 17 hours ago |
Country of Origin

France
Top Website Blocking Trend Over Time
The percentage of the world's top 1,000 websites that block bnf.fr_bot
Overall Archiver Traffic
The percentage of all internet traffic coming from archivers
User Agent String
| Example | Mozilla/5.0 (compatible; bnf.fr_bot; +https://www.bnf.fr/fr/capture-de-votre-site-web-par-le-robot-de-la-bnf) |
Access other known user agent strings and recent IP addresses using the API.
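If you want a rough sense of how often this agent shows up in your own server logs, a minimal sketch is to match the agent token in each log line. This assumes a standard access-log format where the User-Agent string appears somewhere on the line; for stricter matching you would parse the log fields first.

```python
import re

# Match the agent token anywhere in a log line. Case-insensitive, with
# the dot escaped so "bnf.fr_bot" is matched literally.
BNF_BOT = re.compile(r"bnf\.fr_bot", re.IGNORECASE)

def count_bot_hits(log_lines):
    """Count lines whose User-Agent field mentions bnf.fr_bot."""
    return sum(1 for line in log_lines if BNF_BOT.search(line))

lines = [
    '203.0.113.5 - - "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; bnf.fr_bot; +https://www.bnf.fr/fr/capture-de-votre-site-web-par-le-robot-de-la-bnf)"',
    '198.51.100.7 - - "GET /about HTTP/1.1" 200 "Mozilla/5.0"',
]
print(count_bot_hits(lines))
```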
Robots.txt
In this example, all pages are blocked. You can customize which pages are off-limits by swapping out / for a different disallowed path.
User-agent: bnf.fr_bot # https://darkvisitors.com/agents/bnf-fr-bot
Disallow: /
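As a sanity check, Python's standard `urllib.robotparser` can confirm that rules like the ones above behave as intended (`example.com` is a placeholder domain):

```python
from urllib.robotparser import RobotFileParser

# The same rules shown above, as they would appear in robots.txt.
rules = """\
User-agent: bnf.fr_bot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# bnf.fr_bot is disallowed everywhere...
print(parser.can_fetch("bnf.fr_bot", "https://example.com/any-page"))  # False
# ...while agents without a matching rule are unaffected.
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))   # True
```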
Frequently Asked Questions About bnf.fr_bot
Should I Block bnf.fr_bot?
It depends on your goals. Digital archiving preserves cultural and historical records for future generations. Most website owners appreciate being included in archives like the Wayback Machine or the BNF's own web archive. However, if you handle sensitive content or prefer not to have historical snapshots, you can block archivers.
How Do I Block bnf.fr_bot?
If you want to, you can block or limit bnf.fr_bot's access by configuring user agent token rules in your robots.txt file. The best way to do this is using Automatic Robots.txt, which blocks all agents of this type and updates continuously as new agents are released. While the vast majority of agents operated by reputable companies honor these robots.txt directives, bad actors may choose to ignore them entirely. In that case, you'll need to implement alternative blocking methods such as firewall rules or server-level restrictions. You can verify whether bnf.fr_bot is respecting your rules by setting up Agent Analytics to monitor its visits to your website.
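For the server-level fallback mentioned above, one minimal sketch is a WSGI middleware that returns 403 Forbidden when the User-Agent header contains the agent token. Note that user-agent strings can be spoofed, so this only stops honestly identified traffic; IP or firewall rules are needed for anything stronger.

```python
# Framework-agnostic WSGI middleware (hypothetical names). Wraps any
# WSGI app and short-circuits requests whose User-Agent mentions
# bnf.fr_bot. HTTP_USER_AGENT is the standard WSGI key for the header.
def block_bnf_bot(app):
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if "bnf.fr_bot" in ua:
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return middleware

def demo_app(environ, start_response):
    """Placeholder application standing in for your real site."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

app = block_bnf_bot(demo_app)
```

The same header check translates directly to nginx `if ($http_user_agent ...)` rules or Apache `mod_rewrite` conditions if you prefer blocking at the web-server layer.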
Will Blocking bnf.fr_bot Hurt My SEO?
Blocking archivers has no direct SEO impact since they don't influence search engine rankings. However, archived content can provide historical context and backlink opportunities. Some SEO tools also reference archived data for analysis, so blocking might limit certain SEO insights.
Does bnf.fr_bot Access Private Content?
Archivers typically focus on publicly accessible content to create historical records. They generally don't attempt to access password-protected or private content, as their goal is to preserve public web history. However, they may archive content that's publicly accessible but not intended for long-term preservation, such as temporary pages or draft content.
How Can I Tell if bnf.fr_bot Is Visiting My Website?
Setting up Agent Analytics will give you real-time visibility into bnf.fr_bot's visits to your website, along with hundreds of other AI agents, crawlers, and scrapers. This will also let you measure human traffic to your website coming from AI search and chat LLM platforms like ChatGPT, Perplexity, and Gemini.
Why Is bnf.fr_bot Visiting My Website?
bnf.fr_bot most likely discovered your site by following links during its crawls, or your site was submitted to the BNF's archiving service. Your content was selected for preservation either as part of broad web archiving efforts or because it was specifically nominated for historical preservation.
How Can I Authenticate Visits From bnf.fr_bot?
Agent Analytics authenticates visits from many known agents, telling you whether each visit actually came from that agent or was spoofed by a bad actor. This helps you identify suspicious traffic patterns and make informed decisions about blocking or allowing specific user agents.
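For reference, a common way to authenticate a crawler yourself is the reverse-DNS plus forward-confirmation check that large operators document for their own bots. The sketch below assumes the operator's crawl hosts resolve under a `bnf.fr` suffix, which you should verify against the BNF's own documentation before relying on it.

```python
import socket

def hostname_matches(hostname: str, expected_suffix: str) -> bool:
    """True if hostname equals the expected domain or is a subdomain of it."""
    return hostname == expected_suffix or hostname.endswith("." + expected_suffix)

def verify_crawler_ip(ip: str, expected_suffix: str = "bnf.fr") -> bool:
    """Reverse-DNS + forward-confirm check. The bnf.fr suffix is an
    assumption; substitute the operator's documented crawl domain."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)          # reverse lookup
    except OSError:
        return False
    if not hostname_matches(hostname, expected_suffix):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward confirm
    except OSError:
        return False
    return ip in forward_ips
```

The forward-confirmation step matters: reverse DNS alone can be faked by anyone who controls the PTR record for their own IP range.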