What Is MetaInspector?
MetaInspector is a scraper. Specifically, it is an open-source Ruby library that fetches web pages and extracts metadata such as the page title, description, meta tags, links, and images. If you think this is incorrect or can provide additional detail about its purpose, please let us know. You can see how often MetaInspector visits your website by setting up Dark Visitors Agent Analytics.
| Agent Type | Scraper |
Expected Behavior
Scrapers extract data from websites for purposes including research, price monitoring, content aggregation, lead generation, and unauthorized copying. Their behavior is hard to predict because use cases and operators vary widely. Scrapers are frequently configured to ignore robots.txt rules and may crawl sites aggressively to meet their collection goals. They range from respectful tools that identify themselves clearly to aggressive bots that disguise their identity, overwhelm servers, and extract content without permission.
Detail
| Last Updated | 1 day ago |
Top Website Robots.txts (chart)
Country of Origin (chart)
Top Website Blocking Trend Over Time (chart): the percentage of the world's top 1,000 websites that block MetaInspector
Overall Scraper Traffic (chart): the percentage of all internet traffic coming from scrapers
Top Visited Website Categories (chart)
User Agent String
| Example | MetaInspector/5.16.0 (+https://github.com/jaimeiniesta/metainspector) |
Access other known user agent strings and recent IP addresses using the API.
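If you manage your own server logs, you can also check for MetaInspector yourself. The sketch below is a minimal Python example, assuming an Apache/nginx-style combined access log at a hypothetical path; it counts requests whose user agent contains the MetaInspector token and lists the busiest client IPs.

from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path; adjust for your server
AGENT_TOKEN = "MetaInspector"           # token from the example user agent string above

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if AGENT_TOKEN in line:
            # In combined log format, the line starts with the client IP address
            ip = line.split(" ", 1)[0]
            hits[ip] += 1

for ip, count in hits.most_common(10):
    print(f"{ip}\t{count} requests")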
Robots.txt
In the example below, MetaInspector is blocked from all pages. You can customize which pages are off-limits by swapping out / for a different disallowed path.
User-agent: MetaInspector # https://darkvisitors.com/agents/metainspector
Disallow: /
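If you only want to protect part of your site, the same user agent token works with a narrower path. For example, this rule blocks MetaInspector from a hypothetical /private/ directory while leaving the rest of the site open:

User-agent: MetaInspector # https://darkvisitors.com/agents/metainspector
Disallow: /private/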
Frequently Asked Questions About MetaInspector
Should I Block MetaInspector?
Often yes. Many scrapers extract content for unauthorized reuse, competitive intelligence, or data resale. They may ignore robots.txt and crawl aggressively. Consider blocking unless you specifically benefit from the scraping activity or the operator has permission.
How Do I Block MetaInspector?
You can block or limit MetaInspector's access by adding user agent token rules to your robots.txt file. The easiest way to do this is with Automatic Robots.txt, which blocks all agents of this type and updates continuously as new agents are released. While most agents operated by reputable companies honor robots.txt directives, bad actors may ignore them entirely; in that case, you'll need alternative blocking methods such as firewall rules or server-level restrictions (see the sketch below). You can verify whether MetaInspector is respecting your rules by setting up Agent Analytics to monitor its visits to your website.
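As a rough sketch of what a server-level restriction can look like, the Python WSGI middleware below (a hypothetical example, not part of any Dark Visitors tooling) returns 403 Forbidden for any request whose User-Agent header contains the MetaInspector token:

class BlockAgentMiddleware:
    """Reject requests whose user agent contains a given token."""

    def __init__(self, app, token="MetaInspector"):
        self.app = app
        self.token = token

    def __call__(self, environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        if self.token in user_agent:
            body = b"Forbidden"
            start_response("403 Forbidden", [
                ("Content-Type", "text/plain"),
                ("Content-Length", str(len(body))),
            ])
            return [body]
        # Pass everything else through to the wrapped application
        return self.app(environ, start_response)

To use it, wrap your existing WSGI application, e.g. app = BlockAgentMiddleware(app). Keep in mind that user agent strings are trivially spoofed, so treat this as a convenience filter rather than a security boundary.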
Will Blocking MetaInspector Hurt My SEO?
Blocking scrapers typically has no negative SEO impact and may actually protect your search rankings. Scrapers often steal content for competing websites or spam farms, which can create duplicate content issues and dilute your search authority. Blocking unauthorized scrapers generally benefits SEO.
Does MetaInspector Access Private Content?
Scrapers vary widely in their access scope. Many focus on publicly available content, but some may attempt to access password-protected areas, bypass paywalls, or use stolen credentials to access private content. Unauthorized scrapers may ignore access controls entirely and attempt to extract any accessible data, regardless of intended privacy boundaries.
How Can I Tell if MetaInspector Is Visiting My Website?
Setting up Agent Analytics will give you real-time visibility into MetaInspector's visits to your website, along with hundreds of other AI agents, crawlers, and scrapers. It will also let you measure human traffic arriving at your website from AI search and chat platforms like ChatGPT, Perplexity, and Gemini.
Why Is MetaInspector Visiting My Website?
MetaInspector likely found your site through automated web discovery, following links from other sites, or by specifically targeting your domain because it contains data the operator wants to collect. Your site may have been identified as containing valuable content for their collection purposes.
How Can I Authenticate Visits From MetaInspector?
Agent Analytics authenticates visits from many known agents, telling you whether each one actually came from the agent it claims to be or was spoofed by a bad actor. This helps you identify suspicious traffic patterns and make informed decisions about blocking or allowing specific user agents.
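Note that MetaInspector is an open-source library anyone can run from any machine, so there is no official IP range to verify against; a matching user agent string proves nothing on its own. For agents that do publish their crawl infrastructure, a common do-it-yourself check is forward-confirmed reverse DNS, sketched below in Python; the sample address is from a reserved documentation range and is purely illustrative.

import socket

def forward_confirmed_rdns(ip):
    """Return the rDNS hostname if it resolves back to the same IP, else None."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]            # reverse lookup
        addresses = socket.gethostbyname_ex(hostname)[2]  # forward lookup
    except OSError:                                       # lookup failed
        return None
    return hostname if ip in addresses else None

print(forward_confirmed_rdns("203.0.113.7"))  # illustrative address (TEST-NET-3)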