What Is Superfeedr?
Superfeedr is a fetcher operated by the real-time feed service of the same name. If you think this is incorrect or can provide additional detail about its purpose, please let us know. You can see how often Superfeedr visits your website by setting up Dark Visitors Agent Analytics.
Agent Type
Expected Behavior
Fetchers retrieve metadata from web pages to generate link previews in social media platforms, messaging apps, and content aggregators. They're triggered on demand when users share or post links, fetching information like titles, descriptions, and thumbnail images. Traffic is unpredictable and correlates with how often your content is shared. Viral content may trigger thousands of fetcher requests in a short period. Fetchers typically access only the shared URL rather than crawling your site.
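As an illustration of the metadata described above, here is a minimal sketch (not Superfeedr's actual implementation) of how a fetcher might extract a page's title and Open Graph tags using only Python's standard library. The sample HTML is hypothetical.

```python
from html.parser import HTMLParser

class PreviewMetadataParser(HTMLParser):
    """Collects the fields a link-preview fetcher typically reads:
    the <title> text and Open Graph <meta> properties."""

    def __init__(self):
        super().__init__()
        self.metadata = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("property", "").startswith("og:"):
            # Open Graph tags carry the preview title, description, and image
            self.metadata[attrs["property"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.metadata.setdefault("title", data.strip())

# Hypothetical page markup a fetcher might retrieve
sample_html = """
<html><head>
  <title>Example Article</title>
  <meta property="og:title" content="Example Article">
  <meta property="og:description" content="A short summary.">
  <meta property="og:image" content="https://example.com/thumb.png">
</head><body></body></html>
"""

parser = PreviewMetadataParser()
parser.feed(sample_html)
```

Note that a fetcher needs only this metadata from the head of the document, which is why a single shared URL usually produces one lightweight request rather than a crawl.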
Detail
| Field | Value |
| --- | --- |
| Operated By | Superfeedr |
| Last Updated | 19 hours ago |
Top Website Robots.txts
Country of Origin
Top Website Blocking Trend Over Time
The percentage of the world's top 1,000 websites that block Superfeedr
Overall Fetcher Traffic
The percentage of all internet traffic coming from fetchers
Top Visited Website Categories
User Agent String
| Field | Value |
| --- | --- |
| Example | Superfeedr bot/2.0 http://superfeedr.com - Make your feeds realtime: get in touch - feed-id:1402159430 |
Access other known user agent strings and recent IP addresses using the API.
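If you want to spot Superfeedr in your own server logs, a simple check against the user agent token is usually enough. The sketch below scans hypothetical access-log lines in the common "combined" format, where the User-Agent is the last quoted field; the log lines and function name are illustrative.

```python
import re

# Hypothetical access-log lines in combined log format
LOG_LINES = [
    '203.0.113.7 - - [01/Jan/2025:00:00:00 +0000] "GET /feed.xml HTTP/1.1" 200 512 "-" '
    '"Superfeedr bot/2.0 http://superfeedr.com - Make your feeds realtime: '
    'get in touch - feed-id:1402159430"',
    '198.51.100.2 - - [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]

# Matches the last quoted field on the line (the User-Agent)
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

def superfeedr_hits(lines):
    """Count requests whose User-Agent contains the Superfeedr token."""
    hits = 0
    for line in lines:
        match = UA_PATTERN.search(line)
        if match and "superfeedr" in match.group(1).lower():
            hits += 1
    return hits
```

A case-insensitive substring match on the token is deliberately loose, since user agent strings can vary in version number and trailing detail.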
Robots.txt
In this example, all pages are blocked. You can customize which pages are off-limits by swapping out / for a different disallowed path.
User-agent: Superfeedr # https://darkvisitors.com/agents/superfeedr
Disallow: /
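As the note above says, you can narrow the block by replacing / with a specific path. A sketch, using a hypothetical /private/ directory:

```
User-agent: Superfeedr # https://darkvisitors.com/agents/superfeedr
Disallow: /private/
```

With this rule, Superfeedr may still fetch shared links elsewhere on your site, so link previews keep working for public pages.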
Frequently Asked Questions About Superfeedr
Should I Block Superfeedr?
No. Blocking fetchers prevents link previews from appearing when your content is shared on social media, messaging apps, and other platforms. This significantly reduces click-through rates and social engagement. Link previews are crucial for content distribution.
How Do I Block Superfeedr?
If you want to, you can block or limit Superfeedr's access by configuring user agent token rules in your robots.txt file. The best way to do this is using Automatic Robots.txt, which blocks all agents of this type and updates continuously as new agents are released. While the vast majority of agents operated by reputable companies honor these robots.txt directives, bad actors may choose to ignore them entirely. In that case, you'll need to implement alternative blocking methods such as firewall rules or server-level restrictions. You can verify whether Superfeedr is respecting your rules by setting up Agent Analytics to monitor its visits to your website.
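For the server-level restrictions mentioned above, one option in a Python web stack is a small WSGI middleware that rejects matching user agents. This is a minimal sketch, not a recommended production setup; robots.txt is advisory, while this enforces the block for any client that identifies itself as Superfeedr.

```python
def block_superfeedr_middleware(app):
    """Wrap a WSGI app and return 403 for requests whose
    User-Agent contains the Superfeedr token."""
    def wrapper(environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        if "superfeedr" in user_agent.lower():
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return wrapper

# Hypothetical app to demonstrate the wrapper
def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]

wrapped = block_superfeedr_middleware(demo_app)
```

Remember that this only matches the self-reported user agent; a bad actor spoofing a browser string will pass straight through, which is why IP- or firewall-based rules are the stronger fallback.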
Will Blocking Superfeedr Hurt My SEO?
Blocking fetchers will hurt your social SEO and content distribution. Link previews significantly improve click-through rates from social media, messaging apps, and other platforms. Without previews, your content appears less engaging when shared, reducing social signals that can indirectly benefit search rankings.
Does Superfeedr Access Private Content?
Fetchers only access the specific URLs that users share or embed, without credentials or authentication. They're designed to retrieve publicly accessible metadata and preview information. Fetchers don't crawl beyond the shared URL and can't access private content unless the shared link itself provides public access to otherwise private information.
How Can I Tell if Superfeedr Is Visiting My Website?
Setting up Agent Analytics will give you real-time visibility into Superfeedr visiting your website, along with hundreds of other AI agents, crawlers, and scrapers. This will also let you measure human traffic to your website coming from AI search and chat platforms like ChatGPT, Perplexity, and Gemini.
Why Is Superfeedr Visiting My Website?
Superfeedr visited your site because someone shared one of your URLs on a social platform, messaging app, or another service that generates link previews. The fetcher was triggered when the link was posted to retrieve your page's title, description, and preview image.
How Can I Authenticate Visits From Superfeedr?
Agent Analytics authenticates visits from many known agents, telling you whether each visit actually came from the agent it claims to be or was spoofed by a bad actor. This helps you identify suspicious traffic patterns and make informed decisions about blocking or allowing specific user agents.
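If you want to sanity-check a visit yourself, the general technique for authenticating crawler traffic is forward-confirmed reverse DNS (FCrDNS). Whether Superfeedr publishes verifiable reverse DNS for its IPs is an assumption here; the sketch below shows the pattern with the resolvers injectable so it can be tested without network access, and the domain suffix is illustrative.

```python
import socket

def fcrdns_matches(ip, expected_suffix,
                   reverse=socket.gethostbyaddr,
                   forward=socket.gethostbyname):
    """Forward-confirmed reverse DNS: resolve the visiting IP to a
    hostname, check the hostname belongs to the operator's domain,
    then resolve it forward again and confirm it maps back to the
    same IP. A spoofed visit fails one of these steps."""
    try:
        hostname = reverse(ip)[0]
    except OSError:
        return False
    if not hostname.endswith(expected_suffix):
        return False
    try:
        return forward(hostname) == ip
    except OSError:
        return False
```

In practice you would run this against the source IP from your access logs; a result of False means the visit merely claimed the agent's user agent string.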