Google-Structured-Data-Testing-Tool

What is Google-Structured-Data-Testing-Tool?

About

Google-Structured-Data-Testing-Tool is an uncategorized agent. If you think this is incorrect or can provide additional detail about its purpose, please contact us. You can see how often Google-Structured-Data-Testing-Tool visits your website by setting up Dark Visitors Agent Analytics.

Expected Behavior

Behavior will vary depending on whether this agent is a search engine crawler, data scraper, archiver, one-off fetcher, etc.

Type

Uncategorized
Not currently assigned a type


Insights

Top Website Robots.txts

0% of top websites are blocking Google-Structured-Data-Testing-Tool

Country of Origin

Unknown
Google-Structured-Data-Testing-Tool has no known country of origin

Global Traffic

The percentage of all internet traffic coming from Uncategorized Agents

How Do I Get These Insights for My Website?
Use the WordPress plugin, Node.js package, or API to get started in seconds.
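
If you go the API route, the general pattern is server-side code that reports each incoming request's metadata so visits can be attributed to agent user agent strings. The Node.js/TypeScript sketch below only illustrates that pattern: the endpoint URL, auth header, and payload field names are assumptions, not the documented interface, so follow the official Dark Visitors API documentation when wiring this up for real.

// report-visit.ts
// Illustrative sketch only: forward request metadata to an agent analytics API
// so visits from agents like Google-Structured-Data-Testing-Tool can show up in
// your dashboard. Endpoint, auth header, and payload fields are assumptions.

import http from "node:http";

const ANALYTICS_ENDPOINT = "https://api.darkvisitors.com/visits"; // assumption
const ACCESS_TOKEN = process.env.ANALYTICS_TOKEN ?? "";

function reportVisit(req: http.IncomingMessage): void {
  // Fire and forget so reporting never delays the response to the visitor.
  fetch(ANALYTICS_ENDPOINT, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      request_path: req.url,
      request_method: req.method,
      request_headers: req.headers, // includes the User-Agent string
    }),
  }).catch(() => {
    // Ignore reporting failures; analytics should never break the site.
  });
}

http
  .createServer((req, res) => {
    reportVisit(req);
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end("Hello");
  })
  .listen(8080);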

Robots.txt

Should I Block Google-Structured-Data-Testing-Tool?

Without a known type, it's difficult to say. Depending on what the agent actually does, its visits could be either useful or harmful to your website.

How Do I Block Google-Structured-Data-Testing-Tool?

You can block Google-Structured-Data-Testing-Tool, or limit its access, by adding rules for its user agent token to your website's robots.txt. Set up Dark Visitors Agent Analytics to check whether it actually follows them.
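
If you also want to verify compliance by hand, one option is to scan your server's access logs for the agent's user agent token and see which paths it requested. Below is a minimal Node.js/TypeScript sketch that assumes a combined-format (Nginx/Apache) log; the log path is a placeholder for your own setup.

// check-compliance.ts
// Minimal sketch: scan a combined-format access log for requests whose user
// agent contains the Google-Structured-Data-Testing-Tool token and count the
// paths it requested. The log path is an assumption about your server.

import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

const LOG_PATH = "/var/log/nginx/access.log"; // assumption: adjust for your setup
const AGENT_TOKEN = "Google-Structured-Data-Testing-Tool";

// Combined log format: ... "GET /path HTTP/1.1" 200 123 "referer" "user agent"
const LINE = /"[A-Z]+ (?<path>\S+) [^"]*" \d+ \S+ "[^"]*" "(?<ua>[^"]*)"/;

async function main(): Promise<void> {
  const hits = new Map<string, number>();
  const lines = createInterface({ input: createReadStream(LOG_PATH) });

  for await (const line of lines) {
    const match = LINE.exec(line);
    const path = match?.groups?.path;
    const ua = match?.groups?.ua ?? "";
    if (!path || !ua.includes(AGENT_TOKEN)) continue;
    hits.set(path, (hits.get(path) ?? 0) + 1);
  }

  if (hits.size === 0) {
    console.log(`No requests from ${AGENT_TOKEN} in ${LOG_PATH}`);
    return;
  }
  for (const [path, count] of [...hits].sort((a, b) => b[1] - a[1])) {
    console.log(`${count}\t${path}`);
  }
}

main().catch(console.error);

Paths you have disallowed that still show up in this output indicate the agent is not respecting your robots.txt.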

User Agent String

Mozilla/5.0 (compatible; Google-Structured-Data-Testing-Tool +https://search.google.com/structured-data/testing-tool)

# In your robots.txt ...

User-agent: Google-Structured-Data-Testing-Tool # https://darkvisitors.com/agents/google-structured-data-testing-tool
Disallow: /

How Do I Block All Uncategorized Agents?

Serve a continuously updating robots.txt that blocks new uncategorized agents automatically, as sketched below.
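
One way to do that without hand-editing is to serve robots.txt from a small service that periodically pulls a maintained copy, such as the one generated by Dark Visitors Automatic Robots.txt, and caches it. The Node.js/TypeScript sketch below shows the general shape; the upstream URL, refresh interval, and port are placeholders, and the fallback rule simply blocks this one agent until the first successful refresh.

// serve-robots.ts
// Minimal sketch of a continuously updating robots.txt, assuming an upstream
// source (such as Dark Visitors Automatic Robots.txt) that returns a maintained
// robots.txt. The upstream URL, refresh interval, and port are assumptions.

import http from "node:http";

const UPSTREAM_URL = "https://example.com/your-automatic-robots.txt"; // assumption
const REFRESH_MS = 60 * 60 * 1000; // refresh hourly

// Fallback until the upstream has been reached: block this agent outright.
let cached = "User-agent: Google-Structured-Data-Testing-Tool\nDisallow: /\n";

async function refresh(): Promise<void> {
  try {
    const res = await fetch(UPSTREAM_URL);
    if (res.ok) cached = await res.text();
  } catch {
    // Keep serving the last known good copy on network errors.
  }
}

refresh();
setInterval(refresh, REFRESH_MS);

http
  .createServer((req, res) => {
    if (req.url === "/robots.txt") {
      res.writeHead(200, { "Content-Type": "text/plain" });
      res.end(cached);
    } else {
      res.writeHead(404);
      res.end();
    }
  })
  .listen(8080);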

⚠️ Manual Robots.txt Editing Is Not Scalable

New agents are created every day. We recommend setting up Dark Visitors Automatic Robots.txt if you want to block all agents of this type.