Manage which AI bots can crawl your website — visually.
Toggle GPTBot, ClaudeBot, PerplexityBot, and 20+ AI crawlers on/off with a simple UI. Analyze any site's robots.txt instantly.
## Features

- **Visual Toggle UI** — Block or allow AI bots with simple on/off switches
- **20+ AI Bots Database** — GPTBot, ClaudeBot, Google-Extended, CCBot, Bytespider, Diffbot, cohere-ai, Amazonbot, Meta-ExternalAgent, and more
- **Analyze Existing robots.txt** — Paste or fetch any robots.txt to see which AI bots are blocked
- **Generate robots.txt** — Create a complete robots.txt with your chosen rules
- **MCP Server** — Use with Claude Desktop, Cursor, or any MCP-compatible AI assistant
- **Check Bot Status** — Verify whether a specific bot is blocked on any website
## MCP Tools

| Tool | Description |
|------|-------------|
| | Fetch and analyze a robots.txt from any URL |
| | Analyze pasted robots.txt content for AI bot blocking status |
| | Generate a robots.txt with specified blocked bots and custom rules |
| | List all known AI bots with user-agents, companies, and descriptions |
| | Check if a specific bot is blocked on a given website |
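The analysis tools above all reduce to one question: does a bot's user-agent group contain a blanket `Disallow: /`? A minimal sketch of that check, assuming the common robots.txt convention (this is illustrative; the server's actual parsing logic and API may differ):

```javascript
// Sketch: a bot counts as "blocked" when a User-agent group naming it
// (or the wildcard "*") contains "Disallow: /".
// Function name and behavior here are illustrative, not the project's API.
function isBotBlocked(robotsTxt, botName) {
  const lines = robotsTxt.split("\n").map((l) => l.trim());
  let currentAgents = [];
  let inRules = false; // true once a group's rule lines have started
  for (const line of lines) {
    const [rawKey, ...rest] = line.split(":");
    const key = rawKey.toLowerCase();
    const value = rest.join(":").trim();
    if (key === "user-agent") {
      if (inRules) currentAgents = []; // a new group begins
      currentAgents.push(value.toLowerCase());
      inRules = false;
    } else if (key === "disallow") {
      inRules = true;
      const applies =
        currentAgents.includes(botName.toLowerCase()) ||
        currentAgents.includes("*");
      if (applies && value === "/") return true;
    }
  }
  return false;
}

const sample =
  "User-agent: GPTBot\nDisallow: /\n\nUser-agent: Googlebot\nAllow: /";
console.log(isBotBlocked(sample, "GPTBot")); // true
console.log(isBotBlocked(sample, "Googlebot")); // false
```

Note that a partial `Disallow` (e.g. `Disallow: /private/`) is deliberately not treated as a full block here; real analyzers may report such paths separately.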
## Installation

### Web App

```bash
git clone https://github.com/sharozdawa/robotstxt-ai.git
cd robotstxt-ai
npm install
npm run dev
```

Open http://localhost:3000 in your browser.
### MCP Server — Claude Desktop

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "robotstxt-ai": {
      "command": "npx",
      "args": ["-y", "robotstxt-ai-mcp"]
    }
  }
}
```

### MCP Server — Cursor
Add to `.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "robotstxt-ai": {
      "command": "npx",
      "args": ["-y", "robotstxt-ai-mcp"]
    }
  }
}
```

## Tracked Bots
The server knows about 25+ bots including:
- **AI Crawlers:** GPTBot, ClaudeBot, Google-Extended, CCBot, Bytespider, Diffbot, cohere-ai, Amazonbot, Meta-ExternalAgent
- **AI Search:** ChatGPT-User, OAI-SearchBot, PerplexityBot, YouBot
- **Search Engines:** Googlebot, Bingbot, YandexBot, Baiduspider, DuckDuckBot
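For reference, blocking the AI-crawler group above by hand looks like this in robots.txt (a hand-written sketch, not the generator's exact output):

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```

Each bot needs its own `User-agent` group; a `Disallow: /` under that group blocks the named bot from the entire site.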
## Why robotstxt.ai vs Manual Editing

| Feature | robotstxt.ai | Manual Editing |
|---------|--------------|----------------|
| Visual toggle UI | Yes | No |
| 20+ AI bots database | Yes | Research yourself |
| Analyze existing robots.txt | Yes | No |
| MCP Server | Yes | No |
| Price | Free | Free but tedious |
## More Open Source SEO Tools

| Tool | Description |
|------|-------------|
| | Curated list of SEO MCP servers and agent skills |
| | Instant URL indexing via IndexNow |
| | Schema.org JSON-LD markup generator |
| | AI brand visibility tracker |
## License

MIT

Built by Sharoz Dawa — SEO Professional & Digital Marketing Expert