# robotstxt-ai-mcp
## Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
|---|---|---|---|
| No arguments | | | |
## Capabilities

Features and capabilities supported by this server.

| Capability | Details |
|---|---|
| tools | `{"listChanged": true}` |
## Tools

Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| fetch_robots | Fetch and analyze a robots.txt file from a URL. Returns which AI bots are blocked or allowed. |
| analyze_robots | Analyze pasted robots.txt content. Returns which AI bots are blocked or allowed based on the rules. |
| generate_robots | Generate a robots.txt file with specified blocked bots, sitemap URLs, and custom rules. |
| list_ai_bots | List all known AI bots with their user-agents, companies, and descriptions. Useful for deciding which bots to block. |
| check_bot_status | Check whether a specific bot is blocked or allowed on a given website by fetching and analyzing its robots.txt. |
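Since MCP tools are invoked over JSON-RPC, a client calls one of the tools above with a `tools/call` request. The sketch below builds such a request for `fetch_robots`; the argument name `url` is an assumption, so consult the server's published tool schema for the actual parameter names.

```python
import json

# Minimal sketch of a JSON-RPC "tools/call" request an MCP client might
# send to invoke fetch_robots. The "url" argument name is assumed, not
# taken from the server's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "fetch_robots",
        "arguments": {"url": "https://example.com"},
    },
}

# Serialize for transport (stdio or HTTP, depending on the client).
payload = json.dumps(request)
print(payload)
```

The server's response would carry the analysis result (which AI bots are blocked or allowed) in the standard `tools/call` result envelope.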
## Prompts

Interactive templates invoked by user choice.

| Name | Description |
|---|---|
| No prompts | |
## Resources

Contextual data attached and managed by the client.

| Name | Description |
|---|---|
| No resources | |
## MCP directory API

We provide all the information about MCP servers via our MCP API:

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/sharozdawa/robotstxt-ai'
```
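The same endpoint can be queried from Python without extra dependencies. This is a sketch: the response is assumed to be JSON, and its exact schema is not documented on this page.

```python
import json
import urllib.request

# Glama MCP directory endpoint for this server (same URL as the curl example).
API_URL = "https://glama.ai/api/mcp/v1/servers/sharozdawa/robotstxt-ai"


def fetch_server_entry(url: str = API_URL) -> dict:
    """Fetch the directory entry for this MCP server and parse it as JSON."""
    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    entry = fetch_server_entry()
    print(json.dumps(entry, indent=2))
```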
If you have feedback or need assistance with the MCP directory API, please join our Discord server.