
llms-txt-mcp

by tenequm
.mcp.json (515 B)

{
  "mcpServers": {
    "gitmcp": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://gitmcp.io/docs"]
    },
    "Context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    },
    "llms-txt-mcp": {
      "command": "uv",
      "args": [
        "run",
        "llms-txt-mcp",
        "--store-path",
        "tmp/mcp-data",
        "https://ai-sdk.dev/llms.txt",
        "https://gofastmcp.com/llms.txt",
        "https://modelcontextprotocol.io/llms.txt"
      ]
    }
  }
}
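Each entry in the config above pairs a server name with the command line an MCP client spawns for it. A minimal Python sketch of that mapping (the `launch_commands` helper is hypothetical, for illustration only; the embedded config is a trimmed version of the one shown):

```python
import json

# Hypothetical helper: given an .mcp.json-style config, map each server
# name to the full command line a client would spawn for it.
def launch_commands(config: dict) -> dict:
    """Map each server name to its command plus args as one list."""
    return {
        name: [entry["command"], *entry.get("args", [])]
        for name, entry in config["mcpServers"].items()
    }

# Trimmed example config, matching the structure of the .mcp.json above.
example = json.loads("""
{ "mcpServers": { "llms-txt-mcp": {
    "command": "uv",
    "args": ["run", "llms-txt-mcp", "--store-path", "tmp/mcp-data",
             "https://ai-sdk.dev/llms.txt"] } } }
""")

cmds = launch_commands(example)
```

Here `cmds["llms-txt-mcp"]` starts with `uv`, followed by the arguments as listed in the config.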

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/tenequm/llms-txt-mcp'
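The same endpoint can be called from code. A minimal Python sketch using only the standard library (the `server_url` and `fetch_server` helpers are hypothetical, not part of any Glama SDK; `fetch_server` needs network access):

```python
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1/servers"

def server_url(author: str, name: str) -> str:
    """Build the directory API URL for a server, e.g. tenequm/llms-txt-mcp."""
    return f"{API_BASE}/{author}/{name}"

def fetch_server(author: str, name: str) -> dict:
    """GET the server's metadata from the directory API as parsed JSON."""
    with urllib.request.urlopen(server_url(author, name)) as resp:
        return json.load(resp)
```

For example, `fetch_server("tenequm", "llms-txt-mcp")` issues the same GET request as the curl command above.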

If you have feedback or need assistance with the MCP directory API, please join our Discord server.