MCP-llms-txt

pyproject.toml

[project]
name = "mcp-llms-txt"
version = "0.2.0"
description = "Add your description here"
readme = "README.md"
authors = [
    { name = "TerminalMan", email = "84923604+SecretiveShell@users.noreply.github.com" }
]
requires-python = ">=3.11"
dependencies = [
    "httpx>=0.28.1",
    "mcp>=1.2.1",
    "pydantic>=2.10.6",
]

[project.scripts]
mcp-llms-txt = "mcp_llms_txt:main"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
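The [project.scripts] entry defines a console script named mcp-llms-txt that resolves to the main callable in the mcp_llms_txt package. Once the package is installed, running that command is roughly equivalent to the following minimal sketch (only the existence of mcp_llms_txt:main is taken from the pyproject above; everything else is standard console-script behaviour):

# Equivalent of the `mcp-llms-txt` console script from [project.scripts]:
# import the package's `main` callable and invoke it.
from mcp_llms_txt import main

if __name__ == "__main__":
    main()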

MCP directory API

We provide all of the information about listed MCP servers through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/SecretiveShell/MCP-llms-txt'
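The same endpoint can be queried from Python using httpx (already a dependency of this project). This is a sketch based only on the URL shown in the curl example; the response schema is not documented here, so the snippet simply prints the returned JSON:

import httpx

# Fetch this server's entry from the Glama MCP directory API.
url = "https://glama.ai/api/mcp/v1/servers/SecretiveShell/MCP-llms-txt"

response = httpx.get(url, timeout=10.0)
response.raise_for_status()  # raise on 4xx/5xx responses
print(response.json())       # response schema is assumed to be JSON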

If you have feedback or need assistance with the MCP directory API, please join our Discord server.