MCP Server For Local

pyproject.toml (292 B)

[project]
name = "mcp-client"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    "bilibili-api",
    "httpx>=0.28.1",
    "mcp[cli]>=1.6.0",
    "openai>=1.70.0",
    "python-dotenv>=1.1.0",
    "pyyaml!=5.4.1",
]
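Given the dependencies above (notably the official mcp SDK), a minimal client sketch might look like the following. This is only an illustration of the standard MCP stdio client flow, not code from this repository; the "server.py" path is a hypothetical placeholder for whatever local server script the project provides.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Hypothetical: launch a local MCP server script over stdio.
    params = StdioServerParameters(command="python", args=["server.py"])

    async with stdio_client(params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            # Perform the MCP initialization handshake.
            await session.initialize()

            # List the tools the local server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())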

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Dreamboat-Rachel/MCP-Server-For-Local'
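Since this project already depends on httpx, the same lookup can be done from Python. A small sketch, assuming the endpoint returns JSON:

import httpx

# Query the Glama MCP directory API for this server's metadata
# (same endpoint as the curl command above).
url = "https://glama.ai/api/mcp/v1/servers/Dreamboat-Rachel/MCP-Server-For-Local"

response = httpx.get(url)
response.raise_for_status()
print(response.json())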

If you have feedback or need assistance with the MCP directory API, please join our Discord server.