
mcp-server-collector

by chatmcp
pyproject.toml

[project]
name = "mcp-server-collector"
version = "0.1.0"
description = "A MCP Server used to collect MCP Servers over the internet."
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
    "aiohttp>=3.11.10",
    "mcp>=1.1.0",
    "openai>=1.57.0",
    "python-dotenv>=1.0.1",
    "requests>=2.32.3",
]

[[project.authors]]
name = "idoubi"
email = "me@idoubi.cc"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project.scripts]
mcp-server-collector = "mcp_server_collector:main"
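The [project.scripts] entry maps the mcp-server-collector command to mcp_server_collector:main, so the server can also be started programmatically. A minimal sketch, assuming the package is installed in the current environment:

# Minimal sketch: invoke the same entry point that the
# mcp-server-collector console script uses (mcp_server_collector:main).
# Assumes the package has been installed, e.g. with `pip install .`.
from mcp_server_collector import main

if __name__ == "__main__":
    main()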

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/chatmcp/mcp-server-collector'
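The same endpoint can be queried from Python. A minimal sketch using the requests library, assuming the endpoint returns JSON (the exact response schema is not documented here):

# Minimal sketch: fetch this server's metadata from the Glama MCP directory API.
# Assumes a JSON response; the response schema is an assumption, not documented here.
import requests

url = "https://glama.ai/api/mcp/v1/servers/chatmcp/mcp-server-collector"
response = requests.get(url, timeout=10)
response.raise_for_status()
print(response.json())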

If you have feedback or need assistance with the MCP directory API, please join our Discord server.