Glama

mcp-local-dev

by txbm
pyproject.toml (209 B)

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "example"
version = "0.1.0"
description = "Example project for testing"
requires-python = ">=3.12"
dependencies = []

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/txbm/mcp-local-dev'
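The same lookup can be done from Python. This is a minimal sketch using only the standard library; the `build_server_url` helper and the `owner`/`repo` path layout are inferred from the curl example above, and the response is assumed to be JSON:

```python
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1/servers"

def build_server_url(owner: str, repo: str) -> str:
    # Path layout taken from the curl example: /servers/<owner>/<repo>
    return f"{API_BASE}/{owner}/{repo}"

def fetch_server_info(owner: str, repo: str) -> dict:
    # GET the server's metadata; assumes the endpoint returns a JSON object.
    with urllib.request.urlopen(build_server_url(owner, repo)) as resp:
        return json.load(resp)

# Usage: fetch_server_info("txbm", "mcp-local-dev")
```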

If you have feedback or need assistance with the MCP directory API, please join our Discord server.