
MCP Chat

  • Apple
  • Linux
.env.example (133 B)

LLM_MODEL_API_KEY=""
LLM_CHAT_COMPLETION_URL="https://generativelanguage.googleapis.com/v1beta/openai/"
LLM_MODEL="gemini-2.5-flash"
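The configuration above can be loaded with a small stdlib-only sketch. The `load_env` helper below is hypothetical (not part of this project) and assumes simple `KEY="value"` lines, ignoring comments and quoting edge cases:

```python
import tempfile
import os

def load_env(path):
    """Parse simple KEY="value" lines into a dict (hypothetical helper;
    skips blank lines, comments, and lines without '=')."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip('"')
    return env

# Demo using the values from .env.example
sample = (
    'LLM_MODEL_API_KEY=""\n'
    'LLM_CHAT_COMPLETION_URL="https://generativelanguage.googleapis.com/v1beta/openai/"\n'
    'LLM_MODEL="gemini-2.5-flash"\n'
)
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write(sample)
    path = f.name
env = load_env(path)
os.unlink(path)
print(env["LLM_MODEL"])  # gemini-2.5-flash
```

In practice a library such as python-dotenv handles these details; the sketch only illustrates what the three variables look like once parsed.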

MCP directory API

We provide all the information about MCP servers through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Abdullah-1121/MCP-2'
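The same GET request can be issued from Python with the standard library. This is a minimal sketch: the network call itself is left commented out, and nothing is assumed about the response beyond it being JSON:

```python
import urllib.request

# Endpoint from the curl example above
URL = "https://glama.ai/api/mcp/v1/servers/Abdullah-1121/MCP-2"

req = urllib.request.Request(URL, method="GET")

# Uncomment to perform the request (requires network access):
# import json
# with urllib.request.urlopen(req) as resp:
#     data = json.load(resp)  # response is assumed to be JSON
#     print(data)

print(req.get_method(), req.full_url)
```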

If you have feedback or need assistance with the MCP directory API, please join our Discord server.