config.example.yaml • 601 B
watch_dir: "./your-project-directory"
memory_file: ".fgd_memory.json"
context_limit: 20

scan:
  max_dir_size_gb: 2
  max_files_per_scan: 5
  max_file_size_kb: 250

reference_dirs:
  - "/path/to/docs"
  - "/path/to/shared-lib"

llm:
  default_provider: "grok"
  providers:
    grok:
      model: "grok-3"
      base_url: "https://api.x.ai/v1"
    openai:
      model: "gpt-4o-mini"
      base_url: "https://api.openai.com/v1"
    claude:
      model: "claude-3-5-sonnet-20241022"
      base_url: "https://api.anthropic.com/v1"
    ollama:
      model: "llama3"
      base_url: "http://localhost:11434/v1"
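To illustrate how a consumer of this config might use it, here is a minimal sketch in Python. It mirrors the example config as a plain dict (so it runs without a YAML dependency) and resolves the default LLM provider to its model and base URL; the `resolve_provider` helper is hypothetical, not part of any documented API.

```python
# Minimal sketch: the example config mirrored as a Python dict.
# In practice this would be loaded from config.yaml (e.g. with PyYAML's
# yaml.safe_load); the dict is inlined here so the sketch is self-contained.
CONFIG = {
    "watch_dir": "./your-project-directory",
    "memory_file": ".fgd_memory.json",
    "context_limit": 20,
    "scan": {
        "max_dir_size_gb": 2,
        "max_files_per_scan": 5,
        "max_file_size_kb": 250,
    },
    "reference_dirs": ["/path/to/docs", "/path/to/shared-lib"],
    "llm": {
        "default_provider": "grok",
        "providers": {
            "grok": {"model": "grok-3", "base_url": "https://api.x.ai/v1"},
            "openai": {"model": "gpt-4o-mini", "base_url": "https://api.openai.com/v1"},
            "claude": {"model": "claude-3-5-sonnet-20241022", "base_url": "https://api.anthropic.com/v1"},
            "ollama": {"model": "llama3", "base_url": "http://localhost:11434/v1"},
        },
    },
}


def resolve_provider(config: dict) -> tuple[str, str]:
    """Look up llm.default_provider and return its (model, base_url).

    Raises KeyError if the named provider has no entry under llm.providers.
    """
    name = config["llm"]["default_provider"]
    provider = config["llm"]["providers"][name]
    return provider["model"], provider["base_url"]


model, base_url = resolve_provider(CONFIG)
print(model, base_url)  # grok-3 https://api.x.ai/v1
```

Switching providers is then a one-line config change: setting `default_provider: "ollama"` would resolve to the local Ollama endpoint instead.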

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/mikeychann-hash/MCPM'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.