metadata.json (1.35 kB)
{ "timestamp": "2025-09-09_09-32-29", "default_model": "glm-4.5-flash", "providers": { "GLM_API_KEY": true, "KIMI_API_KEY": true, "OPENROUTER_API_KEY": false }, "git_commit": null, "summary": [ { "tool": "analyze", "status": "success", "provider": "glm", "model": "glm-4.5-flash", "duration_sec": 0.01 }, { "tool": "challenge", "status": "success", "provider": null, "model": null, "duration_sec": 0.01 }, { "tool": "chat", "status": "success", "provider": "glm", "model": "glm-4.5-air", "duration_sec": 7.34 }, { "tool": "consensus", "status": "success", "provider": null, "model": null, "duration_sec": 0.01 }, { "tool": "listmodels", "status": "success", "provider": null, "model": null, "duration_sec": 0.01 }, { "tool": "orchestrate_auto", "status": "success", "provider": null, "model": null, "duration_sec": 9.64 }, { "tool": "thinkdeep", "status": "success", "provider": null, "model": null, "duration_sec": 0.01 }, { "tool": "version", "status": "success", "provider": null, "model": null, "duration_sec": 0.01 } ] }


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Zazzles2908/EX_AI-mcp-server'
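
The same endpoint can also be called from code. The snippet below is a minimal sketch using Python's standard library, assuming the endpoint returns JSON and requires no authentication; the exact response schema is not documented here, so the output is simply pretty-printed.

import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/Zazzles2908/EX_AI-mcp-server"

# Fetch the server record from the MCP directory API and print whatever JSON it returns.
with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))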

If you have feedback or need assistance with the MCP directory API, please join our Discord server.