custom_models.json (848 B)

```json
{
  "models": [
    {
      "model_name": "llama3.2",
      "friendly_name": "Custom (llama3.2)",
      "context_window": 32768,
      "max_output_tokens": 32768,
      "supports_extended_thinking": false,
      "supports_system_prompts": true,
      "supports_streaming": true,
      "supports_function_calling": false,
      "supports_images": false,
      "is_custom": true,
      "aliases": ["llama3.2:latest", "local-llama3.2"]
    },
    {
      "model_name": "qwen2.5",
      "friendly_name": "Custom (qwen2.5)",
      "context_window": 32768,
      "max_output_tokens": 32768,
      "supports_extended_thinking": false,
      "supports_system_prompts": true,
      "supports_streaming": true,
      "supports_function_calling": false,
      "supports_images": false,
      "is_custom": true,
      "aliases": ["local-qwen2.5"]
    }
  ]
}
```

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Zazzles2908/EX_AI-mcp-server'
```
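The same endpoint can be queried programmatically. The sketch below assumes the response body is JSON and does not rely on any particular schema; it simply pretty-prints whatever the API returns.

```python
import json
import urllib.request

# Fetch this server's directory record from the Glama MCP API and print it.
url = "https://glama.ai/api/mcp/v1/servers/Zazzles2908/EX_AI-mcp-server"
with urllib.request.urlopen(url) as resp:
    record = json.load(resp)

print(json.dumps(record, indent=2))
```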

If you have feedback or need assistance with the MCP directory API, please join our Discord server.