All the information about MCP servers is available via our MCP API:
curl -X GET 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-conversation-server'
If you have feedback or need assistance with the MCP directory API, please join our Discord server.
models.yaml
# MCP Server Configuration
openRouter:
  apiKey: "<<OPEN ROUTER>>" # Replace with your actual OpenRouter API key.

persistence:
  path: "d:/projects/conversations" # Optional: directory for storing conversation data.

# OpenRouter Models Configuration
# Visit https://openrouter.ai/docs#models for the complete list of available models
models:
  'google/gemini-2.0-pro-exp-02-05:free':
    id: 'google/gemini-2.0-pro-exp-02-05:free'
    contextWindow: 2000000
    streaming: true
    temperature: 0.2
    description: 'Google Gemini 2.0 Pro is a powerful and versatile language model that can handle a wide range of tasks.'
  'google/gemini-2.0-flash-001':
    id: 'google/gemini-2.0-flash-001'
    contextWindow: 1000000
    streaming: true
    temperature: 0.2
    description: 'Google Gemini 2.0 Flash is a powerful and versatile language model that can handle a wide range of tasks.'

  # Add more models as needed, following the same format:
  # 'provider/model-name':
  #   id: 'provider/model-name'
  #   contextWindow: <window_size>
  #   streaming: true/false
  #   description: 'Model description'

# Default model to use if none is specified
defaultModel: 'google/gemini-2.0-pro-exp-02-05:free'