OpenRouter MCP Server

chat_completion

Sends an array of conversation messages to OpenRouter.ai and returns the model's response.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| messages | Yes | An array of conversation messages with roles and content | — |
| model | No | The model to use (e.g., "google/gemini-2.0-flash-thinking-exp:free", "undi95/toppy-m-7b:free"). If not provided, uses the default model if set. | — |
| temperature | No | Sampling temperature (0-2) | — |
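A minimal sketch of assembling the tool's arguments from these parameters. The `build_chat_request` helper is hypothetical (how the arguments reach the server depends on your MCP client library); it only illustrates that `messages` is required while `model` and `temperature` are optional.

```python
import json

def build_chat_request(messages, model=None, temperature=None):
    """Assemble chat_completion arguments; only 'messages' is required."""
    payload = {"messages": messages}
    if model is not None:
        payload["model"] = model  # omitted -> server falls back to its default model
    if temperature is not None:
        payload["temperature"] = temperature  # must lie in [0, 2]
    return payload

request = build_chat_request(
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize MCP in one sentence."},
    ],
    model="undi95/toppy-m-7b:free",
    temperature=0.7,
)
print(json.dumps(request, indent=2))
```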

Input Schema (JSON Schema)

{
  "type": "object",
  "required": ["messages"],
  "properties": {
    "messages": {
      "type": "array",
      "description": "An array of conversation messages with roles and content",
      "minItems": 1,
      "maxItems": 100,
      "items": {
        "type": "object",
        "required": ["role", "content"],
        "properties": {
          "role": {
            "type": "string",
            "description": "The role of the message sender",
            "enum": ["system", "user", "assistant"]
          },
          "content": {
            "type": "string",
            "description": "The content of the message"
          }
        }
      }
    },
    "model": {
      "type": "string",
      "description": "The model to use (e.g., \"google/gemini-2.0-flash-thinking-exp:free\", \"undi95/toppy-m-7b:free\"). If not provided, uses the default model if set."
    },
    "temperature": {
      "type": "number",
      "description": "Sampling temperature (0-2)",
      "minimum": 0,
      "maximum": 2
    }
  }
}
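The schema's constraints can be checked client-side before sending a request. The hand-rolled validator below is a sketch, not the server's implementation; a real client might instead feed the schema to a JSON Schema library.

```python
# Hand-rolled check of the schema's key constraints: 1-100 messages,
# valid roles, string content, and temperature within [0, 2].
VALID_ROLES = {"system", "user", "assistant"}

def validate_chat_args(args):
    messages = args.get("messages")
    if not isinstance(messages, list) or not (1 <= len(messages) <= 100):
        raise ValueError("messages must be an array of 1-100 items")
    for msg in messages:
        if msg.get("role") not in VALID_ROLES:
            raise ValueError("role must be one of: system, user, assistant")
        if not isinstance(msg.get("content"), str):
            raise ValueError("content must be a string")
    if "model" in args and not isinstance(args["model"], str):
        raise ValueError("model must be a string")
    if "temperature" in args:
        t = args["temperature"]
        if not isinstance(t, (int, float)) or not (0 <= t <= 2):
            raise ValueError("temperature must be a number in [0, 2]")
    return True
```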