OpenRouter MCP Server
chat_completion
Send a message to OpenRouter.ai and get a response
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| messages | Yes | An array of conversation messages with roles and content | |
| model | No | The model to use (e.g., "google/gemini-2.0-flash-thinking-exp:free", "undi95/toppy-m-7b:free"). If omitted, the server falls back to its default model, if one is configured. | |
| temperature | No | Sampling temperature (0-2) | |
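The sketch below shows one way an MCP client might call this tool, assuming the TypeScript MCP SDK (`@modelcontextprotocol/sdk`), a stdio transport, and an ESM context with top-level await. The launch command, server path, client name, and chosen model are illustrative placeholders, not values taken from this server's documentation.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the OpenRouter MCP server over stdio.
// The command and args are placeholders; adjust them to however
// the server is actually started in your setup.
const transport = new StdioClientTransport({
  command: "node",
  args: ["path/to/openrouter-mcp-server/build/index.js"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Invoke chat_completion with arguments matching the input schema above.
// Only "messages" is required; "model" and "temperature" are optional.
const result = await client.callTool({
  name: "chat_completion",
  arguments: {
    model: "google/gemini-2.0-flash-thinking-exp:free",
    messages: [
      { role: "system", content: "You are a concise assistant." },
      { role: "user", content: "Summarize what OpenRouter does in one sentence." },
    ],
    temperature: 0.7,
  },
});

console.log(result.content);
```

Omitting `model` lets the server use its configured default, and omitting `temperature` leaves sampling at the provider's default value.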