OpenRouter MCP Server
A Model Context Protocol (MCP) server providing seamless integration with OpenRouter.ai's diverse model ecosystem. Access various AI models through a unified, type-safe interface with built-in caching, rate limiting, and error handling.
Features
- **Model Access**
  - Direct access to all OpenRouter.ai models
  - Automatic model validation and capability checking
  - Default model configuration support
- **Performance Optimization**
  - Smart model information caching (1-hour expiry)
  - Automatic rate limit management
  - Exponential backoff for failed requests
- **Unified Response Format**
  - Consistent `ToolResult` structure for all responses
  - Clear error identification with `isError` flag
  - Structured error messages with context
Installation
Configuration
Prerequisites
- Get your OpenRouter API key from [OpenRouter Keys](https://openrouter.ai/keys)
- Choose a default model (optional)
Environment Variables
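A minimal sketch of the expected variables (the names below follow common convention for this server and should be checked against the source; the key value is a placeholder):

```shell
# Required: your OpenRouter API key
export OPENROUTER_API_KEY="your-api-key-here"

# Optional: default model ID used when a request omits one
export OPENROUTER_DEFAULT_MODEL="your-default-model-id"
```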
Setup
Add to your MCP settings configuration file (`cline_mcp_settings.json` or `claude_desktop_config.json`):
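A configuration sketch, assuming a local Node build of the server (the `args` path and the environment variable names are illustrative placeholders):

```json
{
  "mcpServers": {
    "openrouterai": {
      "command": "node",
      "args": ["/path/to/openrouter-mcp-server/build/index.js"],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here",
        "OPENROUTER_DEFAULT_MODEL": "your-default-model-id"
      }
    }
  }
}
```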
Response Format
All tools return responses in a standardized structure:
Success Example:
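A sketch assuming the standard MCP `ToolResult` shape (a `content` array of text parts plus an `isError` flag); the text payload shown is illustrative:

```json
{
  "content": [
    {
      "type": "text",
      "text": "Response content here"
    }
  ],
  "isError": false
}
```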
Error Example:
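Under the same assumed `ToolResult` shape, errors set `isError` and carry a structured message in the text part (the message below is illustrative):

```json
{
  "content": [
    {
      "type": "text",
      "text": "Error: Validation Error - invalid model ID provided"
    }
  ],
  "isError": true
}
```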
Available Tools
chat_completion
Send messages to OpenRouter.ai models:
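An argument sketch; the parameter names (`model`, `messages`, `temperature`) follow OpenAI-style chat-completion conventions and are assumptions about this tool's schema:

```json
{
  "model": "anthropic/claude-3.5-sonnet",
  "messages": [
    { "role": "user", "content": "Hello, world!" }
  ],
  "temperature": 0.7
}
```

If `model` is omitted, the server's configured default model (if any) would typically apply.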
search_models
Search and filter available models:
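An argument sketch covering the filters the feature list mentions (provider, capabilities, context length); every field name here is an assumption about the tool's schema:

```json
{
  "query": "claude",
  "provider": "anthropic",
  "capabilities": {
    "tools": true,
    "vision": false
  },
  "minContextLength": 32000
}
```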
get_model_info
Get detailed information about a specific model:
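An argument sketch, assuming the tool takes a single model-ID parameter (the parameter name is an assumption):

```json
{
  "model": "anthropic/claude-3.5-sonnet"
}
```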
validate_model
Check if a model ID is valid:
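An argument sketch, again assuming a single model-ID parameter (name assumed); an unrecognized ID would be reported rather than raised as a hard failure:

```json
{
  "model": "some-provider/nonexistent-model"
}
```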
Error Handling
The server provides structured errors with contextual information:
Common Error Categories:
- **Validation Error**: Invalid input parameters
- **API Error**: OpenRouter API communication issues
- **Rate Limit**: Request throttling detection
- **Internal Error**: Server-side processing failures
Handling Responses:
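A minimal consumer sketch, assuming the `ToolResult` shape described above (a `content` array of text parts plus an optional `isError` flag); the helper name `unwrapToolResult` is illustrative, not part of the server's API:

```typescript
// Assumed shape of the server's standardized responses.
interface ToolResult {
  content: { type: string; text: string }[];
  isError?: boolean;
}

// Join the text parts; surface tool-reported errors as exceptions.
function unwrapToolResult(result: ToolResult): string {
  const text = result.content.map((part) => part.text).join("\n");
  if (result.isError) {
    throw new Error(`Tool failed: ${text}`);
  }
  return text;
}
```

Checking `isError` before consuming `content` is the key step: the server reports tool-level failures in-band rather than as transport errors.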
Development
See CONTRIBUTING.md for detailed information about:
- Development setup
- Project structure
- Feature implementation
- Error handling guidelines
- Tool usage examples
Changelog
See CHANGELOG.md for recent updates including:
- Unified response format implementation
- Enhanced error handling system
- Type-safe interface improvements
License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.