# OpenRouter MCP Server
A Model Context Protocol (MCP) server providing seamless integration with OpenRouter.ai's diverse model ecosystem. Access various AI models through a unified, type-safe interface with built-in caching, rate limiting, and error handling.
## Features

- Model Access
  - Direct access to all OpenRouter.ai models
  - Automatic model validation and capability checking
  - Default model configuration support
- Performance Optimization
  - Smart model information caching (1-hour expiry)
  - Automatic rate limit management
  - Exponential backoff for failed requests
- Unified Response Format
  - Consistent `ToolResult` structure for all responses
  - Clear error identification with `isError` flag
  - Structured error messages with context
## Installation
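A likely installation path, assuming the server is published to npm; the package name below is inferred from the project and should be checked against the actual registry entry:

```bash
# Package name is an assumption; substitute the published name if it differs.
npm install @mcpservers/openrouterai
```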
## Configuration

### Prerequisites

- Get your OpenRouter API key from [OpenRouter Keys](https://openrouter.ai/keys)
- Choose a default model (optional)
### Environment Variables
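The server is configured through environment variables. Based on the Setup example below, the expected variables are (names assumed from the configuration sample):

- `OPENROUTER_API_KEY` (required): your OpenRouter API key
- `OPENROUTER_DEFAULT_MODEL` (optional): the model ID used when a request omits the `model` field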
### Setup
Add to your MCP settings configuration file (`cline_mcp_settings.json` or `claude_desktop_config.json`):
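A minimal sketch of the settings entry; the server key, command, and package name are assumptions and may differ in your setup:

```json
{
  "mcpServers": {
    "openrouterai": {
      "command": "npx",
      "args": ["@mcpservers/openrouterai"],
      "env": {
        "OPENROUTER_API_KEY": "your-api-key-here",
        "OPENROUTER_DEFAULT_MODEL": "optional-default-model-id"
      }
    }
  }
}
```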
## Response Format
All tools return responses in a standardized structure:
Success Example:
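A sketch of the assumed success shape, following the `ToolResult` structure described under Features:

```json
{
  "isError": false,
  "content": [
    {
      "type": "text",
      "text": "Response content here"
    }
  ]
}
```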
Error Example:
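And the corresponding error shape, identified by the `isError` flag:

```json
{
  "isError": true,
  "content": [
    {
      "type": "text",
      "text": "Error: Detailed error message with context"
    }
  ]
}
```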
## Available Tools

### chat_completion
Send messages to OpenRouter.ai models:
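A hypothetical invocation; the argument names (`model`, `messages`, `temperature`) are assumptions based on the OpenAI-style chat schema that OpenRouter exposes:

```json
{
  "model": "anthropic/claude-3.5-sonnet",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Hello!" }
  ],
  "temperature": 0.7
}
```

If a default model is configured via `OPENROUTER_DEFAULT_MODEL`, the `model` field can presumably be omitted.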
### search_models
Search and filter available models:
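A sketch of a search request; the filter names here are illustrative assumptions, not a confirmed schema:

```json
{
  "query": "claude",
  "provider": "anthropic",
  "minContextLength": 100000
}
```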
### get_model_info
Get detailed information about a specific model:
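This presumably takes a single model ID:

```json
{
  "model": "anthropic/claude-3.5-sonnet"
}
```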
### validate_model
Check if a model ID is valid:
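Likewise a single-argument call, returning whether the ID resolves to a known model:

```json
{
  "model": "some-provider/some-model"
}
```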
## Error Handling
The server provides structured errors with contextual information:
Common Error Categories:

- **Validation Error**: Invalid input parameters
- **API Error**: OpenRouter API communication issues
- **Rate Limit**: Request throttling detection
- **Internal Error**: Server-side processing failures
Handling Responses:
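A minimal TypeScript sketch of consuming a response; the `ToolResult` interface here is reconstructed from the examples above, not imported from the server:

```typescript
// Assumed shape of a tool response, matching the Response Format examples.
interface ToolResult {
  isError: boolean;
  content: Array<{ type: string; text: string }>;
}

// Extract the text on success; surface the structured message on failure.
function handleToolResult(result: ToolResult): string {
  if (result.isError) {
    // Error text carries the category and context, e.g. "Error: ..."
    throw new Error(result.content[0]?.text ?? "Unknown error");
  }
  return result.content.map((item) => item.text).join("\n");
}
```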
## Development
See CONTRIBUTING.md for detailed information about:
- Development setup
- Project structure
- Feature implementation
- Error handling guidelines
- Tool usage examples
## Changelog
See CHANGELOG.md for recent updates including:
- Unified response format implementation
- Enhanced error handling system
- Type-safe interface improvements
## License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.