This server provides access to multiple Large Language Model (LLM) APIs including ChatGPT, Claude, and DeepSeek through a Model Context Protocol (MCP) interface, along with Bitcoin and Lightning network operations.
LLM Capabilities:
Call individual LLMs: use tools like call-chatgpt, call-claude, and call-deepseek to send prompts to specific AI providers, with configurable parameters such as model, temperature, and token limits
Combine LLM responses: use call-all-llms to send the same prompt to all available LLMs simultaneously and receive combined output with individual responses and a summary
Dynamic provider selection: use call-llm to select an LLM provider ("chatgpt", "claude", or "deepseek") at runtime
Compare model outputs: facilitate multi-perspective analysis, model comparison, and quality assurance
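As a sketch of the runtime selection, an MCP tools/call request for call-llm might carry arguments like these (the provider values are documented above; prompt, temperature, and max_tokens are assumed parameter names):

```json
{
  "name": "call-llm",
  "arguments": {
    "provider": "deepseek",
    "prompt": "Summarize the trade-offs of proof-of-stake consensus",
    "temperature": 0.3,
    "max_tokens": 512
  }
}
```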
Bitcoin & Lightning Network Features:
Generate new Bitcoin key pairs and addresses
Validate Bitcoin addresses
Decode raw Bitcoin transactions from hexadecimal
Retrieve latest Bitcoin block information
Get specific Bitcoin transaction details using transaction ID
Decode BOLT11 Lightning invoices
Pay BOLT11 Lightning invoices
Configuration: Set environment variables for API keys (OPENAI_API_KEY, ANTHROPIC_API_KEY, DEEPSEEK_API_KEY) and default models for each provider.
Used for managing environment variables including API keys and default model configurations for the various LLM providers.
Provides access to OpenAI's ChatGPT API for generating responses from various GPT models with customizable parameters for temperature and token limits.
Implements schema validation for tool parameters to ensure proper formatting of requests to the different LLM APIs.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Cross-LLM MCP Server call ChatGPT to explain quantum computing in simple terms"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
Cross-LLM MCP Server
Access multiple LLM APIs from one place. Call ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, and Mistral with intelligent model selection, preferences, and prompt logging.
An MCP (Model Context Protocol) server that provides unified access to multiple Large Language Model APIs for AI coding environments like Cursor and Claude Desktop.
Why Use Cross-LLM MCP?
8 LLM Providers – ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, Mistral
Smart Model Selection – Tag-based preferences (coding, business, reasoning, math, creative, general)
Prompt Logging – Track all prompts with history, statistics, and analytics
Cost Optimization – Choose flagship or cheaper models based on preference
Easy Setup – One-click install in Cursor or simple manual setup
Call All LLMs – Get responses from all providers simultaneously
Related MCP server: URL Fetch MCP
Quick Start
Ready to access multiple LLMs? Install in seconds:
Install in Cursor (Recommended):
Or install manually:
Features
Individual LLM Tools
call-chatgpt – OpenAI's ChatGPT API
call-claude – Anthropic's Claude API
call-deepseek – DeepSeek API
call-gemini – Google's Gemini API
call-grok – xAI's Grok API
call-kimi – Moonshot AI's Kimi API
call-perplexity – Perplexity AI API
call-mistral – Mistral AI API
Combined Tools
call-all-llms – Call all LLMs with the same prompt
call-llm – Call a specific provider by name
Preferences & Model Selection
get-user-preferences – Get current preferences
set-user-preferences – Set default model, cost preference, and tag-based preferences
get-models-by-tag – Find models by tag (coding, business, reasoning, math, creative, general)
Prompt Logging
get-prompt-history – View prompt history with filters
get-prompt-stats – Get statistics about prompt logs
delete-prompt-entries – Delete log entries by criteria
clear-prompt-history – Clear all prompt logs
Installation
Cursor (One-Click)
Click the install link above or use:
After installation, add your API keys in Cursor settings (see Configuration below).
Manual Installation
Requirements: Node.js 18+ and npm
Claude Desktop
Add to claude_desktop_config.json:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
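A minimal configuration sketch, assuming the server entry is named cross-llm and the built entry point is build/index.js (adjust the path and env block to your setup; the mcpServers key is Claude Desktop's standard config format):

```json
{
  "mcpServers": {
    "cross-llm": {
      "command": "node",
      "args": ["/path/to/cross-llm-mcp/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "ANTHROPIC_API_KEY": "sk-ant-...",
        "DEEPSEEK_API_KEY": "sk-..."
      }
    }
  }
}
```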
Restart Claude Desktop after configuration.
Configuration
API Keys
Set environment variables for the LLM providers you want to use:
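For example, in your shell profile (placeholder values shown; only OPENAI_API_KEY, ANTHROPIC_API_KEY, and DEEPSEEK_API_KEY are named in this document — check the project docs for the variable names used by the remaining providers):

```shell
# Placeholder keys for illustration; replace with your real values.
export OPENAI_API_KEY="sk-your-openai-key"       # ChatGPT
export ANTHROPIC_API_KEY="sk-ant-your-key"       # Claude
export DEEPSEEK_API_KEY="sk-your-deepseek-key"   # DeepSeek
```

Only the providers you set keys for will be available.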
Getting API Keys
OpenAI: https://platform.openai.com/
Anthropic: https://console.anthropic.com/
DeepSeek: https://platform.deepseek.com/
Google Gemini: https://makersuite.google.com/app/apikey
xAI Grok: https://console.x.ai/
Moonshot AI: https://platform.moonshot.ai/
Perplexity: https://www.perplexity.ai/hub
Mistral: https://console.mistral.ai/
Usage Examples
Call ChatGPT
Get a response from OpenAI:
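In the chat you would simply ask, e.g., "call ChatGPT to explain quantum computing in simple terms"; this might map to an MCP tools/call like the following sketch (prompt and max_tokens are assumed parameter names; model and temperature are documented above):

```json
{
  "name": "call-chatgpt",
  "arguments": {
    "prompt": "Explain quantum computing in simple terms",
    "model": "gpt-4o",
    "temperature": 0.7,
    "max_tokens": 1024
  }
}
```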
Call All LLMs
Get responses from all providers:
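A sketch of the equivalent tools/call; per the description above, the result combines each provider's individual response with a summary:

```json
{
  "name": "call-all-llms",
  "arguments": {
    "prompt": "What are the main trade-offs of microservices?"
  }
}
```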
Set Tag-Based Preferences
Automatically use the best model for each task type:
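A hedged sketch of a set-user-preferences call; the exact field names are assumptions, but the documented inputs are a default model, a cost preference, and per-tag model choices:

```json
{
  "name": "set-user-preferences",
  "arguments": {
    "defaultModel": "gpt-4o-mini",
    "costPreference": "cheap",
    "tagPreferences": {
      "coding": "deepseek-coder",
      "reasoning": "deepseek-r1"
    }
  }
}
```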
Get Prompt History
View your prompt logs:
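For example (the tool is documented as supporting filters; filter names such as provider and limit are assumptions):

```json
{
  "name": "get-prompt-history",
  "arguments": {
    "provider": "chatgpt",
    "limit": 20
  }
}
```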
Model Tags
Models are tagged by their strengths:
coding: deepseek-r1, deepseek-coder, gpt-4o, claude-3.5-sonnet-20241022
business: claude-3-opus-20240229, gpt-4o, gemini-1.5-pro
reasoning: deepseek-r1, o1-preview, claude-3.5-sonnet-20241022
math: deepseek-r1, o1-preview, o1-mini
creative: gpt-4o, claude-3-opus-20240229, gemini-1.5-pro
general: gpt-4o-mini, claude-3-haiku-20240307, gemini-1.5-flash
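These mappings can also be queried at runtime with get-models-by-tag, e.g. (the tag argument name is an assumption):

```json
{
  "name": "get-models-by-tag",
  "arguments": {
    "tag": "coding"
  }
}
```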
Use Cases
Multi-Perspective Analysis – Get different perspectives from multiple LLMs
Model Comparison – Compare responses to understand strengths and weaknesses
Cost Optimization – Choose the most cost-effective model for each task
Quality Assurance – Cross-reference responses from multiple models
Intelligent Selection – Automatically use the best model for coding, business, reasoning, etc.
Prompt Analytics – Track usage, costs, and patterns with automatic logging
Technical Details
Built with: Node.js, TypeScript, MCP SDK
Dependencies: @modelcontextprotocol/sdk, superagent, zod
Platforms: macOS, Windows, Linux
Preference Storage:
Unix/macOS: ~/.cross-llm-mcp/preferences.json
Windows: %APPDATA%/cross-llm-mcp/preferences.json
Prompt Log Storage:
Unix/macOS: ~/.cross-llm-mcp/prompts.json
Windows: %APPDATA%/cross-llm-mcp/prompts.json
Contributing
⭐ If this project helps you, please star it on GitHub! ⭐
Contributions welcome! Please open an issue or submit a pull request.
License
MIT License – see LICENSE.md for details.
Support
If you find this project useful, consider supporting it:
⚡ Lightning Network
₿ Bitcoin: bc1ptzvr93pn959xq4et6sqzpfnkk2args22ewv5u2th4ps7hshfaqrshe0xtp
Ξ Ethereum/EVM: 0x42ea529282DDE0AA87B42d9E83316eb23FE62c3f