
LLM Bridge MCP

by sjquant

LLM Bridge MCP allows AI agents to interact with multiple large language models through a standardized interface. It leverages the Model Context Protocol (MCP) to provide seamless access to different LLM providers, making it easy to switch between models or use several models in the same application.

Features

  • Unified interface to multiple LLM providers:
    • OpenAI (GPT models)
    • Anthropic (Claude models)
    • Google (Gemini models)
    • DeepSeek
    • ...
  • Built with Pydantic AI for type safety and validation (sketched below)
  • Support for customizable parameters such as temperature and max tokens
  • Usage tracking and metrics
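
For orientation, the snippet below sketches the Pydantic AI pattern these features describe. It is illustrative only, not the server's actual source; the provider-prefixed model string and the model_settings fields follow pydantic-ai's public API, and the prompt text is made up.

# Minimal sketch of the pydantic-ai pattern behind the features above.
# Not the server's implementation; names follow pydantic-ai's public API.
import asyncio

from pydantic_ai import Agent

async def main() -> None:
    # Provider-prefixed model names ("openai:...", "anthropic:...") are how
    # pydantic-ai selects a backend, so switching providers is a string change.
    agent = Agent("openai:gpt-4o-mini", system_prompt="You are a concise assistant.")
    result = await agent.run(
        "Summarize the Model Context Protocol in one sentence.",
        model_settings={"temperature": 0.7, "max_tokens": 256},
    )
    print(result.output)   # validated output (`result.data` in older pydantic-ai releases)
    print(result.usage())  # token usage metrics

asyncio.run(main())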

Tools

The server implements the following tool; a client-side call sketch follows the parameter list:

run_llm(
    prompt: str,
    model_name: KnownModelName = "openai:gpt-4o-mini",
    temperature: float = 0.7,
    max_tokens: int = 8192,
    system_prompt: str = "",
) -> LLMResponse
  • prompt: the text prompt to send to the LLM
  • model_name: the specific model to use (default: "openai:gpt-4o-mini")
  • temperature: controls randomness (0.0 to 1.0)
  • max_tokens: maximum number of tokens to generate
  • system_prompt: optional system prompt to guide the model's behavior
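
An MCP client can launch the server and invoke this tool roughly as follows. This is a hedged sketch using the official mcp Python SDK; the prompt and parameter values are only examples, and the command/args mirror the configuration shown below.

# Sketch of calling run_llm from an MCP client via the official `mcp` Python SDK.
# Assumes the server is launched with uvx, as in the configuration below.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    server = StdioServerParameters(command="uvx", args=["llm-bridge-mcp"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "run_llm",
                {
                    "prompt": "Explain the Model Context Protocol in one sentence.",
                    "model_name": "openai:gpt-4o-mini",  # example value
                    "temperature": 0.2,
                },
            )
            print(result.content)

asyncio.run(main())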

Installation

Installing via Smithery

To install llm-bridge-mcp for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @sjquant/llm-bridge-mcp --client claude

Manual Installation

  1. Clone the repository:

git clone https://github.com/yourusername/llm-bridge-mcp.git
cd llm-bridge-mcp

  2. Install uv (if not already installed):

# On macOS
brew install uv

# On Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# On Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Configuration

Create a .env file in the root directory with your API keys:

OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
GOOGLE_API_KEY=your_google_api_key
DEEPSEEK_API_KEY=your_deepseek_api_key
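
The server presumably reads these keys from the environment at startup, e.g. with python-dotenv; the following is a minimal sketch of that pattern, not necessarily the project's exact loading code.

# Sketch of loading the .env keys (assumes python-dotenv is installed).
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

for key in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GOOGLE_API_KEY", "DEEPSEEK_API_KEY"):
    if not os.getenv(key):
        print(f"warning: {key} is not set")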

Usage

Using with Claude Desktop or Cursor

Add a server entry to your Claude Desktop configuration file or .cursor/mcp.json:

"mcpServers": {
  "llm-bridge": {
    "command": "uvx",
    "args": [
      "llm-bridge-mcp"
    ],
    "env": {
      "OPENAI_API_KEY": "your_openai_api_key",
      "ANTHROPIC_API_KEY": "your_anthropic_api_key",
      "GOOGLE_API_KEY": "your_google_api_key",
      "DEEPSEEK_API_KEY": "your_deepseek_api_key"
    }
  }
}

Troubleshooting

Common Issues

1. "spawn uvx ENOENT" error

This error occurs when the system cannot find the uvx executable in your PATH. To resolve it:

Solution: Use the full path to uvx

Find the full path to the uvx executable:

# On macOS/Linux
which uvx

# On Windows
where.exe uvx
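
Alternatively, a single cross-platform lookup works from Python's standard library; shutil.which searches PATH the same way which/where.exe do.

# Cross-platform PATH lookup for uvx using only the standard library.
import shutil

path = shutil.which("uvx")
print(path or "uvx not found on PATH")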

Then update your MCP server configuration to use the full path:

"mcpServers": {
  "llm-bridge": {
    "command": "/full/path/to/uvx", // Replace with your actual path
    "args": [
      "llm-bridge-mcp"
    ],
    "env": {
      // ... your environment variables
    }
  }
}

License

This project is licensed under the MIT License - see the LICENSE file for details.


