OpenAI MCP Server

by thadius83
MIT License
  • Apple
  • Linux

Integrations

  • Allows querying OpenAI models (o3-mini and gpt-4o-mini) directly from Claude using the MCP protocol, enabling users to ask questions and receive responses from OpenAI's AI models

OpenAI MCP Server

Query OpenAI models directly from Claude using the MCP protocol. This fork adds support for the o3-mini and gpt-4o-mini models and improves message handling.

Cline Auto Install

Set up the MCP server from https://github.com/thadius83/mcp-server-openai. Use "github.com/thadius83/mcp-server-openai" as the server name in cline_mcp_settings.json. Analyse the readme and instructions below. Do not create new files in the repo; utilise the findings from pyproject.toml, src/mcp_server_openai/server.py and src/mcp_server_openai/llm.py. Once installed, demonstrate the server's capabilities by using one of its tools.

Installation Steps:

# Clone the repository
git clone https://github.com/thadius83/mcp-server-openai
cd mcp-server-openai

# Install the package
pip install .

MCP Settings Configuration:

The cline_mcp_settings.json should be configured with:

  • Correct server name format: "github.com/thadius83/mcp-server-openai"
  • Python module path structure for the server
  • PYTHONPATH environment variable pointing to the project directory
  • OpenAI API key passed as a command line argument

Example configuration:

{
  "mcpServers": {
    "github.com/thadius83/mcp-server-openai": {
      "command": "python",
      "args": [
        "-m",
        "src.mcp_server_openai.server",
        "--openai-api-key",
        "your-openai-api-key"
      ],
      "env": {
        "PYTHONPATH": "/path/to/mcp-server-openai"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}

Requirements:

  • Python >= 3.10
  • OpenAI API key
  • Dependencies installed via pip (mcp>=0.9.1, openai>=1.0.0, click>=8.0.0, pytest-asyncio)

Available Tools:

  • Tool Name: ask-openai
  • Description: Ask OpenAI assistant models a direct question
  • Models Available: o3-mini (default), gpt-4o-mini
  • Input Schema:
    {
      "query": "Your question here",
      "model": "o3-mini"  // optional, defaults to o3-mini
    }
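For orientation, the sketch below shows how a tool with the schema above can be wired up with the low-level MCP Python SDK. It is not the repository's actual server.py: the ask_openai helper, the key handling (environment variable instead of the --openai-api-key flag), and the overall structure are illustrative assumptions only.

# Minimal sketch only - not the repository's actual server.py.
# Assumes the low-level MCP Python SDK (mcp>=0.9) and openai>=1.0.0;
# reads OPENAI_API_KEY from the environment instead of the --openai-api-key flag.
import asyncio

import mcp.types as types
from mcp.server import Server
from mcp.server.stdio import stdio_server
from openai import AsyncOpenAI

server = Server("github.com/thadius83/mcp-server-openai")
client = AsyncOpenAI()  # picks up OPENAI_API_KEY from the environment

async def ask_openai(query: str, model: str = "o3-mini") -> str:
    # Hypothetical helper standing in for llm.py; no temperature is passed.
    response = await client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": query}]
    )
    return response.choices[0].message.content or ""

@server.list_tools()
async def list_tools() -> list[types.Tool]:
    # Advertise the single ask-openai tool with the input schema shown above.
    return [
        types.Tool(
            name="ask-openai",
            description="Ask OpenAI assistant models a direct question",
            inputSchema={
                "type": "object",
                "properties": {
                    "query": {"type": "string"},
                    "model": {"type": "string", "enum": ["o3-mini", "gpt-4o-mini"]},
                },
                "required": ["query"],
            },
        )
    ]

@server.call_tool()
async def call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    if name != "ask-openai":
        raise ValueError(f"Unknown tool: {name}")
    answer = await ask_openai(arguments["query"], arguments.get("model", "o3-mini"))
    return [types.TextContent(type="text", text=answer)]

async def main() -> None:
    # Serve over stdio, which is how Cline and Claude Desktop launch the process.
    async with stdio_server() as (read_stream, write_stream):
        await server.run(read_stream, write_stream, server.create_initialization_options())

if __name__ == "__main__":
    asyncio.run(main())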

Features

  • Direct integration with OpenAI's API
  • Support for multiple models:
    • o3-mini (default): optimized for concise responses
    • gpt-4o-mini: enhanced model for more detailed responses
  • Configurable message formatting
  • Error handling and logging (one plausible shape is sketched after this list)
  • Simple interface via the MCP protocol
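The snippet below illustrates the error-handling and logging bullet around a model call. It is only a plausible shape, assuming the openai>=1.0.0 client; the names query_openai and mcp_server_openai (for the logger) are illustrative and not taken from llm.py.

# Illustrative only: error handling and logging around the OpenAI call.
# Names (query_openai, logger) are assumptions, not the repository's code.
import logging
from openai import AsyncOpenAI, OpenAIError

logger = logging.getLogger("mcp_server_openai")
client = AsyncOpenAI()  # assumes OPENAI_API_KEY is set; the real server passes the key explicitly

async def query_openai(query: str, model: str = "o3-mini") -> str:
    try:
        # No temperature argument: this fork removed it for compatibility
        # with models that do not accept it.
        response = await client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": query}],
        )
        return response.choices[0].message.content or ""
    except OpenAIError:
        logger.exception("OpenAI request failed (model=%s)", model)
        raise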

Installation

Installing via Smithery

To install OpenAI MCP Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @thadius83/mcp-server-openai --client claude

Manual Installation

  1. Clone the Repository
git clone https://github.com/thadius83/mcp-server-openai.git
cd mcp-server-openai

# Install dependencies
pip install -e .
  2. Configure Claude Desktop

Add this server to your existing MCP settings configuration. Note: keep any existing MCP servers in the configuration and simply add this one alongside them.

Location:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%/Claude/claude_desktop_config.json
  • Linux: check your home directory (~/) for the default MCP settings location
{ "mcpServers": { // ... keep your existing MCP servers here ... "github.com/thadius83/mcp-server-openai": { "command": "python", "args": ["-m", "src.mcp_server_openai.server", "--openai-api-key", "your-key-here"], "env": { "PYTHONPATH": "/path/to/your/mcp-server-openai" } } } }
  3. Get an OpenAI API Key
    • Visit OpenAI's website
    • Create an account or log in
    • Navigate to the API settings
    • Generate a new API key
    • Add the key to your configuration file as shown above
  4. Restart Claude
    • After updating the configuration, restart Claude for the changes to take effect

Usage

The server provides a single tool, ask-openai, which can be used to query OpenAI models. You can call it directly from Claude with the use_mcp_tool command:

<use_mcp_tool>
  <server_name>github.com/thadius83/mcp-server-openai</server_name>
  <tool_name>ask-openai</tool_name>
  <arguments>
    {
      "query": "What are the key features of Python's asyncio library?",
      "model": "o3-mini"  // Optional, defaults to o3-mini
    }
  </arguments>
</use_mcp_tool>

Model Comparison

  1. o3-mini (default)
    • Best for: quick, concise answers
    • Style: direct and efficient
    • Example response:
      Python's asyncio provides non-blocking, collaborative multitasking. Key features:
      1. Event Loop – Schedules and runs asynchronous tasks
      2. Coroutines – Functions you can pause and resume
      3. Tasks – Run coroutines concurrently
      4. Futures – Represent future results
      5. Non-blocking I/O – Efficient handling of I/O operations
  2. gpt-4o-mini
    • Best for: more comprehensive explanations
    • Style: detailed and thorough
    • Example response:
      Python's asyncio library provides a comprehensive framework for asynchronous programming. It includes an event loop for managing tasks, coroutines for writing non-blocking code, tasks for concurrent execution, futures for handling future results, and efficient I/O operations. The library also provides synchronization primitives and high-level APIs for network programming.

Response Format

The tool returns responses in a standardized format:

{ "content": [ { "type": "text", "text": "Response from the model..." } ] }

Troubleshooting

  1. Server Not Found
    • Verify that the PYTHONPATH in your configuration points to the correct directory
    • Ensure Python and pip are installed correctly
    • Try running python -m src.mcp_server_openai.server --openai-api-key your-key-here directly to check for errors
  2. Authentication Errors
    • Check that your OpenAI API key is valid
    • Ensure the key is passed correctly in the args array
    • Verify there are no extra spaces or characters in the key (a standalone key check is sketched after this list)
  3. Model Errors
    • Confirm you are using a supported model (o3-mini or gpt-4o-mini)
    • Check that your query is not empty
    • Make sure you are not exceeding token limits
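If authentication is the suspect, it can help to test the key outside the MCP stack first. A minimal check, assuming the openai>=1.0.0 Python client; the model name is just whichever supported model you want to test, and the file name is arbitrary:

# Quick standalone check that an OpenAI API key works, independent of MCP.
# Run with: OPENAI_API_KEY=your-key python check_key.py
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(response.choices[0].message.content)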

Development

# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest -v test_openai.py -s
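As a rough sketch of what an async test in this style can look like (this is not the repository's test_openai.py; it assumes pytest-asyncio and uses a hypothetical ask_openai coroutine, and it needs a valid OPENAI_API_KEY and network access to pass):

# Illustrative async test shape, assuming pytest-asyncio and openai>=1.0.0.
# ask_openai is a hypothetical stand-in for the server's LLM helper.
import pytest
from openai import AsyncOpenAI

async def ask_openai(query: str, model: str = "o3-mini") -> str:
    client = AsyncOpenAI()
    response = await client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": query}]
    )
    return response.choices[0].message.content or ""

@pytest.mark.asyncio
async def test_ask_openai_returns_text():
    answer = await ask_openai("Reply with the single word: ok")
    assert isinstance(answer, str) and answer.strip()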

Changes from Original

  • Added support for the o3-mini and gpt-4o-mini models
  • Improved message formatting
  • Removed the temperature parameter for better compatibility
  • Updated documentation with detailed usage examples
  • Added model comparison and response examples
  • Enhanced installation instructions
  • Added a troubleshooting guide

License

MIT License



remote-capable server

The server can be hosted and run remotely because it primarily relies on remote services or has no dependency on the local environment.

Integrates OpenAI models via the MCP protocol, providing either concise or detailed responses when used with Claude Desktop.

  1. Cline Auto Install
  2. Features
  3. Installation
    1. Installing via Smithery
    2. Manual Installation
  4. Usage
    1. Model Comparison
    2. Response Format
  5. Troubleshooting
  6. Development
  7. Changes from Original
  8. License


              ID: fzgyk9mw5e