OpenAI MCP Server
Query OpenAI models directly from Claude using the MCP protocol. This fork adds support for the o3-mini and gpt-4o-mini models, along with improved message handling.
Features
- Direct integration with OpenAI's API
- Support for multiple models:
  - o3-mini (default): Optimized for concise responses
  - gpt-4o-mini: Enhanced model for more detailed responses
- Configurable message formatting
- Error handling and logging
- Simple interface through MCP protocol
Installation
Installing via Smithery
To install OpenAI MCP Server for Claude Desktop automatically via Smithery:
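A typical Smithery CLI invocation looks like the sketch below; the package identifier is a placeholder, so use the name this server is listed under on Smithery:

```bash
# <server-package-name> is a placeholder for this fork's Smithery listing
npx -y @smithery/cli install <server-package-name> --client claude
```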
Manual Installation
- Clone the Repository:
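A sketch of this step, with a placeholder URL and directory name rather than the fork's actual location:

```bash
# Replace the placeholder URL with this fork's actual repository
git clone https://github.com/<your-username>/openai-mcp-server.git
cd openai-mcp-server
```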
- Configure Claude Desktop:
Add this server to your existing MCP settings configuration; a sketch follows the file locations below. Note: keep any existing MCP servers in the configuration and simply add this one alongside them.
Location:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
- Linux: check your home directory (`~/`) for the default MCP settings location
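A minimal configuration sketch, assuming the server is registered under the entry name "openai" and launched with the same command shown in the Troubleshooting section; the entry name, repository path, and key value are placeholders to adapt to your setup:

```json
{
  "mcpServers": {
    "openai": {
      "command": "python",
      "args": [
        "-m",
        "src.mcp_server_openai.server",
        "--openai-api-key",
        "your-key-here"
      ],
      "env": {
        "PYTHONPATH": "/path/to/openai-mcp-server"
      }
    }
  }
}
```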
- Get an OpenAI API Key:
  - Visit OpenAI's website
  - Create an account or log in
  - Navigate to API settings
  - Generate a new API key
  - Add the key to your configuration file as shown above
- Restart Claude:
  - After updating the configuration, restart Claude for the changes to take effect
Usage
The server provides a single tool, `ask-openai`, that can be used to query OpenAI models. You can use it directly in Claude with the `use_mcp_tool` command:
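A sketch of such a call follows; the server name ("openai") and the argument names ("query" and "model") are assumptions, so check the tool's schema if the call is rejected:

```
<use_mcp_tool>
<server_name>openai</server_name>
<tool_name>ask-openai</tool_name>
<arguments>
{
  "query": "Explain the difference between TCP and UDP",
  "model": "o3-mini"
}
</arguments>
</use_mcp_tool>
```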
Model Comparison
- o3-mini (default)
  - Best for: Quick, concise answers
  - Style: Direct and efficient
- gpt-4o-mini
  - Best for: More comprehensive explanations
  - Style: Detailed and thorough
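Assuming the tool exposes the model choice as an argument (the name "model" is an assumption of this sketch, not confirmed by the tool schema), switching from the default to gpt-4o-mini only changes that one field:

```json
{
  "query": "Why is the sky blue?",
  "model": "gpt-4o-mini"
}
```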
Response Format
The tool returns responses in a standardized format:
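The exact fields depend on the server implementation, but a typical MCP tool result wraps the model's reply in a content array of text items, roughly like this sketch:

```json
{
  "content": [
    {
      "type": "text",
      "text": "The model's answer appears here."
    }
  ]
}
```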
Troubleshooting
- Server Not Found:
  - Verify the PYTHONPATH in your configuration points to the correct directory
  - Ensure Python and pip are properly installed
  - Try running `python -m src.mcp_server_openai.server --openai-api-key your-key-here` directly to check for errors
- Authentication Errors:
  - Check that your OpenAI API key is valid
  - Ensure the key is correctly passed in the args array
  - Verify there are no extra spaces or characters in the key
- Model Errors:
  - Confirm you're using supported models (o3-mini or gpt-4o-mini)
  - Check that your query isn't empty
  - Ensure you're not exceeding token limits
Development
Changes from Original
- Added support for o3-mini and gpt-4o-mini models
- Improved message formatting
- Removed temperature parameter for better compatibility
- Updated documentation with detailed usage examples
- Added model comparison and response examples
- Enhanced installation instructions
- Added troubleshooting guide
License
MIT License