With the OpenAI MCP Server, you can query OpenAI models directly from Claude using the MCP protocol.
- **Ask OpenAI models questions**: use the `ask-openai` endpoint to interact with GPT-4 or GPT-3.5-turbo models.
- **Customize query parameters**: control response length with `max_tokens` (1-4000) and creativity/randomness with `temperature` (0-2); see the sketch after this list.
- **Integration with Claude**: configure the server in Claude Desktop for seamless use of OpenAI models.
- **Local development**: clone, install, and test the server locally for debugging or customization.
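The exact tool schema isn't reproduced on this page, so the snippet below is only a minimal sketch of driving the server from the official `mcp` Python SDK. The tool name `ask-openai`, the argument names (`query`, `model`, `max_tokens`, `temperature`), and the launch command are assumptions based on the feature list above; adjust them to match your installation.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # How the server is launched is an assumption -- point command/args
    # at however you installed the server locally.
    server = StdioServerParameters(
        command="python",
        args=["-m", "mcp_server_openai"],   # assumed module name
        env={"OPENAI_API_KEY": "sk-..."},   # the OpenAI SDK conventionally reads this variable
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names follow the feature list above; check
            # `await session.list_tools()` for the authoritative schema.
            result = await session.call_tool(
                "ask-openai",
                arguments={
                    "query": "Summarize the MCP protocol in one sentence.",
                    "model": "gpt-4",
                    "temperature": 0.7,  # 0-2: higher = more random
                    "max_tokens": 200,   # 1-4000: caps response length
                },
            )
            print(result.content)


asyncio.run(main())
```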
# OpenAI MCP Server

Query OpenAI models directly from Claude using the MCP protocol.

## Setup

Add the server to `claude_desktop_config.json`:
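The original configuration snippet isn't preserved here; the following is a sketch of the usual shape of an MCP server entry, assuming the server runs as a local Python module named `mcp_server_openai` and reads its key from `OPENAI_API_KEY`. Swap in the command, args, and key for your own setup.

```json
{
  "mcpServers": {
    "openai": {
      "command": "python",
      "args": ["-m", "mcp_server_openai"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```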
## Development
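A typical local workflow, sketched under the assumption that this is a pip-installable Python project; the repository URL below is a placeholder.

```bash
# Clone the repository (placeholder URL) and install it in editable mode.
git clone https://github.com/<owner>/openai-mcp-server.git
cd openai-mcp-server
python -m venv .venv && source .venv/bin/activate
pip install -e .

# The OpenAI SDK conventionally reads credentials from this variable.
export OPENAI_API_KEY="your-openai-api-key"
```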
## Testing
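The project's test tooling isn't shown on this page. Assuming a standard pytest setup, a local run looks like the following; the MCP Inspector line gives a quick interactive check against the module name assumed in the Setup sketch above.

```bash
# Run the test suite; tests that hit the OpenAI API need a valid key.
export OPENAI_API_KEY="your-openai-api-key"
pytest -v

# Optional: exercise the tool interactively via the MCP Inspector.
npx @modelcontextprotocol/inspector python -m mcp_server_openai
```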
## License
MIT License
## Related MCP Servers
- A Model Context Protocol (MCP) server that lets you seamlessly use OpenAI's models right from Claude. (MIT License)
- Enables integration with OpenAI models through the MCP protocol, supporting concise and detailed responses for use with Claude Desktop. (MIT License)
- A lightweight bridge that wraps OpenAI's built-in tools (like web search and code interpreter) as Model Context Protocol servers, enabling their use with Claude and other MCP-compatible models. (MIT License)
- A Model Context Protocol server that enables Claude users to access specialized OpenAI agents (web search, file search, computer actions) and a multi-agent orchestrator through the MCP protocol.