With the OpenAI MCP Server, you can query OpenAI models directly from Claude using the MCP protocol.
- Ask OpenAI models questions: use the ask-openai endpoint to interact with GPT-4 or GPT-3.5-turbo models.
- Customize query parameters: control response length with max_tokens (1-4000) and creativity/randomness with temperature (0-2).
- Integration with Claude: configure the server in Claude Desktop for seamless use of OpenAI models.
- Local development: clone, install, and test the server locally for debugging or customization.
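To illustrate the parameters above, here is a minimal sketch of how a query might be validated and forwarded to OpenAI's Chat Completions API. The build_request helper is hypothetical (not part of the server's actual code); the max_tokens and temperature ranges match those documented above.

```python
def build_request(question: str, model: str = "gpt-4",
                  max_tokens: int = 500, temperature: float = 0.7) -> dict:
    """Validate parameters and build a Chat Completions request payload.

    Illustrative helper only; the real server's internals may differ.
    """
    if not 1 <= max_tokens <= 4000:
        raise ValueError("max_tokens must be between 1 and 4000")
    if not 0 <= temperature <= 2:
        raise ValueError("temperature must be between 0 and 2")
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

payload = build_request("What is MCP?", max_tokens=200, temperature=0.3)
print(payload["model"])  # prints: gpt-4
```

A payload like this would then be sent to OpenAI (e.g. via the official openai Python client) and the completion returned to Claude over MCP.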
OpenAI MCP Server
Query OpenAI models directly from Claude using the MCP protocol.
Setup
Add to claude_desktop_config.json:
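A minimal sketch of the entry, using Claude Desktop's standard mcpServers format. The server key name, command, and module path below are assumptions; adjust them to match how you installed this server, and supply your own API key.

```json
{
  "mcpServers": {
    "openai": {
      "command": "python",
      "args": ["-m", "mcp_server_openai"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file so the new server is picked up.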
Development
Testing
License
MIT License
Hybrid server: able to function both locally and remotely, depending on the configuration or use case.
Related MCP Servers
- A Model Context Protocol (MCP) server that lets you seamlessly use OpenAI's models right from Claude. (MIT License)
- Enables integration with OpenAI models through the MCP protocol, supporting concise and detailed responses for use with Claude Desktop. (MIT License)
- A lightweight bridge that wraps OpenAI's built-in tools (like web search and code interpreter) as Model Context Protocol servers, enabling their use with Claude and other MCP-compatible models. (MIT License)
- A Model Context Protocol server that enables Claude users to access specialized OpenAI agents (web search, file search, computer actions) and a multi-agent orchestrator through the MCP protocol.