A Model Context Protocol (MCP) server that lets you seamlessly use OpenAI's models right from Claude.
With the OpenAI MCP Server, you can query OpenAI models directly from Claude using the MCP protocol.
Features:

- An `ask-openai` endpoint to interact with GPT-4 or GPT-3.5-turbo models (see the example call below)
- Control over response length with `max_tokens` (1-4000) and creativity/randomness with `temperature` (0-2)
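For illustration, a call to the `ask-openai` tool over MCP might look like the JSON-RPC request below. The `max_tokens` and `temperature` parameters are the ones listed above; the `query` and `model` argument names are assumptions for this sketch and may differ from the server's actual tool schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask-openai",
    "arguments": {
      "query": "Summarize the Model Context Protocol in two sentences.",
      "model": "gpt-4",
      "max_tokens": 500,
      "temperature": 0.7
    }
  }
}
```

In practice Claude issues this call for you once the server is registered; the request shape is shown only to make the parameters concrete.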
Add to `claude_desktop_config.json`:
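A minimal sketch of what the entry could look like, assuming a local Node.js launch; the server key, command, file path, and API key value are illustrative placeholders rather than the project's documented values:

```json
{
  "mcpServers": {
    "openai": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-server-openai/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file so the new server entry is loaded.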
MIT License
This is a hybrid server: it can function both locally and remotely, depending on the configuration or use case.
We provide all the information about MCP servers via our MCP API.
```bash
curl -X GET 'https://glama.ai/api/mcp/v1/servers/pierrebrunelle/mcp-server-openai'
```
If you have feedback or need assistance with the MCP directory API, please join our Discord server.