Adapts the MCP quickstart server to work with OpenAI's chat completions and responses APIs.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@MCP Quickstart Weather Server what's the forecast for New York this weekend?"
That's it! The server will respond to your query, and you can continue using it as needed.
Here is a step-by-step guide with screenshots.
MCP Quickstart
A basic MCP server from the Model Context Protocol (MCP) Quickstart Guide, adapted to work with the OpenAI chat completions and responses APIs.
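For context, weather.py is the basic weather server from the quickstart guide. A stripped-down sketch of that style of server is shown below; the tool body and the National Weather Service endpoint are illustrative, not a copy of the repository's code:

# weather_sketch.py -- illustrative FastMCP server in the style of the quickstart's weather.py
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")


@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    """Return a short forecast for the given coordinates (US only)."""
    async with httpx.AsyncClient() as client:
        # The NWS API first maps coordinates to a gridpoint, then serves the forecast.
        points = await client.get(
            f"https://api.weather.gov/points/{latitude},{longitude}",
            headers={"User-Agent": "mcp-quickstart-sketch"},
        )
        forecast_url = points.json()["properties"]["forecast"]
        forecast = await client.get(
            forecast_url, headers={"User-Agent": "mcp-quickstart-sketch"}
        )
        period = forecast.json()["properties"]["periods"][0]
        return f"{period['name']}: {period['detailedForecast']}"


if __name__ == "__main__":
    # stdio transport matches the "uv run weather.py" launch command in the config below.
    mcp.run(transport="stdio")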
Notes
MCP configuration
{ "mcpServers": { "weather": { "command": "uv", "args": [ "--directory", "<project_folder>", "run", "weather.py" ] } } }MCP inspector was very helpful in troubleshooting basic configuration issues
Claude Desktop works fine with this server.
I was not able to make this server work in PyCharm > AI Assistant > Model Context Protocol (MCP).
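The adaptation itself essentially comes down to translating the tool definitions that the MCP server exposes into OpenAI's tools schema and routing any tool calls back through the MCP session. The sketch below shows that bridge against the chat completions API; it uses the official mcp and openai Python packages, but the model name, prompt, and single-round tool handling are illustrative rather than the repository's actual client code (see the API reference links below):

# client_sketch.py -- illustrative bridge between an MCP stdio server and the chat completions API
import asyncio
import json

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from openai import AsyncOpenAI


async def main() -> None:
    server = StdioServerParameters(
        command="uv", args=["--directory", "<project_folder>", "run", "weather.py"]
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Convert MCP tool definitions into the OpenAI function-tool schema.
            tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": t.inputSchema,
                    },
                }
                for t in (await session.list_tools()).tools
            ]

            client = AsyncOpenAI()
            messages = [
                {"role": "user", "content": "What's the forecast for New York this weekend?"}
            ]
            response = await client.chat.completions.create(
                model="gpt-4o-mini",  # illustrative model name
                messages=messages,
                tools=tools,
            )
            message = response.choices[0].message

            # If the model requested tools, run them on the MCP server and send the results back.
            if message.tool_calls:
                messages.append(message)
                for call in message.tool_calls:
                    result = await session.call_tool(
                        call.function.name, json.loads(call.function.arguments)
                    )
                    messages.append(
                        {
                            "role": "tool",
                            "tool_call_id": call.id,
                            # Assumes the server returns text content blocks.
                            "content": result.content[0].text,
                        }
                    )
                response = await client.chat.completions.create(
                    model="gpt-4o-mini", messages=messages, tools=tools
                )
                message = response.choices[0].message

            print(message.content)


asyncio.run(main())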
Links:
https://modelcontextprotocol.io/quickstart/server
https://modelcontextprotocol.io/quickstart/client
https://platform.openai.com/docs/api-reference/chat/create
https://platform.openai.com/docs/api-reference/chat/object