Integrations
MCP Server with Docker
This project demonstrates how to integrate the Model Context Protocol (MCP) with OpenAI's API, enabling OpenAI to access and use tools exposed by an MCP server running in Docker.
Prerequisites
- Docker installed on your system
- Git (to clone the repository)
Project Structure
- `server.py`: The MCP server implementation with a tool
- `client.py`: A client that connects to the server and calls the agent
- `Dockerfile`: Instructions for building the Docker image
- `requirements.txt`: Python dependencies for the project
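As a rough sketch of how `server.py` might expose a tool (using `FastMCP` from the MCP Python SDK; the tool name and knowledge-base contents below are placeholders, not the project's actual code):

```python
# Minimal MCP server sketch: one tool, served over SSE on port 8050.
from mcp.server.fastmcp import FastMCP

# Bind to 0.0.0.0 so the server is reachable from outside the container
mcp = FastMCP("knowledge-base", host="0.0.0.0", port=8050)


@mcp.tool()
def get_knowledge_base() -> str:
    """Return internal company documents as plain text (placeholder content)."""
    return "Vacation policy: employees receive 25 paid vacation days per year."


if __name__ == "__main__":
    # Expose the server over SSE (see Notes below)
    mcp.run(transport="sse")
```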
Data Flow Explanation
- User Query: The user sends a query to the system (e.g., "What is our company's vacation policy?")
- OpenAI API: OpenAI receives the query and available tools from the MCP server
- Tool Selection: OpenAI decides which tools to use based on the query
- MCP Client: The client receives OpenAI's tool call request and forwards it to the MCP server
- MCP Server: The server executes the requested tool (e.g., retrieving knowledge base data)
- Response Flow: The tool result flows back through the MCP client to OpenAI
- Final Response: OpenAI generates a final response incorporating the tool data
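The sketch below shows one way a client could implement this flow with the `mcp` and `openai` packages. It is not the project's exact code: the model name (`gpt-4o`) and the assumption of a single knowledge-base tool are illustrative choices.

```python
# Bridges OpenAI tool calling to an MCP server over SSE (illustrative sketch).
import asyncio
import json

from mcp import ClientSession
from mcp.client.sse import sse_client
from openai import AsyncOpenAI

openai_client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment


async def run(query: str) -> str:
    # 1. Connect to the MCP server's SSE endpoint
    async with sse_client("http://localhost:8050/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 2. List the server's tools and convert them to OpenAI's tool format
            tools_result = await session.list_tools()
            openai_tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description,
                        "parameters": t.inputSchema,
                    },
                }
                for t in tools_result.tools
            ]

            # 3. Let OpenAI decide which tool (if any) to call
            messages = [{"role": "user", "content": query}]
            response = await openai_client.chat.completions.create(
                model="gpt-4o", messages=messages, tools=openai_tools
            )
            message = response.choices[0].message
            messages.append(message)

            # 4-5. Forward each tool call to the MCP server and collect results
            for tool_call in message.tool_calls or []:
                result = await session.call_tool(
                    tool_call.function.name,
                    arguments=json.loads(tool_call.function.arguments),
                )
                messages.append(
                    {
                        "role": "tool",
                        "tool_call_id": tool_call.id,
                        "content": result.content[0].text,
                    }
                )

            # 6-7. Send the tool results back to OpenAI for the final answer
            final = await openai_client.chat.completions.create(
                model="gpt-4o", messages=messages, tools=openai_tools
            )
            return final.choices[0].message.content


if __name__ == "__main__":
    print(asyncio.run(run("What is our company's vacation policy?")))
```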
Running with Docker
Step 1: Build the Docker image
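For example (the image tag `mcp-server` is just a placeholder name):

```bash
docker build -t mcp-server .
```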
Step 2: Run the Docker container
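For example, mapping container port 8050 to the same port on the host (again using the placeholder `mcp-server` tag):

```bash
docker run -p 8050:8050 mcp-server
```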
This will start the MCP server inside a Docker container and expose it on port 8050.
Running the Client
Once the server is running, you can run the client in a separate terminal:
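For example, assuming the dependencies from requirements.txt are installed locally:

```bash
pip install -r requirements.txt  # skip if the dependencies are already installed
python client.py
```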
The client will connect to the server, list available tools, and call the agent to answer the query.
Troubleshooting
If you encounter connection issues:
- Check if the server is running: Make sure the Docker container is running with `docker ps`.
- Verify port mapping: Ensure the port is correctly mapped with `docker ps` or by checking the output of the `docker run` command.
- Check server logs: View the server logs with `docker logs <container_id>` to see if there are any errors.
- Host binding: The server is configured to bind to `0.0.0.0` instead of `127.0.0.1` to make it accessible from outside the container. If you're still having issues, you might need to check your firewall settings.
- Network issues: If you're running Docker on a remote machine, make sure the port is accessible from your client machine.
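The first two checks can be run directly from the host, for example:

```bash
docker ps                    # container should be listed with a 0.0.0.0:8050->8050/tcp port mapping
docker logs <container_id>   # check the server output for startup errors
```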
Notes
- The server is configured to use SSE (Server-Sent Events) transport and listens on port 8050.
- The client connects to the server at `http://localhost:8050/sse`.
- Make sure the server is running before starting the client.