Provides containerized deployment options using Docker and Docker Compose for easy setup and operation of the MCP server.
Implements a streamable HTTP MCP server using FastAPI with Server-Sent Events (SSE) support for real-time communication and response streaming.
Integrates with Azure OpenAI GPT-4o for intelligent tool usage, enabling capabilities such as calculations, weather queries, and time lookups through tool calling.
Click on "Install Server".
Wait a few minutes for the server to deploy. Once ready, it will show a "Started" state.
In the chat, type @ followed by the MCP server name and your instructions, e.g., "@Streamable HTTP MCP Server calculate the total cost for 5 items at $12.99 each"
That's it! The server will respond to your query, and you can continue using it as needed.
Streamable HTTP MCP Server with Azure OpenAI GPT-4o
This project implements a Streamable HTTP MCP (Model Context Protocol) Server using FastAPI and integrates it with Azure OpenAI GPT-4o for intelligent tool usage.
Features
MCP Server: Streamable HTTP server with SSE support
Azure OpenAI Integration: GPT-4o with tool calling capabilities
Simple Tools: Calculator, Weather (mock), and Time tools
Docker Setup: Easy deployment with docker-compose
Real-time Communication: Server-Sent Events (SSE) for streaming responses
Related MCP server: Azure DevOps MCP Server
Prerequisites
Docker and Docker Compose
Azure OpenAI account with GPT-4o deployment
Python 3.11+ (for local development)
Quick Setup
Clone and setup:
Configure Azure OpenAI:
Start the MCP server:
Run the GPT-4o client:
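A minimal sketch of the four steps above, assuming the repository and client script names shown here (placeholders), an .env.example file in the repo, and a docker-compose.yml that defines the server service:

```bash
# 1. Clone and setup (repository URL and folder name are placeholders)
git clone https://github.com/<your-account>/streamable-http-mcp-server.git
cd streamable-http-mcp-server
cp .env.example .env

# 2. Configure Azure OpenAI: edit .env with your deployment details, e.g.
#    AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/
#    AZURE_OPENAI_API_KEY=<your-key>
#    AZURE_OPENAI_DEPLOYMENT=gpt-4o

# 3. Start the MCP server on port 8000
docker-compose up -d

# 4. Run the GPT-4o client (script name is a placeholder)
python gpt4o_client.py
```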
Usage Examples
Basic Tool Usage
The client automatically demonstrates various tool interactions:
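As an illustration of what such an interaction looks like, the sketch below sends a single calculator-style question to GPT-4o with a tool schema mirroring the server's calculator tool. It uses the standard openai SDK for Azure; the deployment name, API version, and exact tool schema are assumptions, and the actual client code may differ.

```python
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-06-01",
)

# Tool schema mirroring the MCP server's calculator tool (shape assumed).
tools = [
    {
        "type": "function",
        "function": {
            "name": "calculator",
            "description": "Evaluate a mathematical expression",
            "parameters": {
                "type": "object",
                "properties": {"expression": {"type": "string"}},
                "required": ["expression"],
            },
        },
    },
]

response = client.chat.completions.create(
    model="gpt-4o",  # your Azure deployment name
    messages=[{"role": "user", "content": "What is 15 * 23 + 7?"}],
    tools=tools,
)

# GPT-4o answers with a tool call; the client forwards it to the MCP server
# via POST /sse and feeds the tool result back in a follow-up message.
print(response.choices[0].message.tool_calls)
```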
Complex Multi-tool Usage
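A multi-tool request is a single prompt that GPT-4o resolves with several tool calls in sequence, for example (illustrative prompt, not taken from the client code):

```python
# One prompt, several tool calls: GPT-4o typically chains the time, weather,
# and calculator tools before returning one combined answer.
prompt = (
    "What time is it now, what's the weather in Tokyo, "
    "and what would 3 umbrellas at $14.50 each cost?"
)
```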
Available Tools
Calculator: Evaluate mathematical expressions
Weather: Get mock weather data for any location
Time: Get current timestamp
Architecture
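In outline, the pieces fit together as follows (simplified sketch based on the components described in this README):

```text
+----------------+   chat + tool-call decisions   +------------------+
|  GPT-4o client | <----------------------------> |  Azure OpenAI    |
|  (Python)      |                                |  GPT-4o          |
+----------------+                                +------------------+
        |
        |  HTTP + SSE (POST /sse)
        v
+----------------------+
|  FastAPI MCP server  | --> Calculator / Weather (mock) / Time tools
+----------------------+
```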
API Endpoints
POST /sse - Main MCP communication endpoint
GET /health - Health check
GET /tools - List available tools
Manual Testing
Test the MCP server directly:
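For example, using curl against the endpoints listed above (the JSON-RPC body assumes standard MCP tools/call messages and a lowercase "calculator" tool name; adjust to match the server's actual implementation):

```bash
# Health check
curl http://localhost:8000/health

# List the tools the server exposes
curl http://localhost:8000/tools

# Call a tool through the MCP endpoint (JSON-RPC over POST /sse, streamed via SSE)
curl -N -X POST http://localhost:8000/sse \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
       "params": {"name": "calculator", "arguments": {"expression": "5 * 12.99"}}}'
```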
Docker Commands
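Typical docker-compose commands for this setup:

```bash
docker-compose up -d          # build and start the MCP server in the background
docker-compose logs -f        # follow server logs
docker-compose ps             # check container status
docker-compose up -d --build  # rebuild after code changes
docker-compose down           # stop and remove the containers
```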
Development
Local Development
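To run the server outside Docker, something like the following works, assuming a requirements.txt in the repository and that the FastAPI app object lives in mcp_server.py (the mcp_server:app path is an assumption):

```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
uvicorn mcp_server:app --reload --port 8000
```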
Adding New Tools
Create a new tool class in mcp_server.py
Add the tool definition to the TOOLS dictionary
Add a handler in MCPHandler.handle_tools_call
Example:
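The exact shapes of the tool classes, the TOOLS dictionary, and MCPHandler are defined in mcp_server.py; a hypothetical "echo" tool following the three steps above might look roughly like this:

```python
# Hypothetical new tool; the real class and dictionary shapes are in mcp_server.py.

class EchoTool:
    """Return the input text unchanged."""

    @staticmethod
    def run(text: str) -> str:
        return text


# Register the tool so it appears in tools/list responses.
TOOLS["echo"] = {
    "name": "echo",
    "description": "Echo back the provided text",
    "inputSchema": {
        "type": "object",
        "properties": {"text": {"type": "string"}},
        "required": ["text"],
    },
}

# Inside MCPHandler.handle_tools_call, dispatch on the tool name:
#     if name == "echo":
#         result = EchoTool.run(arguments["text"])
```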
Troubleshooting
Common Issues
Connection refused: Make sure the MCP server is running on port 8000
Authentication errors: Check your Azure OpenAI credentials in .env
Tool call failures: Check the MCP server logs for detailed error messages
Debug Mode
Enable debug logging:
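One way to do this, assuming the server uses Python's standard logging module (alternatively, pass --log-level debug to uvicorn when running locally):

```python
import logging

# Raise the log level before the server starts handling requests.
logging.basicConfig(level=logging.DEBUG)
```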
Security Notes
Never commit your .env file with real credentials
Use environment variables in production
Consider adding authentication for production deployments
License
MIT License - feel free to use and modify as needed.