
Streamable HTTP MCP Server

by tevinric

Streamable HTTP MCP Server with Azure OpenAI GPT-4o

This project implements a Streamable HTTP MCP (Model Context Protocol) Server using FastAPI and integrates it with Azure OpenAI GPT-4o for intelligent tool usage.

🚀 Features

  • MCP Server: Streamable HTTP server with SSE support
  • Azure OpenAI Integration: GPT-4o with tool calling capabilities
  • Simple Tools: Calculator, Weather (mock), and Time tools
  • Docker Setup: Easy deployment with docker-compose
  • Real-time Communication: Server-Sent Events (SSE) for streaming responses

📋 Prerequisites

  • Docker and Docker Compose
  • Azure OpenAI account with GPT-4o deployment
  • Python 3.11+ (for local development)

🛠️ Quick Setup

  1. Clone and setup:

```shell
# Copy environment file
cp .env.example .env

# Edit .env with your Azure OpenAI credentials
nano .env
```

  2. Configure Azure OpenAI:

```
AZURE_OPENAI_API_KEY=your_api_key_here
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o
AZURE_OPENAI_API_VERSION=2024-02-01
```

  3. Start the MCP server:

```shell
./setup.sh start
```

  4. Run the GPT-4o client:

```shell
./setup.sh client
```
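The same requests the client script sends can also be issued programmatically. Below is a minimal sketch of building and posting a JSON-RPC request to the server's `/sse` endpoint (the endpoint and port come from the API Endpoints section below; the helper function names are illustrative, not part of the project):

```python
import json
import urllib.request


def make_jsonrpc_request(method: str, params: dict, request_id: str = "1") -> dict:
    """Build a JSON-RPC 2.0 request body as accepted by the MCP endpoint."""
    return {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}


def call_mcp(url: str, method: str, params: dict) -> bytes:
    """POST a JSON-RPC request to the MCP server and return the raw response body."""
    body = json.dumps(make_jsonrpc_request(method, params)).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


# Example (requires the server to be running):
# call_mcp("http://localhost:8000/sse", "tools/list", {})
```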

📖 Usage Examples

Basic Tool Usage

The client automatically demonstrates various tool interactions:

Query: What's the current time?
Response: The current time is 2024-01-15T14:30:45.123456

Query: Calculate 15 * 42 + 33
Response: The result is 663

Query: What's the weather like in New York?
Response: The weather in New York is currently sunny with a temperature of 22°C...

Complex Multi-tool Usage

Query: Can you get the weather for London and then calculate the percentage increase if the temperature was 20 degrees and now it's 25 degrees?
Response: The weather in London is currently 22°C and sunny... The percentage increase from 20°C to 25°C is 25%.
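The 25% figure in the response is the calculator tool applying the standard percentage-change formula, which is easy to verify:

```python
# Percentage increase = (new - old) / old * 100
old_temp, new_temp = 20, 25
percent_increase = (new_temp - old_temp) / old_temp * 100
print(f"{percent_increase:.0f}%")  # → 25%
```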

🔧 Available Tools

  1. Calculator: Evaluate mathematical expressions
  2. Weather: Get mock weather data for any location
  3. Time: Get current timestamp
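The actual tool implementations live in `mcp_server.py`. As a hedged illustration of the Calculator tool's job, an expression can be evaluated safely with Python's `ast` module rather than a raw `eval()` (the function name here is illustrative, not the project's):

```python
import ast
import operator

# Binary and unary operators supported by this illustrative calculator
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.Mod: operator.mod,
    ast.USub: operator.neg,
}


def safe_calculate(expression: str) -> float:
    """Evaluate a basic arithmetic expression without using eval()."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"Unsupported expression: {expression!r}")

    return _eval(ast.parse(expression, mode="eval"))


# Matches the usage example above
print(safe_calculate("15 * 42 + 33"))  # → 663
```

Restricting evaluation to a whitelist of AST node types avoids the code-execution risk of passing model-generated strings to `eval()`.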

🏗️ Architecture

```
[GPT-4o Client] <--HTTP--> [MCP Server] <--SSE--> [Tools]
                                                    |
                                              [Calculator]
                                                    |
                                               [Weather]
                                                    |
                                                 [Time]
```

🌐 API Endpoints

  • POST /sse - Main MCP communication endpoint
  • GET /health - Health check
  • GET /tools - List available tools

📝 Manual Testing

Test the MCP server directly:

```shell
# Health check
curl http://localhost:8000/health

# List tools
curl http://localhost:8000/tools

# Test SSE endpoint
curl -X POST http://localhost:8000/sse \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": "1", "method": "tools/list", "params": {}}'
```
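A `tools/list` call is typically followed by a `tools/call` request. Assuming the server follows the standard MCP `tools/call` shape, the body would look like the following (the tool and argument names mirror the Available Tools section, but check `mcp_server.py` for the exact schema):

```json
{
  "jsonrpc": "2.0",
  "id": "2",
  "method": "tools/call",
  "params": {
    "name": "calculator",
    "arguments": {"expression": "15 * 42 + 33"}
  }
}
```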

🐳 Docker Commands

```shell
# Start MCP server only
docker-compose up -d mcp-server

# Run client once
docker-compose run --rm client

# View logs
docker-compose logs -f mcp-server

# Stop everything
docker-compose down
```

🧪 Development

Local Development

```shell
# Install dependencies
pip install -r requirements.txt

# Run server locally
python mcp_server.py

# Run client locally (in another terminal)
source .env
python client.py
```

Adding New Tools

  1. Create a new tool class in mcp_server.py
  2. Add tool definition to TOOLS dictionary
  3. Add handler in MCPHandler.handle_tools_call

Example:

```python
class NewTool:
    @staticmethod
    def do_something(param: str) -> Dict[str, Any]:
        return {"result": f"Processed: {param}"}


# Add to TOOLS dictionary
TOOLS["new_tool"] = {
    "name": "new_tool",
    "description": "Does something useful",
    "inputSchema": {
        "type": "object",
        "properties": {
            "param": {"type": "string", "description": "Input parameter"}
        },
        "required": ["param"]
    }
}
```
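Step 3 wires the new tool into the server's dispatcher in `MCPHandler.handle_tools_call`. A hedged, self-contained sketch of that dispatch pattern (the registry and function shown here are illustrative; the project's actual handler lives in `mcp_server.py`):

```python
from typing import Any, Dict

# Illustrative registry mapping tool names to callables
HANDLERS = {
    "new_tool": lambda args: {"result": f"Processed: {args['param']}"},
}


def handle_tools_call(name: str, arguments: Dict[str, Any]) -> Dict[str, Any]:
    """Dispatch a tools/call request to the matching tool handler."""
    handler = HANDLERS.get(name)
    if handler is None:
        return {"error": f"Unknown tool: {name}"}
    return handler(arguments)


print(handle_tools_call("new_tool", {"param": "hello"}))  # → {'result': 'Processed: hello'}
```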

🐛 Troubleshooting

Common Issues

  1. Connection refused: Make sure MCP server is running on port 8000
  2. Authentication errors: Check your Azure OpenAI credentials in .env
  3. Tool call failures: Check MCP server logs for detailed error messages

Debug Mode

Follow the server logs for detailed debugging output:

```shell
docker-compose logs -f mcp-server
```

🔒 Security Notes

  • Never commit your .env file with real credentials
  • Use environment variables in production
  • Consider adding authentication for production deployments

📄 License

MIT License - feel free to use and modify as needed.
