# Complete Project Structure
```
mcp-server-project/
├── mcp_server.py # FastAPI MCP server with SSE support
├── client.py # Azure OpenAI GPT-4o client
├── requirements.txt # Python dependencies
├── Dockerfile # Docker container configuration
├── docker-compose.yml # Docker orchestration
├── .env.example # Environment variables template
├── setup.sh # Setup and utility script
└── README.md # Complete documentation
```
## 📂 File Overview
### Core Files
- **`mcp_server.py`**: Main MCP server implementing the MCP protocol with FastAPI and SSE
- **`client.py`**: GPT-4o client that connects to the MCP server for tool usage
- **`requirements.txt`**: All Python dependencies needed for the project
### Configuration Files
- **`Dockerfile`**: Containerizes the application for easy deployment
- **`docker-compose.yml`**: Orchestrates the MCP server and client services
- **`.env.example`**: Template for Azure OpenAI configuration
### Utility Files
- **`setup.sh`**: Convenience script for common operations
- **`README.md`**: Complete documentation and usage guide
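The heart of `mcp_server.py` is JSON-RPC message handling. A minimal, hypothetical sketch of that dispatch logic is shown below (the actual file wires this into FastAPI routes and an SSE stream; the tool name and schema here are illustrative, not copied from the project):

```python
# Hypothetical sketch of the server's JSON-RPC dispatch; the real
# mcp_server.py mounts equivalent logic behind FastAPI routes with SSE.
TOOLS = [
    {
        "name": "calculator",
        "description": "Evaluate a basic math expression",
        "inputSchema": {
            "type": "object",
            "properties": {"expression": {"type": "string"}},
            "required": ["expression"],
        },
    },
]

def handle_message(message: dict) -> dict:
    """Answer a JSON-RPC 2.0 request such as MCP's tools/list."""
    if message.get("method") == "tools/list":
        return {
            "jsonrpc": "2.0",
            "id": message.get("id"),
            "result": {"tools": TOOLS},
        }
    # Unknown methods get the standard JSON-RPC "method not found" error.
    return {
        "jsonrpc": "2.0",
        "id": message.get("id"),
        "error": {"code": -32601, "message": "Method not found"},
    }
```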
## 🚀 Quick Start Commands
```bash
# 1. Setup environment
cp .env.example .env
# Edit .env with your Azure OpenAI credentials
# 2. Start MCP server
./setup.sh start
# 3. Run GPT-4o client
./setup.sh client
# 4. Test the server
./setup.sh test
```
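Your `.env` will typically look something like the fragment below. The exact variable names are an assumption based on common Azure OpenAI client conventions; treat `.env.example` in the repository as the authoritative list:

```
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_API_KEY=your-api-key
AZURE_OPENAI_DEPLOYMENT=gpt-4o
AZURE_OPENAI_API_VERSION=2024-02-01
```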
## 🔄 How It Works
1. **MCP Server** (`mcp_server.py`) runs on port 8000 and provides:
- Calculator tool for math operations
- Weather tool for location-based data
- Time tool for current timestamps
2. **GPT-4o Client** (`client.py`) connects to Azure OpenAI and:
- Converts MCP tools to OpenAI function format
- Handles tool calls from GPT-4o
- Forwards requests to the MCP server via SSE
- Returns results to GPT-4o for the final response
3. **Communication Flow**:
```
User Question → GPT-4o → Tool Call → MCP Server → Tool Result → GPT-4o → Final Answer
```
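The first step of that flow — converting an MCP tool description into the OpenAI function-calling format — can be sketched as a small pure function. This is a hypothetical illustration of what `client.py` does, not a copy of its code; note that MCP names the schema field `inputSchema` while OpenAI expects `parameters`:

```python
# Hypothetical converter; the real client.py is not reproduced here.
def mcp_tool_to_openai(tool: dict) -> dict:
    """Convert one MCP tool description to OpenAI's function-calling format."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP calls this field "inputSchema"; OpenAI expects "parameters".
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }
```

The converted list is passed as the `tools` argument on each chat-completion call, so GPT-4o can decide when to emit a tool call.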
## 🛠️ Key Features
- **MCP Protocol Compliance**: Fully implements the MCP 2024-11-05 specification
- **Streaming Support**: Uses Server-Sent Events for real-time communication
- **Tool Integration**: Seamless tool calling between GPT-4o and MCP server
- **Docker Ready**: Complete containerization for easy deployment
- **Error Handling**: Comprehensive error handling and logging
- **Extensible**: Easy to add new tools and capabilities
This implementation provides a solid foundation for building more complex MCP-based applications with Azure OpenAI integration.
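To illustrate the extensibility point, adding a tool can be as simple as registering a handler. The decorator-based registry below is a hypothetical sketch — the actual registration mechanism in `mcp_server.py` may differ:

```python
# Hypothetical tool registry; mcp_server.py's actual wiring may differ.
from datetime import datetime, timezone

TOOL_HANDLERS = {}

def register_tool(name: str):
    """Decorator that records a handler in the tool registry by name."""
    def wrap(fn):
        TOOL_HANDLERS[name] = fn
        return fn
    return wrap

@register_tool("current_time")
def current_time(arguments: dict) -> str:
    # Return the current UTC timestamp in ISO 8601 form.
    return datetime.now(timezone.utc).isoformat()
```

A dispatch loop would then look up `TOOL_HANDLERS[name]` when a `tools/call` request arrives, keeping new tools a one-function change.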