# Universal Deployment Guide

## 🌍 Making Your MCP Available to Any LLM

This guide explains how your Humanitarian Negotiation MCP is **already universal** and how to deploy it for maximum reach.

---

## ✅ Current Status: Already Universal

Your MCP uses **FastMCP**, which implements the MCP standard maintained by Anthropic. This means it's **not tied to any specific LLM**. It can be used with:

- ✅ Claude Desktop
- ✅ Cline (VS Code)
- ✅ Continue IDE
- ✅ Cursor IDE
- ✅ Any HTTP client
- ✅ OpenAI, Anthropic, Mistral APIs
- ✅ Custom LLM applications

---

## 🚀 Deployment Options

### Option 1: MCP Standard (Already Configured)

**Pros:**

- Already set up
- Native integration with MCP clients
- Zero additional dependencies
- Works with all major IDEs

**Setup for new users:**

```json
{
  "mcpServers": {
    "humanitarian-negotiation": {
      "command": "python",
      "args": ["/path/to/humanitarian_negotiation_mcp.py"]
    }
  }
}
```

**Works with:**

- Claude Desktop
- Cline
- Continue
- Cursor
- Any MCP-compatible client

---

### Option 2: HTTP API (Recommended for Maximum Universality)

**Perfect for:**

- Web applications
- Mobile apps
- Cross-platform access
- Any programming language
- Cloud deployment

**Installation:**

```bash
# Install additional dependencies
pip install fastapi uvicorn

# Run HTTP server
python http_server.py
```

**Access points:**

- API: `http://localhost:8000`
- Docs: `http://localhost:8000/docs`
- Tools: `http://localhost:8000/tools`

**Usage examples:**

```bash
# cURL
curl -X POST http://localhost:8000/api/v1/island-of-agreement \
  -H "Content-Type: application/json" \
  -d '{"situation_description": "...", "organization_name": "...", "counterpart_name": "..."}'
```

```python
# Python
import requests
requests.post('http://localhost:8000/api/v1/island-of-agreement', json={...})
```

```javascript
// JavaScript
fetch('http://localhost:8000/api/v1/island-of-agreement', {
  method: 'POST',
  headers: {'Content-Type': 'application/json'},
  body: JSON.stringify({...})
})
```
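To see the `{...}` placeholder filled in, here is a minimal Python sketch of a complete call. The endpoint and the three field names come from the cURL example above; the field values are illustrative, and the exact response structure depends on `http_server.py`.

```python
# Sketch of a complete Python client call, assuming http_server.py is running
# locally on port 8000. Field names mirror the cURL example above; the values
# below are illustrative only, and the JSON response is printed as-is.
import requests

payload = {
    "situation_description": "Negotiating humanitarian access to a flood-affected district.",
    "organization_name": "Example Relief Organization",
    "counterpart_name": "Regional Authority",
}

response = requests.post(
    "http://localhost:8000/api/v1/island-of-agreement",
    json=payload,            # requests sets Content-Type: application/json
    timeout=30,
)
response.raise_for_status()  # fail loudly on HTTP errors
print(response.json())       # inspect the tool's output
```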
---

### Option 3: Docker Container

**Perfect for:**

- Production deployment
- Cloud services
- Consistent environments
- Team sharing

**Build:**

```bash
docker build -t humanitarian-mcp:1.0 .
```

**Run:**

```bash
# Single container
docker run -p 8000:8000 humanitarian-mcp:1.0

# With docker-compose
docker-compose up -d

# Access at http://localhost:8000
```

---

### Option 4: Python Package (PyPI)

**Makes it easiest for Python developers:**

```bash
# Install
pip install humanitarian-negotiation-mcp

# Or with HTTP support
pip install "humanitarian-negotiation-mcp[http]"

# Run
humanitarian-mcp        # MCP server
humanitarian-mcp-http   # HTTP server
```

**Currently:** Configured and ready - just needs to be published to PyPI.

---

### Option 5: Cloud Deployment

Deploy the HTTP server on any cloud platform:

**AWS Lambda (serverless):**

```bash
# Uses SAM or Zappa
pip install zappa
zappa init
zappa deploy production
```

**Google Cloud Run:**

```bash
gcloud run deploy humanitarian-mcp \
  --source . \
  --platform managed \
  --region us-central1
```

**Azure Container Instances:**

```bash
az container create \
  --resource-group myResourceGroup \
  --name humanitarian-mcp \
  --image humanitarian-mcp:1.0 \
  --ports 8000
```

**Heroku (free tier discontinued, but paid plans still work):**

```bash
git push heroku main
```

---

## 📋 How to Distribute Your MCP

### For Individual Users:

1. Share the entire directory
2. They configure it in their MCP client
3. Or they run the HTTP server

### For Organizations:

1. Publish on PyPI: `pip install humanitarian-negotiation-mcp`
2. Deploy the Docker image to an internal registry
3. Host the HTTP API on an internal server
4. Document in Confluence/Wiki

### For Open Source Community:

1. Create a GitHub repository
2. Publish to PyPI
3. Create a Docker image on Docker Hub
4. Share on MCP Hub (when launched)
5. Document in the GitHub README

---

## 🔧 Step-by-Step: From Local to Universal

### Step 1: Verify Local Installation ✅ (Done)

```bash
python humanitarian_negotiation_mcp.py
# Should print: Listening for MCP messages on stdio
```

### Step 2: Enable HTTP Access (10 minutes)

```bash
pip install fastapi uvicorn
python http_server.py
# Visit http://localhost:8000/docs
```

### Step 3: Containerize (15 minutes)

```bash
docker build -t humanitarian-mcp:1.0 .
docker run -p 8000:8000 humanitarian-mcp:1.0
```

### Step 4: Package for Distribution (20 minutes)

```bash
python -m pip install --upgrade build twine
python -m build
# Files in dist/ ready to upload
```

### Step 5: Publish (5 minutes)

```bash
# Test PyPI first
twine upload --repository testpypi dist/*

# Then PyPI
twine upload dist/*
```

### Step 6: Deploy to Cloud (varies)

Choose your platform and follow its deployment guide.

---

## 📊 Feature Comparison: How to Access

| Access Method | Setup Time | Complexity | Universality | Scalability |
|---------------|------------|------------|--------------|-------------|
| MCP Server    | Instant    | Very Low   | LLMs only    | Low         |
| HTTP API      | 5 min      | Low        | Universal    | High        |
| Docker        | 10 min     | Low        | Very High    | Very High   |
| PyPI Package  | 30 min     | Medium     | High         | High        |
| Cloud Deploy  | 30 min     | Medium     | Very High    | Very High   |

---

## 🎯 Recommended Deployment Path

### For Maximum Immediate Reach:

1. ✅ **Now**: MCP server (already done)
2. **This week**: HTTP API (10 min setup)
3. **Next month**: Docker image
4. **This quarter**: PyPI package + GitHub

### For Production Use:

1. HTTP API (proven and scalable)
2. Containerized (reproducible)
3. Cloud hosted (always available)
4. CDN for API responses (if needed)

### For Developer Community:

1. PyPI package (easy install)
2. GitHub repository (contributions)
3. Documentation (tutorials)
4. Examples (real-world usage)

---

## 📦 Files Created for Universal Access

| File | Purpose | Status |
|------|---------|--------|
| `humanitarian_negotiation_mcp.py` | Core MCP server | ✅ Complete |
| `http_server.py` | REST API wrapper | ✅ Complete |
| `Dockerfile` | Container image | ✅ Complete |
| `.dockerignore` | Container optimization | ✅ Complete |
| `docker-compose.yml` | Multi-service setup | ✅ Complete |
| `pyproject.toml` | Modern Python packaging | ✅ Complete |
| `setup_package.py` | Classic packaging | ✅ Complete |
| `INTEGRATION_EXAMPLES.md` | Usage examples | ✅ Complete |
| `DEPLOY_AS_UNIVERSAL_MCP.md` | Technical guide | ✅ Complete |

---

## 🚀 Quick Launch Commands

### Start MCP Server

```bash
python humanitarian_negotiation_mcp.py
```

### Start HTTP Server

```bash
pip install fastapi uvicorn
python http_server.py
```

### Start with Docker

```bash
docker-compose up -d
```

### Test Everything

```bash
# Terminal 1: Start server
python http_server.py

# Terminal 2: Test API
curl http://localhost:8000/health
curl http://localhost:8000/tools
```
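If you prefer a scripted check over the two curl commands, a minimal Python equivalent might look like the sketch below. It only assumes the `/health` and `/tools` endpoints listed above and a server on `localhost:8000`.

```python
# Minimal smoke test for the HTTP server, mirroring the curl commands above.
# Assumes the server started from http_server.py is listening on localhost:8000.
import sys
import requests

BASE_URL = "http://localhost:8000"

def check(path: str) -> bool:
    """Return True if GET BASE_URL + path answers with an HTTP 2xx status."""
    try:
        resp = requests.get(f"{BASE_URL}{path}", timeout=5)
        print(f"{path}: {resp.status_code}")
        return resp.ok
    except requests.RequestException as exc:
        print(f"{path}: unreachable ({exc})")
        return False

if __name__ == "__main__":
    ok = all([check("/health"), check("/tools")])
    sys.exit(0 if ok else 1)  # non-zero exit code signals a failed check
```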
---

## 📚 Documentation Files

- **README.md** - Core features and installation
- **QUICKSTART.md** - 5-minute quick start
- **EXAMPLES.md** - Real-world usage examples
- **DEPLOY_AS_UNIVERSAL_MCP.md** - Technical deployment details
- **INTEGRATION_EXAMPLES.md** - How to use with different clients
- **UNIVERSAL_DEPLOYMENT_GUIDE.md** - This file

---

## ✨ Your MCP is Already Universal

The key insight: **your MCP is not tied to any LLM.**

Because you're using FastMCP (the MCP standard), it works with:

- ✅ Any MCP-compatible client
- ✅ Any HTTP library
- ✅ Any programming language
- ✅ Any LLM that supports tools
- ✅ Web, mobile, desktop, and CLI apps

The deployment options above are just different **ways to distribute** and **expose** the same core functionality.

---

## 🎓 Next Steps

1. **Test HTTP Server** (5 min)

   ```bash
   python http_server.py
   # Visit http://localhost:8000/docs
   ```

2. **Build Docker Image** (10 min)

   ```bash
   docker build -t humanitarian-mcp:1.0 .
   ```

3. **Set Up PyPI Publishing** (if sharing publicly)

   ```bash
   # Update setup.py with your details, then:
   twine upload dist/*
   ```

4. **Deploy to Cloud** (optional, 30 min)
   - Choose a platform (AWS/GCP/Azure/etc.)
   - Follow the platform's deployment guide
   - Share the public URL with users

---

## 💡 Tips for Maximum Distribution

1. **Document thoroughly** - People need to know how to use it
2. **Provide examples** - Show real usage patterns (we did this!)
3. **Make it easy** - One-command setup (we did this!)
4. **Support multiple methods** - MCP, HTTP, Docker, PyPI (we did this!)
5. **Keep it maintained** - Update docs and dependencies regularly
6. **Get feedback** - Listen to users and improve

---

## 🤔 Common Questions

**Q: Is my MCP truly universal?**
A: Yes! It uses the standard MCP protocol, which is LLM-agnostic.

**Q: Do I need to do anything special?**
A: No, but the HTTP API makes it more accessible to non-technical users.

**Q: What's the best deployment method?**
A: HTTP API for maximum reach, Docker for consistency, PyPI for developers.

**Q: Can users run it locally?**
A: Yes, all deployment methods support local execution.

**Q: What if I want to add more features?**
A: Just edit `humanitarian_negotiation_mcp.py` - all deployment methods automatically pick up the change.
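As a rough illustration of what "just edit the file" means in practice, adding a tool with FastMCP generally follows the pattern below. The server variable name (`mcp`) and the `summarize_positions` tool are assumptions for this sketch, not code taken from `humanitarian_negotiation_mcp.py`.

```python
# Illustrative sketch only: how a new tool is typically registered with FastMCP.
# The instance name "mcp" and this example tool are hypothetical; the real
# names live in humanitarian_negotiation_mcp.py.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("humanitarian-negotiation")

@mcp.tool()
def summarize_positions(organization_position: str, counterpart_position: str) -> str:
    """Return a short side-by-side summary of the two stated positions."""
    return (
        f"Organization position: {organization_position}\n"
        f"Counterpart position: {counterpart_position}"
    )

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default, as in Option 1 above
```

Once the tool is defined in the core server file, the MCP, HTTP, Docker, and PyPI entry points all expose it without further changes.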
---

## 🎉 Conclusion

Your Humanitarian Negotiation MCP is:

- ✅ **Already universal** (uses standard MCP)
- ✅ **Ready for any LLM** (tools-based design)
- ✅ **Easy to distribute** (multiple deployment options)
- ✅ **Scalable** (HTTP API for production)
- ✅ **Well-documented** (comprehensive guides)

Now it's just a matter of **choosing how to share it** based on your audience and requirements!
