HDX MCP Server
A Model Context Protocol (MCP) server that gives AI assistants seamless access to the Humanitarian Data Exchange (HDX) API and the wide array of humanitarian data hosted on HDX.
This server automatically generates MCP tools from the HDX OpenAPI specification and includes custom tools for enhanced functionality.
Features
- 🚀 Automatic OpenAPI Integration: Automatically generates MCP tools from the HDX HAPI OpenAPI specification
- 🔧 Custom Tools: Additional hand-crafted tools for common HDX operations
- 🔐 Secure Authentication: Environment-based API key management with proper security practices
- 📡 Dual Transport Support: Both stdio and HTTP transports for local and remote usage
- 🏷️ Smart Categorization: Intelligent tagging and categorization of API endpoints
- 🛡️ Security Best Practices: Following MCP security guidelines with proper input validation
- ✅ Comprehensive Testing: Full test suite with unit and integration tests
- 📋 Extensible Design: Easy to add custom tools alongside auto-generated ones
Caveats
The HDX API offers a very rich source of humanitarian data, but it is complex. Data coverage differs per country, as shown in the table here. The server prompt instructs calling LLMs to check data coverage, but note that in this release aggregate queries such as "What's the population of country X?" or "How many conflict events were there last year in country Y?" will be challenging for countries that only have data at a granular level (e.g. Afghanistan conflict events). The LLM would have to sum this data itself, which may be error prone. To resolve this in future work, either the API can provide aggregate values or the data needs to be extracted and processed.
Quick Start
Get up and running in 2 minutes:
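The exact commands live in the repository, but a typical sequence looks roughly like this (the clone URL is a placeholder - substitute your own):

```bash
# Install uv if you don't already have it (see https://docs.astral.sh/uv/ for alternatives)
curl -LsSf https://astral.sh/uv/install.sh | sh

git clone <your-hdx-mcp-repository-url>
cd hdx-mcp2                   # or whatever directory you cloned into
uv sync                       # install dependencies into a managed virtual env
cp env.example .env           # then add your HDX_API_KEY to .env
uv run hdx-mcp-server         # start the server
```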
That's it! The server is now running and ready to serve HDX data via MCP.
Installation
Prerequisites
- Python 3.8 or later (CI covers 3.8-3.12)
- The uv package manager
- An HDX API key
- Docker (optional, for running the server in a container)
Setup Instructions
UV is a fast Python package manager that automatically handles virtual environments and dependency management. There is no need to manually create or activate virtual environments - uv does this automatically!
- Install UV (if not already installed):
- Clone the repository:
- Install dependencies:
- Configure your environment. Important environment variables:
  - HDX_API_KEY - Your HDX API key (required)
  - HDX_RATE_LIMIT_REQUESTS - Max requests per period (default: 10)
  - HDX_RATE_LIMIT_PERIOD - Rate limit period in seconds (default: 60.0)
  - HDX_TIMEOUT - Request timeout in seconds (default: 30.0)
Usage
Running the Server
Option 1: Using UV (Development)
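For local development the server is started directly with uv; this is the same invocation the Claude Desktop steps below rely on, and it is assumed to use the default stdio transport:

```bash
uv run hdx-mcp-server
```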
Option 2: Using Docker (Recommended for Claude Desktop)
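A containerised run might look like the following; the image name and Dockerfile location are assumptions, since the exact Docker instructions are not reproduced here:

```bash
docker build -t hdx-mcp-server .
docker run --rm -i --env-file .env hdx-mcp-server
```

The `-i` flag keeps stdin open, which stdio-based MCP clients such as Claude Desktop rely on.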
Command Line Options
Using with Claude Desktop
To use this HDX MCP server with Claude Desktop, you need to configure it in Claude's MCP settings.
Setup Steps
1. Complete the basic setup above - ensure the server works with uv run hdx-mcp-server.
2. Get the absolute path to your project.
3. Configure Claude Desktop: open Claude Desktop settings and add the server to your MCP servers configuration, using either Option A (Docker, recommended) or Option B (UV) - see the example configuration below. Replace /absolute/path/to/your/hdx-mcp2 with the actual path from step 2.
4. Ensure your .env file is configured with your HDX API key.
5. Restart Claude Desktop to load the new MCP server.
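One possible shape for the UV-based entry (Option B) is sketched below; the server name "hdx" is illustrative, and a Docker-based entry (Option A) would instead set "command": "docker" with the appropriate run arguments:

```json
{
  "mcpServers": {
    "hdx": {
      "command": "uv",
      "args": ["run", "hdx-mcp-server"],
      "cwd": "/absolute/path/to/your/hdx-mcp2"
    }
  }
}
```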
Verification
Once configured, you should see the HDX server appear in Claude's MCP servers list. You can test it by asking Claude to:
- "List available HDX tools"
- "Get information about Afghanistan using HDX data"
- "Search for refugee data in Syria"
Troubleshooting
If Claude Desktop can't connect to the server:
- Check the path - ensure the cwd path is correct and absolute
- Verify uv is available - make sure uv is installed and in your PATH
- Test standalone - verify uv run hdx-mcp-server works from the project directory
- Check logs - Claude Desktop logs may show connection errors
- Verify environment - ensure your .env file is in the project root with a valid HDX API key
Alternative: HTTP Transport
If stdio transport doesn't work, you can also run the server in HTTP mode and configure Claude to connect via HTTP:
- Start the server in HTTP mode (see the sketch below)
- Configure Claude Desktop for HTTP, pointing it at the server's host and port
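A rough sketch of the HTTP variant is shown below; the flag names are not documented in this README and may differ, and the host/port correspond to the MCP_HOST and MCP_PORT defaults from the configuration table:

```bash
# Hypothetical flags - check the server's --help output for the real ones
uv run hdx-mcp-server --transport http --host localhost --port 8000
```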
Available Tools
The server provides tools automatically generated from the HDX HAPI OpenAPI specification, plus custom tools for enhanced functionality.
Auto-Generated Tools (Examples)
All HDX HAPI endpoints are automatically converted to MCP tools:
- Metadata Tools: get_locations, get_admin1, get_admin2, get_organizations, etc.
- Affected People Tools: get_humanitarian_needs, get_idps, get_refugees, etc.
- Climate Tools: get_rainfall
- Coordination Tools: get_conflict_events, get_funding, get_operational_presence, etc.
- Food Security Tools: get_food_prices, get_food_security, get_poverty_rate
- Geography Tools: get_baseline_population
Custom Tools
hdx_server_info
Get information about the HDX MCP server instance.
Returns:
hdx_get_dataset_info
Get detailed information about a specific HDX dataset.
Parameters:
- dataset_hdx_id (string): The HDX dataset identifier
Example:
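A call to this tool takes arguments of roughly the following shape (the identifier is a placeholder):

```json
{
  "dataset_hdx_id": "<hdx-dataset-id>"
}
```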
hdx_search_locations
Search for locations (countries) in the HDX system.
Parameters:
- name_pattern (optional string): Pattern to match location names
- has_hrp (optional boolean): Filter for locations with Humanitarian Response Plans
Example:
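For instance, a client could invoke the tool with arguments like these (illustrative values only):

```json
{
  "name_pattern": "Sudan",
  "has_hrp": true
}
```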
Tool Categories and Tags
Tools are automatically categorized with relevant tags:
- Metadata Tools: metadata, reference
- Affected People Tools: affected-people, humanitarian
- Climate Tools: climate, environmental
- Coordination Tools: coordination, humanitarian
- Food Security Tools: food-security, nutrition, poverty
- Geography Tools: geography, infrastructure, population
- Utility Tools: utility, system

All tools also include the global tags: hdx, humanitarian, data
Configuration
Environment Variables
The server is configured via environment variables. Copy env.example to .env and customize:
| Variable | Required | Default | Description |
|---|---|---|---|
| HDX_API_KEY | Yes | - | Your HDX API key |
| HDX_BASE_URL | No | https://hapi.humdata.org/api/v2 | HDX API base URL |
| HDX_OPENAPI_URL | No | https://hapi.humdata.org/openapi.json | OpenAPI spec URL |
| HDX_TIMEOUT | No | 30.0 | HTTP request timeout (seconds) |
| HDX_APP_NAME | No | hdx-mcp-server | Application name for HDX |
| HDX_APP_EMAIL | No | assistant@example.com | Contact email for the HDX API user |
| MCP_HOST | No | localhost | Host for HTTP transport |
| MCP_PORT | No | 8000 | Port for HTTP transport |
Complete Configuration Example
See env.example for a complete configuration template with detailed comments:
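The values below mirror the table above plus the rate-limit variables from the setup section; the actual env.example shipped with the repository may contain additional comments or settings:

```bash
# Required
HDX_API_KEY=your-hdx-api-key-here

# Optional (defaults shown)
HDX_BASE_URL=https://hapi.humdata.org/api/v2
HDX_OPENAPI_URL=https://hapi.humdata.org/openapi.json
HDX_TIMEOUT=30.0
HDX_APP_NAME=hdx-mcp-server
HDX_APP_EMAIL=assistant@example.com
HDX_RATE_LIMIT_REQUESTS=10
HDX_RATE_LIMIT_PERIOD=60.0
MCP_HOST=localhost
MCP_PORT=8000
```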
Excluded Endpoints
The server automatically excludes the following endpoints:
- /api/v2/encode_app_identifier - Internal utility for generating app identifiers
Adding Custom Tools
To add your own tools alongside the auto-generated ones, modify the _register_custom_tools method in hdx_mcp_server.py:
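The snippet below is a minimal, standalone sketch of the FastMCP tool-registration pattern, not the repository's actual code: in this project the equivalent decorator call would sit inside _register_custom_tools, and the server instance and settings used here are assumptions for illustration.

```python
import os

from fastmcp import FastMCP  # assumption: this project builds on the FastMCP package

mcp = FastMCP("hdx-mcp-server")


@mcp.tool()
def hdx_config_summary() -> dict:
    """Illustrative custom tool: report the HDX settings the server is using."""
    return {
        "base_url": os.getenv("HDX_BASE_URL", "https://hapi.humdata.org/api/v2"),
        "app_name": os.getenv("HDX_APP_NAME", "hdx-mcp-server"),
        "timeout_seconds": float(os.getenv("HDX_TIMEOUT", "30.0")),
        "api_key_configured": bool(os.getenv("HDX_API_KEY")),
    }


if __name__ == "__main__":
    mcp.run()  # FastMCP serves over stdio by default
```

In the real server you would register the function inside _register_custom_tools so it lives alongside the auto-generated tools, and add a matching test in test_hdx_mcp_server.py.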
Development
Development Setup with UV
Development Commands
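Assuming the quality tools listed under Code Quality & CI/CD are declared as dev dependencies, day-to-day commands look roughly like this (exact targets may differ in this repository; see Running Tests below for test invocations):

```bash
uv sync                             # install runtime and development dependencies
uv run black . && uv run isort .    # format code and sort imports
uv run flake8                       # lint
uv run mypy .                       # type-check
uv run bandit -r .                  # scan for security issues
```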
Project Structure
Testing
Running Tests
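A sketch of the usual invocations, based on the test directories listed below:

```bash
uv run pytest                      # full suite
uv run pytest tests/unit           # unit tests only
uv run pytest tests/integration    # integration tests (require a valid HDX_API_KEY)
uv run pytest tests/security       # security tests
```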
Test Categories
- Unit Tests (tests/unit/): Test individual components in isolation
- Integration Tests (tests/integration/): Test interactions with external APIs
- Security Tests (tests/security/): Test security features and protections
Test Coverage
The test suite includes:
- Unit Tests: Server initialization, configuration, authentication
- Integration Tests: Real API interaction (when API key available)
- Security Tests: API key protection, input validation, error handling
- Edge Case Tests: Malformed specs, network errors, timeouts
Manual Testing
Test the server manually with curl (HTTP transport):
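If the server is running with the HTTP transport on the default host and port, a basic connectivity check could look like the following; the /mcp/ path and headers follow common FastMCP defaults and are assumptions, and a full MCP client would normally perform the initialize handshake before listing tools:

```bash
curl -X POST http://localhost:8000/mcp/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}'
```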
Security Considerations
This server follows MCP security best practices:
Authentication & Authorization
- ✅ API keys stored in environment variables, never hardcoded
- ✅ API keys not logged or exposed in error messages
- ✅ Proper HTTP client configuration with timeouts
- ✅ Base64 encoding for app identifiers as required by HDX
Input Validation
- ✅ FastMCP handles input schema validation automatically
- ✅ Custom tools include proper type hints and validation
- ✅ Error handling prevents information leakage
Network Security
- ✅ HTTPS-only communication with HDX API
- ✅ Configurable timeouts to prevent hanging connections
- ✅ Proper error handling for network failures
Data Handling
- ✅ No persistent data storage
- ✅ Proper cleanup of HTTP connections
- ✅ Graceful shutdown handling
Troubleshooting
Common Issues
- "Required environment variable HDX_API_KEY is not set"
- Ensure you have copied
.env.example
to.env
- Verify your HDX API key is correctly set in
.env
- Ensure you have copied
- "Failed to load OpenAPI specification"
- Check your internet connection
- Verify HDX API is accessible:
curl https://hapi.humdata.org/openapi.json
- HTTP transport not accessible.
- Check if the port is already in use
- Verify firewall settings if accessing remotely
- Use
--host 0.0.0.0
for external access
- Authentication errors
- Verify your HDX API key is valid
- Check if your account has necessary permissions
Logging
Enable verbose logging for debugging:
This will show detailed information about:
- Server initialization
- OpenAPI spec loading
- Tool registration
- HTTP requests and responses
- Error details
Contributing
Development Setup
- Install dependencies:
- Development workflow:
Code Quality & CI/CD
This project includes comprehensive quality controls:
Code Quality Tools
- Black: Code formatting
- isort: Import sorting
- flake8: Linting with security and complexity extensions
- mypy: Type checking
- bandit: Security vulnerability scanning
- pytest: Testing with coverage reporting
GitHub Actions
- CI Pipeline: Tests on Python 3.8-3.12
- Code Quality: Automated formatting, linting, and type checking
- Security Scanning: Vulnerability detection and security tests
- Integration Tests: Full API integration testing
Pre-commit Hooks
Automatically run quality checks before each commit:
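With the pre-commit tool installed as a dev dependency, the hooks are typically enabled and exercised like this (the uv run prefix is an assumption consistent with the rest of this README):

```bash
uv run pre-commit install          # install the git hook
uv run pre-commit run --all-files  # run every check across the repository
```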
See CONTRIBUTING.md for detailed development guidelines.
Adding New Endpoints
New HDX API endpoints are automatically included when the OpenAPI specification is updated. No code changes are required.
Adding Custom Tools
- Add your tool function in the
_register_custom_tools
method - Include proper type hints and documentation
- Add corresponding tests in
test_hdx_mcp_server.py
- Update this README with documentation for your tool
License
This project is licensed under the MIT License - see the LICENSE file for details.
Related Resources
- HDX (Humanitarian Data Exchange)
- HDX HAPI Documentation
- Model Context Protocol Specification
- FastMCP Documentation
Support
For issues related to:
- This MCP server: Open an issue in this repository
- HDX API: Consult HDX HAPI documentation
- MCP protocol: See MCP specification
- FastMCP library: Check FastMCP documentation
Note: This server requires a valid HDX API key. Please ensure you comply with HDX's terms of service and rate limiting guidelines when using this server.