HDX MCP Server

A Model Context Protocol (MCP) server that provides AI assistants with seamless access to the Humanitarian Data Exchange (HDX) API, exposing the wide array of humanitarian data hosted on the Humanitarian Data Exchange.

This server automatically generates MCP tools from the HDX OpenAPI specification and includes custom tools for enhanced functionality.

Features

  • 🚀 Automatic OpenAPI Integration: Automatically generates MCP tools from the HDX HAPI OpenAPI specification
  • 🔧 Custom Tools: Additional hand-crafted tools for common HDX operations
  • 🔐 Secure Authentication: Environment-based API key management with proper security practices
  • 📡 Dual Transport Support: Both stdio and HTTP transports for local and remote usage
  • 🏷️ Smart Categorization: Intelligent tagging and categorization of API endpoints
  • 🛡️ Security Best Practices: Following MCP security guidelines with proper input validation
  • ✅ Comprehensive Testing: Full test suite with unit and integration tests
  • 📋 Extensible Design: Easy to add custom tools alongside auto-generated ones

Caveats

The HDX API offers a very rich source of humanitarian data, but it is complex. Data coverage varies by country and region, as shown in the table here. The server prompt instructs calling LLMs to check data coverage, but note that in this release aggregate queries such as 'What's the population of country X' or 'How many conflict events were there last year in country Y' will be challenging for countries that only have data at a granular level (e.g. Afghanistan conflict events). The LLM would have to sum this data itself, which may be error-prone, as illustrated in the sketch below. To resolve this in future work, either the API can provide aggregate values or the data needs to be extracted and processed.
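
To make the aggregation issue concrete, the sketch below shows the kind of summing an LLM (or a downstream tool) would have to perform over granular rows. The field names and numbers are hypothetical and do not reflect the exact HDX HAPI schema.

from collections import defaultdict

# Hypothetical rows as they might come back from a granular conflict-events
# query (field names and values are illustrative only).
rows = [
    {"admin1_name": "Kabul", "month": "2023-01", "events": 12},
    {"admin1_name": "Kabul", "month": "2023-02", "events": 9},
    {"admin1_name": "Herat", "month": "2023-01", "events": 7},
]

# A country-level answer requires summing every granular row - a step the
# LLM must currently perform itself, which is where errors can creep in.
total_events = sum(row["events"] for row in rows)

# Per-admin1 subtotals, if a regional breakdown is wanted.
by_admin1 = defaultdict(int)
for row in rows:
    by_admin1[row["admin1_name"]] += row["events"]

print(total_events)     # 28
print(dict(by_admin1))  # {'Kabul': 21, 'Herat': 7}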

Quick Start

Get up and running in 2 minutes:

# 1. Install UV (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# 2. Clone and setup
git clone <repository-url>
cd hdx-mcp2
uv sync --all-extras --dev

# 3. Configure your HDX API key
cp env.example .env
nano .env  # Edit HDX_API_KEY and HDX_APP_EMAIL

# 4. Run the server
uv run hdx-mcp-server

That's it! The server is now running and ready to serve HDX data via MCP.

Installation

Prerequisites

  • Python 3.11 or higher
  • UV package manager
  • HDX API key (obtain from HDX)

Setup Instructions

UV is a fast Python package manager that automatically handles virtual environments and dependency management. No need to manually create or activate virtual environments - uv does this automatically!

  1. Install UV (if not already installed):
    # On macOS and Linux
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # On Windows
    powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
  2. Clone the repository:
    git clone <repository-url>
    cd hdx-mcp2
  3. Install dependencies:
    # Install all dependencies including development tools
    uv sync --all-extras

    # Or install only production dependencies
    uv sync
  4. Configure your environment:
    # Copy environment template
    cp env.example .env

    # Edit .env with your HDX API key and settings
    nano .env
    Important Environment Variables:
    • HDX_API_KEY - Your HDX API key (required)
    • HDX_RATE_LIMIT_REQUESTS - Max requests per period (default: 10)
    • HDX_RATE_LIMIT_PERIOD - Rate limit period in seconds (default: 60.0)
    • HDX_TIMEOUT - Request timeout in seconds (default: 30.0)

Usage

Running the Server

Option 1: Using UV (Development)
# Stdio transport (default) - for local MCP clients
uv run hdx-mcp-server

# HTTP transport - for remote access
uv run hdx-mcp-server --transport http

# With verbose logging
uv run hdx-mcp-server --verbose

# HTTP with custom configuration
uv run hdx-mcp-server --transport http --host 0.0.0.0 --port 9000 --verbose
Option 2: Using Docker
# Build the Docker image
docker build -t hdx-mcp-server .

# Run with stdio transport (for MCP clients)
docker run --rm -i --env-file .env hdx-mcp-server

# Run with HTTP transport
docker run --rm -p 8000:8000 --env-file .env hdx-mcp-server --transport http --host 0.0.0.0

# Using docker-compose (includes proper volume mounts)
docker-compose up

Command Line Options

usage: uv run hdx-mcp-server [-h] [--transport {stdio,http}] [--host HOST] [--port PORT] [--verbose]

HDX MCP Server - Humanitarian Data Exchange API Server

options:
  -h, --help            show this help message and exit
  --transport {stdio,http}
                        Transport method to use (default: stdio)
  --host HOST           Host to bind HTTP server to (default: localhost)
  --port PORT           Port to bind HTTP server to (default: 8000)
  --verbose, -v         Enable verbose logging

Examples:
  uv run hdx-mcp-server                                # Run with stdio transport
  uv run hdx-mcp-server --transport http               # Run with HTTP transport
  uv run hdx-mcp-server --transport http --port 9000   # HTTP on custom port

Using with Claude Desktop

To use this HDX MCP server with Claude Desktop, you need to configure it in Claude's MCP settings.

Setup Steps

  1. Complete the basic setup above - ensure the server works with uv run hdx-mcp-server
  2. Get the absolute path to your project:
    cd /path/to/your/hdx-mcp2
    pwd  # Copy this path
  3. Configure Claude Desktop: Open Claude Desktop settings and add one of the following to your MCP servers configuration.
    Option A: Using Docker (Recommended)
    { "hdx-humanitarian-data": { "command": "docker", "args": ["run", "--rm", "-i", "--env-file", "/absolute/path/to/your/hdx-mcp2/.env", "hdx-mcp-server"] } }
    Option B: Using UV
    { "hdx-humanitarian-data": { "command": "uv", "args": ["run", "hdx-mcp-server"], "cwd": "/absolute/path/to/your/hdx-mcp2" } }
    Replace /absolute/path/to/your/hdx-mcp2 with the actual path from step 2.
  4. Ensure your .env file is configured with your HDX API key:
    HDX_API_KEY=your_hdx_api_key_here
    HDX_BASE_URL=https://hapi.humdata.org
  5. Restart Claude Desktop to load the new MCP server

Verification

Once configured, you should see the HDX server appear in Claude's MCP servers list. You can test it by asking Claude to:

  • "List available HDX tools"
  • "Get information about Afghanistan using HDX data"
  • "Search for refugee data in Syria"

Troubleshooting

If Claude Desktop can't connect to the server:

  1. Check the path - ensure the cwd path is correct and absolute
  2. Verify uv is available - make sure uv is installed and in your PATH
  3. Test standalone - verify uv run hdx-mcp-server works from the project directory
  4. Check logs - Claude Desktop logs may show connection errors
  5. Verify environment - ensure your .env file is in the project root with valid HDX API key

Alternative: HTTP Transport

If stdio transport doesn't work, you can also run the server in HTTP mode and configure Claude to connect via HTTP:

  1. Start the server in HTTP mode:
    uv run hdx-mcp-server --transport http --host 127.0.0.1 --port 9001
  2. Configure Claude Desktop for HTTP:
    { "hdx-humanitarian-data": { "url": "http://127.0.0.1:9001/mcp" } }

Available Tools

The server provides tools automatically generated from the HDX HAPI OpenAPI specification, plus custom tools for enhanced functionality.

Auto-Generated Tools (Examples)

All HDX HAPI endpoints are automatically converted to MCP tools:

  • Metadata Tools: get_locations, get_admin1, get_admin2, get_organizations, etc.
  • Affected People Tools: get_humanitarian_needs, get_idps, get_refugees, etc.
  • Climate Tools: get_rainfall
  • Coordination Tools: get_conflict_events, get_funding, get_operational_presence, etc.
  • Food Security Tools: get_food_prices, get_food_security, get_poverty_rate
  • Geography Tools: get_baseline_population

Custom Tools

hdx_server_info

Get information about the HDX MCP server instance.

Returns:

{ "server_name": "HDX MCP Server", "version": "1.0.0", "base_url": "https://hapi.humdata.org/api/v2", "total_endpoints": 25, "available_tools": 28, "description": "MCP server for Humanitarian Data Exchange API" }
hdx_get_dataset_info

Get detailed information about a specific HDX dataset.

Parameters:

  • dataset_hdx_id (string): The HDX dataset identifier

Example:

{ "dataset_hdx_id": "dataset-123", "status": "success", "dataset": { "dataset_hdx_title": "Afghanistan - Baseline Population", "hdx_provider_name": "OCHA", "...": "..." } }
hdx_search_locations

Search for locations (countries) in the HDX system.

Parameters:

  • name_pattern (optional string): Pattern to match location names
  • has_hrp (optional boolean): Filter for locations with Humanitarian Response Plans

Example:

{ "status": "success", "locations": [ { "code": "AFG", "name": "Afghanistan", "has_hrp": true } ], "count": 1, "filters_applied": { "name_pattern": "Afghanistan", "has_hrp": true } }

Tool Categories and Tags

Tools are automatically categorized with relevant tags:

  • Metadata Tools: metadata, reference
  • Affected People Tools: affected-people, humanitarian
  • Climate Tools: climate, environmental
  • Coordination Tools: coordination, humanitarian
  • Food Security Tools: food-security, nutrition, poverty
  • Geography Tools: geography, infrastructure, population
  • Utility Tools: utility, system

All tools also include global tags: hdx, humanitarian, data

Configuration

Environment Variables

The server is configured via environment variables. Copy env.example to .env and customize:

Variable                | Required | Default                                | Description
HDX_API_KEY             | Yes      | -                                      | Your HDX API key
HDX_BASE_URL            | No       | https://hapi.humdata.org/api/v2        | HDX API base URL
HDX_OPENAPI_URL         | No       | https://hapi.humdata.org/openapi.json  | OpenAPI spec URL
HDX_TIMEOUT             | No       | 30.0                                   | HTTP request timeout (seconds)
HDX_RATE_LIMIT_REQUESTS | No       | 10                                     | Max requests per rate-limit period
HDX_RATE_LIMIT_PERIOD   | No       | 60.0                                   | Rate limit period (seconds)
HDX_APP_NAME            | No       | hdx-mcp-server                         | Application name for HDX
HDX_APP_EMAIL           | No       | assistant@example.com                  | Contact email for the HDX API user
MCP_HOST                | No       | localhost                              | Host for HTTP transport
MCP_PORT                | No       | 8000                                   | Port for HTTP transport

Complete Configuration Example

See env.example for a complete configuration template with detailed comments:

# View the configuration template
cat env.example

# Copy and edit for your environment
cp env.example .env
nano .env
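
At startup the server reads these values from the environment. The snippet below is only a sketch of how such values might be consumed, assuming python-dotenv; it is not the server's actual implementation.

import os
from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # pulls values from .env into the process environment

api_key = os.environ["HDX_API_KEY"]  # required - raises KeyError if missing
base_url = os.getenv("HDX_BASE_URL", "https://hapi.humdata.org/api/v2")
timeout = float(os.getenv("HDX_TIMEOUT", "30.0"))
app_name = os.getenv("HDX_APP_NAME", "hdx-mcp-server")
app_email = os.getenv("HDX_APP_EMAIL", "assistant@example.com")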

Excluded Endpoints

The server automatically excludes the following endpoints:

  • /api/v2/encode_app_identifier - Internal utility for generating app identifiers

Adding Custom Tools

To add your own tools alongside the auto-generated ones, modify the _register_custom_tools method in hdx_mcp_server.py:

def _register_custom_tools(self):
    """Register custom tools alongside the OpenAPI-derived ones."""

    @self.mcp.tool("my_custom_tool")
    async def my_custom_tool(param1: str, param2: int = 10) -> Dict[str, Any]:
        """
        Description of my custom tool.

        Args:
            param1: Description of parameter 1
            param2: Description of parameter 2 with default value

        Returns:
            Dictionary containing the results
        """
        # Your custom tool implementation
        return {"result": "success", "data": param1}
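
As a further illustration, tied to the aggregation caveat above, a custom tool could pre-compute country-level totals so the calling LLM does not have to sum granular rows itself. The sketch below is hypothetical: it assumes the server exposes an httpx.AsyncClient as self.http_client, and the endpoint path and the 'events' field name are assumptions that should be verified against the HDX HAPI OpenAPI specification.

from typing import Any, Dict

def _register_custom_tools(self):
    """Register custom tools alongside the OpenAPI-derived ones."""

    @self.mcp.tool("hdx_total_conflict_events")
    async def hdx_total_conflict_events(location_code: str, year: int) -> Dict[str, Any]:
        """Sum granular conflict-event rows into a single country-level total.

        Hypothetical sketch: self.http_client, the endpoint path and the
        'events' field are assumptions, not the project's actual code.
        """
        response = await self.http_client.get(
            "/coordination-context/conflict-events",
            params={"location_code": location_code, "year": year},
        )
        response.raise_for_status()
        rows = response.json().get("data", [])
        return {
            "location_code": location_code,
            "year": year,
            "total_events": sum(row.get("events", 0) for row in rows),
            "row_count": len(rows),
        }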

Development

Development Setup with UV

# Install all dependencies including development tools
uv sync --all-extras --dev

# Install production dependencies only
uv sync

# Install pre-commit hooks (recommended)
uv run pre-commit install

Development Commands

# Run tests
uv run pytest

# Run tests with coverage
uv run pytest --cov=src --cov-report=html --cov-report=term-missing

# Run specific test categories
uv run pytest tests/unit/         # Unit tests
uv run pytest tests/integration/  # Integration tests
uv run pytest tests/security/     # Security tests

# Code quality checks
uv run black src/ tests/    # Format code
uv run isort src/ tests/    # Sort imports
uv run flake8 src/ tests/   # Lint code
uv run mypy src/            # Type checking
uv run bandit -r src/       # Security scanning

# Run all quality checks
uv run pre-commit run --all-files

# Clean cache files
find . -type d -name "__pycache__" -exec rm -rf {} +
find . -type f -name "*.pyc" -delete
rm -rf .pytest_cache .mypy_cache .coverage htmlcov/

Project Structure

hdx-mcp-server/
├── src/hdx_mcp_server/     # Main source code
│   ├── __init__.py         # Package initialization
│   ├── __main__.py         # CLI entry point
│   └── server.py           # Main server implementation
├── tests/                  # Test suite
│   ├── unit/               # Unit tests
│   ├── integration/        # Integration tests
│   ├── security/           # Security tests
│   └── conftest.py         # Shared test fixtures
├── .github/workflows/      # GitHub Actions CI/CD
└── pyproject.toml          # Project configuration

Testing

Running Tests
# All tests
uv run pytest

# Specific test categories
uv run pytest tests/unit/ -v         # Unit tests only
uv run pytest tests/integration/ -v  # Integration tests only
uv run pytest tests/security/ -v     # Security tests only

# With coverage reporting
uv run pytest --cov=src --cov-report=html
Test Categories
  • Unit Tests (tests/unit/): Test individual components in isolation
  • Integration Tests (tests/integration/): Test interactions with external APIs
  • Security Tests (tests/security/): Test security features and protections

Test Coverage

The test suite includes:

  • Unit Tests: Server initialization, configuration, authentication
  • Integration Tests: Real API interaction (when API key available)
  • Security Tests: API key protection, input validation, error handling
  • Edge Case Tests: Malformed specs, network errors, timeouts
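
As an illustration of the unit-test style, a test for the missing-API-key behaviour described under Troubleshooting might look roughly like the sketch below. The import path, class name, and exception type are assumptions and should be checked against the actual source before reuse.

import pytest

def test_missing_api_key_is_rejected(monkeypatch):
    """The server should refuse to start when HDX_API_KEY is not set.

    Hypothetical sketch: module path, class name, and exact exception
    type are assumptions, not the project's real test code.
    """
    monkeypatch.delenv("HDX_API_KEY", raising=False)

    from hdx_mcp_server.server import HdxMcpServer  # assumed entry point

    with pytest.raises(Exception, match="HDX_API_KEY"):
        HdxMcpServer()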

Manual Testing

Test the server manually with curl (HTTP transport):

# Start server
uv run hdx-mcp-server --transport http

# Test server info tool
curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -d '{"method": "tools/call", "params": {"name": "hdx_server_info", "arguments": {}}}'

Security Considerations

This server follows MCP security best practices:

Authentication & Authorization

  • ✅ API keys stored in environment variables, never hardcoded
  • ✅ API keys not logged or exposed in error messages
  • ✅ Proper HTTP client configuration with timeouts
  • ✅ Base64 encoding for app identifiers as required by HDX
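
On the last point, HDX identifies calling applications with a base64-encoded app identifier (the excluded /api/v2/encode_app_identifier endpoint performs the same job server-side). A minimal sketch is shown below; the exact "name:email" format is an assumption and should be verified against the HDX HAPI documentation.

import base64

# Assumed format "application_name:email" - verify against the HDX HAPI docs.
app_name = "hdx-mcp-server"
app_email = "assistant@example.com"

app_identifier = base64.b64encode(f"{app_name}:{app_email}".encode("utf-8")).decode("ascii")
print(app_identifier)  # passed to the API as the app identifier parameter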

Input Validation

  • ✅ FastMCP handles input schema validation automatically
  • ✅ Custom tools include proper type hints and validation
  • ✅ Error handling prevents information leakage

Network Security

  • ✅ HTTPS-only communication with HDX API
  • ✅ Configurable timeouts to prevent hanging connections
  • ✅ Proper error handling for network failures

Data Handling

  • ✅ No persistent data storage
  • ✅ Proper cleanup of HTTP connections
  • ✅ Graceful shutdown handling

Troubleshooting

Common Issues

  1. "Required environment variable HDX_API_KEY is not set"
    • Ensure you have copied env.example to .env
    • Verify your HDX API key is correctly set in .env
  2. "Failed to load OpenAPI specification"
    • Check your internet connection
    • Verify HDX API is accessible: curl https://hapi.humdata.org/openapi.json
  3. HTTP transport not accessible
    • Check if the port is already in use
    • Verify firewall settings if accessing remotely
    • Use --host 0.0.0.0 for external access
  4. Authentication errors
    • Verify your HDX API key is valid
    • Check if your account has necessary permissions

Logging

Enable verbose logging for debugging:

uv run hdx-mcp-server --verbose

This will show detailed information about:

  • Server initialization
  • OpenAPI spec loading
  • Tool registration
  • HTTP requests and responses
  • Error details

Contributing

Development Setup

  1. Install dependencies:
    uv sync --all-extras        # Install all dependencies including dev tools
    uv run pre-commit install   # Install pre-commit hooks
  2. Development workflow:
    uv run black src/ tests/            # Format code
    uv run flake8 src/ tests/           # Lint code
    uv run pytest                       # Run tests
    uv run pre-commit run --all-files   # Run all checks

Code Quality & CI/CD

This project includes comprehensive quality controls:

Code Quality Tools
  • Black: Code formatting
  • isort: Import sorting
  • flake8: Linting with security and complexity extensions
  • mypy: Type checking
  • bandit: Security vulnerability scanning
  • pytest: Testing with coverage reporting
GitHub Actions
  • CI Pipeline: Tests on Python 3.8-3.12
  • Code Quality: Automated formatting, linting, and type checking
  • Security Scanning: Vulnerability detection and security tests
  • Integration Tests: Full API integration testing
Pre-commit Hooks

Automatically run quality checks before each commit:

uv run pre-commit install   # One-time setup
git commit                  # Hooks run automatically

See CONTRIBUTING.md for detailed development guidelines.

Adding New Endpoints

New HDX API endpoints are automatically included when the OpenAPI specification is updated. No code changes are required.
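
For reference, FastMCP can build a server directly from an OpenAPI document, which is the general pattern behind this auto-generation. The snippet below is a minimal sketch of that pattern, not necessarily identical to this project's implementation.

import httpx
from fastmcp import FastMCP

# Fetch the HDX HAPI OpenAPI specification.
spec = httpx.get("https://hapi.humdata.org/openapi.json", timeout=30.0).json()

# HTTP client used by the generated tools when they call the live API.
client = httpx.AsyncClient(base_url="https://hapi.humdata.org/api/v2", timeout=30.0)

# Every path in the spec becomes an MCP tool; updating the spec updates the tools.
mcp = FastMCP.from_openapi(openapi_spec=spec, client=client, name="HDX MCP Server")

if __name__ == "__main__":
    mcp.run()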

Adding Custom Tools

  1. Add your tool function in the _register_custom_tools method
  2. Include proper type hints and documentation
  3. Add corresponding tests in test_hdx_mcp_server.py
  4. Update this README with documentation for your tool

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For issues related to:


Note: This server requires a valid HDX API key. Please ensure you comply with HDX's terms of service and rate limiting guidelines when using this server.
