Druid MCP Server
A comprehensive Model Context Protocol (MCP) server for Apache Druid that provides extensive tools, resources, and prompts for managing and analyzing Druid clusters.
Developed by iunera
Overview
This MCP server implements a feature-based architecture where each package represents a distinct functional area of Druid management. The server provides three main types of MCP components:
Tools - Executable functions for performing operations
Resources - Data providers for accessing information
Prompts - AI-assisted guidance templates
Video Walkthrough
Learn how to integrate AI agents with Apache Druid using the MCP server. This tutorial demonstrates time series data exploration, statistical analysis, and data ingestion using natural language with AI assistants like Claude, ChatGPT, and Gemini.
The walkthrough video is available on YouTube.
Features
Spring AI MCP Server integration
Tool-based Architecture: Complete MCP protocol compliance with automatic JSON schema generation
Multiple Transport Modes: STDIO, SSE, and Streamable HTTP support
Real-time Communication: Server-Sent Events with streaming capabilities
Customizable Prompt Templates: AI-assisted guidance with template customization
Comprehensive Error Handling: Graceful error handling with meaningful responses
Architecture & Organization
Feature-based Package Organization: Each package represents a distinct Druid management area
Auto-discovery: Automatic registration of tools, resources, and prompts via annotations
Enterprise Ready: Production-grade configuration and security features
MCP Inspector Interface
When connected to an MCP client, you can inspect the available tools, resources, and prompts through the MCP inspector interface:
Available Tools
The tools interface shows all available Druid management functions organized by feature areas including data management, ingestion management, and monitoring & health.
Available Resources
The resources interface displays all accessible Druid data sources and metadata that can be retrieved through the MCP protocol.
Available Prompts
The prompts interface shows all AI-assisted guidance templates available for various Druid management tasks and data analysis workflows.
Quick Start
MCP Configuration for LLMs
A ready-to-use MCP configuration file is provided at mcp-servers-config.json that can be used with LLM clients to connect to this Druid MCP server.
Examples
The configuration includes the following transport options:
STDIO: STDIO-based connection via the command line.
SSE: HTTP-based streaming connection via Server-Sent Events.
Streamable HTTP: Modern single-endpoint HTTP transport per MCP 2025-06-18.
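As an illustration, a minimal STDIO entry in mcp-servers-config.json might look like the following sketch (the JAR path and server name are assumptions; adjust them to your build output):

```json
{
  "mcpServers": {
    "druid-mcp-server": {
      "command": "java",
      "args": ["-jar", "target/druid-mcp-server-1.0.0.jar"],
      "env": {
        "DRUID_ROUTER_URL": "http://localhost:8888"
      }
    }
  }
}
```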
Docker examples using environment variables:
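For instance, a docker run invocation wiring up the supported environment variables might look like this sketch (the image name and tag are assumptions; use the image actually published for this project, and replace the placeholder credentials):

```shell
docker run -d \
  -p 8080:8080 \
  -e DRUID_ROUTER_URL="http://host.docker.internal:8888" \
  -e DRUID_AUTH_USERNAME="admin" \
  -e DRUID_AUTH_PASSWORD="secret" \
  iunera/druid-mcp-server:latest
```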
Prerequisites
Java 24
Maven 3.6+
Apache Druid cluster running with router on port 8888
Build and Run
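A typical build-and-run sequence, assuming a standard Maven layout (the exact JAR name depends on the project version):

```shell
# Build (requires Java 24 and Maven 3.6+)
mvn clean package

# Run (exact JAR name depends on the project version)
java -jar target/druid-mcp-server-*.jar
```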
The server will start on port 8080 by default.
For detailed build instructions, testing, Docker setup, and development guidelines, see development.md.
Installation from Maven Central
If you prefer to use the pre-built JAR without building from source, you can download and run it directly from Maven Central.
Prerequisites
Java 24 JRE only
Download and Run
Download the JAR from Maven Central https://repo.maven.apache.org/maven2/com/iunera/druid-mcp-server/
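For example, with a placeholder version of 1.0.0 (check Maven Central for the latest release before downloading):

```shell
# Replace 1.0.0 with the latest version listed on Maven Central
curl -O https://repo.maven.apache.org/maven2/com/iunera/druid-mcp-server/1.0.0/druid-mcp-server-1.0.0.jar
java -jar druid-mcp-server-1.0.0.jar
```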
For Developers
For detailed development information including build instructions, testing guidelines, architecture details, and contributing guidelines, see development.md.
Available Tools by Feature
The MCP server auto-discovers all tools via annotations. In Read-only mode, any tool that would modify the Druid cluster is not registered and will not appear in the MCP client. The lists below reflect the current implementation.
Data Management
| Feature | Tool | Description | Parameters |
|---------|------|-------------|------------|
| Datasource | | List all available Druid datasource names | None |
| Datasource | | Show detailed information for a specific datasource, including column information | (String) |
| Datasource | killDatasource | Kill a datasource permanently, removing all data and metadata | (String), (String) |
| Lookup | | List all available Druid lookups from the coordinator | None |
| Lookup | | Get configuration for a specific lookup | (String), (String) |
| Lookup | createOrUpdateLookup | Update configuration for a specific lookup | (String), (String), (String) |
| Segments | | List all segments across all datasources | None |
| Segments | | Get metadata for specific segments | (String), (String) |
| Segments | | Get all segments for a specific datasource | (String) |
| Query | queryDruidSql | Execute a SQL query against Druid datasources | (String) |
| Retention | | View retention rules for all datasources or a specific one | (String, optional) |
| Retention | editRetentionRulesForDatasource | Update retention rules for a datasource | (String), (String) |
| Compaction | | View compaction configurations for all datasources | None |
| Compaction | | View compaction configuration for a specific datasource | (String) |
| Compaction | editCompactionConfigForDatasource | Edit compaction configuration for a datasource | (String), (String) |
| Compaction | deleteCompactionConfigForDatasource | Delete compaction configuration for a datasource | (String) |
| Compaction | | View compaction status for all datasources | None |
| Compaction | | View compaction status for a specific datasource | (String) |
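For example, the SQL query tool accepts standard Druid SQL. Assuming a datasource named wikipedia (an illustrative name), a query might look like:

```sql
SELECT channel, COUNT(*) AS edits
FROM wikipedia
WHERE __time >= CURRENT_TIMESTAMP - INTERVAL '1' DAY
GROUP BY channel
ORDER BY edits DESC
LIMIT 10
```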
Ingestion Management
| Feature | Tool | Description | Parameters |
|---------|------|-------------|------------|
| Ingestion Spec | createBatchIngestionTemplate | Create a batch ingestion template | (String), (String), (String) |
| Ingestion Spec | createIngestionSpec | Create and submit an ingestion specification | (String) |
| Supervisors | | List all streaming ingestion supervisors | None |
| Supervisors | | Get status of a specific supervisor | (String) |
| Supervisors | suspendSupervisor | Suspend a streaming supervisor | (String) |
| Supervisors | startSupervisor | Start or resume a streaming supervisor | (String) |
| Supervisors | terminateSupervisor | Terminate a streaming supervisor | (String) |
| Tasks | | List all ingestion tasks | None |
| Tasks | | Get status of a specific task | (String) |
| Tasks | killTask | Shutdown a running task | (String) |
Monitoring & Health
| Feature | Tool | Description | Parameters |
|---------|------|-------------|------------|
| Basic Health | | Check overall cluster health status | None |
| Basic Health | | Get status of specific Druid services | (String) |
| Basic Health | | Get cluster configuration information | None |
| Diagnostics | | Run comprehensive cluster diagnostics | None |
| Diagnostics | | Analyze cluster performance issues | None |
| Diagnostics | | Generate detailed health report | None |
| Functionality | | Test query functionality across services | None |
| Functionality | | Test ingestion functionality | None |
| Functionality | | Validate connectivity between cluster components | None |
Available Resources by Feature
| Feature | Resource URI Pattern | Description | Parameters |
|---------|----------------------|-------------|------------|
| Datasource | | Access datasource information and metadata | (String) |
| Datasource | | Access detailed datasource information including schema | (String) |
| Lookup | | Access lookup configuration and data | (String), (String) |
| Segments | | Access segment metadata and information | (String) |
Available Prompts by Feature
| Feature | Prompt Name | Description | Parameters |
|---------|-------------|-------------|------------|
| Data Analysis | | Guide for exploring data in Druid datasources | (String, optional) |
| Data Analysis | | Help optimize Druid SQL queries for better performance | (String) |
| Cluster Management | | Comprehensive cluster health assessment guidance | None |
| Cluster Management | | Overview and analysis of cluster status | None |
| Ingestion Management | | Troubleshoot ingestion issues | (String, optional) |
| Ingestion Management | | Guide for setting up new ingestion pipelines | (String, optional) |
| Retention Management | | Manage data retention policies | (String, optional) |
| Compaction | | Optimize segment compaction configuration | (String, optional), (String, optional), (String, optional) |
| Compaction | | Troubleshoot compaction issues | (String), (String, optional) |
| Operations | | Emergency response procedures and guidance | None |
| Operations | | Cluster maintenance procedures | None |
Environment Variables Configuration
For sensitive credentials like username and password, you can use environment variables instead of hardcoding them in properties files.
Supported Environment Variables
DRUID_AUTH_USERNAME: Druid authentication username
DRUID_AUTH_PASSWORD: Druid authentication password
DRUID_ROUTER_URL: Override the default Druid router URL
DRUID_SSL_ENABLED: Enable SSL/TLS support (true/false)
DRUID_SSL_SKIP_VERIFICATION: Skip SSL certificate verification (true/false)
SSL-Encrypted Cluster with Authentication
This section provides comprehensive guidance on connecting to SSL-encrypted Druid clusters with username and password authentication.
Prerequisites
SSL-enabled Druid cluster with HTTPS endpoints
Valid username and password credentials for Druid authentication
SSL certificates properly configured (or ability to skip verification for testing)
Configuration Methods
Method 1: Environment Variables (Recommended for Production)
Set the following environment variables before starting the MCP server:
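For example (all values here are placeholders; substitute your own credentials and router URL):

```shell
export DRUID_AUTH_USERNAME="admin"
export DRUID_AUTH_PASSWORD="secret"
export DRUID_ROUTER_URL="https://druid.example.com:8888"
export DRUID_SSL_ENABLED="true"
export DRUID_SSL_SKIP_VERIFICATION="false"
```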
Method 2: Runtime System Properties
Pass configuration as JVM system properties:
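For example, a sketch using the property names documented below (druid.auth.username and druid.auth.password are documented; druid.router.url is an assumption following the same naming scheme, and the values are placeholders):

```shell
java -Ddruid.auth.username=admin \
     -Ddruid.auth.password=secret \
     -Ddruid.router.url=https://druid.example.com:8888 \
     -jar druid-mcp-server.jar
```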
SSL Configuration Options
Production SSL Setup
For production environments with valid SSL certificates:
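A production-style configuration might look like this (the router URL is a placeholder):

```shell
export DRUID_SSL_ENABLED="true"
export DRUID_SSL_SKIP_VERIFICATION="false"
export DRUID_ROUTER_URL="https://druid.example.com:8888"
```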
The server will use the system's default truststore to validate SSL certificates.
Authentication Methods
The MCP server supports HTTP Basic Authentication with username and password:
Username: set via DRUID_AUTH_USERNAME or druid.auth.username
Password: set via DRUID_AUTH_PASSWORD or druid.auth.password
The credentials are automatically Base64-encoded and sent with each request in an Authorization: Basic <credentials> header.
MCP Client Configuration with SSL
Update your mcp-servers-config.json to include environment variables:
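For example, a sketch of an SSL-enabled entry (the server name, JAR path, and values are assumptions; substitute your own):

```json
{
  "mcpServers": {
    "druid-mcp-server": {
      "command": "java",
      "args": ["-jar", "druid-mcp-server.jar"],
      "env": {
        "DRUID_ROUTER_URL": "https://druid.example.com:8888",
        "DRUID_AUTH_USERNAME": "admin",
        "DRUID_AUTH_PASSWORD": "secret",
        "DRUID_SSL_ENABLED": "true"
      }
    }
  }
}
```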
MCP Prompt Customization
The server provides extensive prompt customization capabilities through the prompts.properties file located in src/main/resources/.
Prompt Configuration Structure
The prompts.properties file contains:
Global Settings: Enable/disable prompts and set watermarks
Feature Toggles: Control which prompts are available
Custom Variables: Organization-specific information
Template Definitions: Full prompt templates for each feature
Overriding Prompts
You can override any prompt template using Java system properties with the -D flag:
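For illustration only, since the real template keys are defined in prompts.properties (the key name below is hypothetical; look up the actual key before using it):

```shell
# Illustrative key name; the real keys are listed in prompts.properties
java -D"prompt.druid-data-exploration.template=Explore the datasource with a focus on recent data" \
     -jar druid-mcp-server.jar
```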
Method 1: System Properties (Runtime Override)
Method 2: Custom Properties File
Create a custom properties file (e.g., custom-prompts.properties):
Load it at runtime:
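One way to load it, sketched here with Spring Boot's standard additional-location mechanism (an assumption; consult development.md for the supported approach):

```shell
# Sketch using Spring Boot's standard mechanism for extra property files
java -jar druid-mcp-server.jar \
     --spring.config.additional-location=file:./custom-prompts.properties
```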
Available Prompt Variables
All prompt templates support these variables:
Current environment name
Organization name
Contact information
Generated watermark
Datasource name (context-specific)
SQL query (context-specific)
Prompt Template Examples
Custom Data Exploration Prompt
Custom Query Optimization Prompt
Disabling Specific Prompts
You can disable individual prompts by setting their enabled flag to false:
Or disable all prompts globally:
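Sketched with illustrative property keys to show the pattern (the real key names are defined in prompts.properties; verify them there):

```properties
# Illustrative key names; check prompts.properties for the real ones
prompt.druid-data-exploration.enabled=false
# or disable all prompts globally:
prompts.enabled=false
```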
MCP Integration
This server uses Spring AI's MCP Server framework and supports STDIO, SSE, and Streamable HTTP transports. The tools, resources, and prompts are automatically registered and exposed through the MCP protocol.
Transport Modes
The Druid MCP Server supports multiple transport modes compliant with MCP 2025-06-18 specification:
Streamable HTTP Transport (Recommended and Default - New in MCP 2025-06-18)
The new Streamable HTTP transport provides enhanced performance and scalability with support for multiple concurrent clients:
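Since Streamable HTTP is the default protocol, a plain start is enough; the server then listens on port 8080 by default (the JAR path assumes a local Maven build):

```shell
java -jar target/druid-mcp-server-*.jar
```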
Note: The -Dspring.ai.mcp.server.protocol option is deprecated and no longer required. STREAMABLE is the default protocol and is configured in application.properties. If you previously set this flag, you can safely remove it.
Features:
Single Endpoint: One HTTP endpoint handles both POST and GET requests
Multiple Clients: Support for concurrent client connections
Optional SSE Streaming: Server-Sent Events for real-time updates
Enhanced Security: Origin header validation and authentication
Backwards Compatibility: Automatic fallback for older MCP clients
Keep-alive: Configurable connection health monitoring
STDIO Transport (Command-line Integration)
Perfect for LLM clients and desktop applications:
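A typical STDIO launch might look like the sketch below. The spring.ai.mcp.server.stdio property follows Spring AI's MCP starter conventions and the banner flag is standard Spring Boot; treat the exact combination as an assumption and check development.md for the supported invocation:

```shell
java -Dspring.ai.mcp.server.stdio=true \
     -Dspring.main.banner-mode=off \
     -jar druid-mcp-server.jar
```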
Legacy SSE Transport (Deprecated)
Still supported for backwards compatibility. It is no longer the default and may be removed in a future version.
Read-only Mode
Read-only mode prevents any operation that could mutate your Druid cluster while still allowing safe read operations and SQL queries. When enabled:
All HTTP GET requests are allowed
HTTP POST is allowed only to the exact path /druid/v2/sql (for SELECT and other read-only SQL)
Any other HTTP method (PUT, PATCH, DELETE) is blocked
Any other POST endpoint (e.g. ingestion/task endpoints) is blocked
MCP write tools are not registered, so they will not appear in your MCP client’s tool list
Enable Read-only Mode
You can enable it using any of the following methods:
application.properties
Environment variable
JVM system property
Docker
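The four methods above follow the usual Spring Boot configuration precedence. Sketched here with an illustrative property name (the real key is defined in application.properties; verify it there before use):

```shell
# Illustrative property/variable names; verify the real key in application.properties

# application.properties
#   druid.mcp.readonly=true

# Environment variable
export DRUID_MCP_READONLY=true

# JVM system property
java -Ddruid.mcp.readonly=true -jar druid-mcp-server.jar

# Docker
docker run -e DRUID_MCP_READONLY=true <image>
```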
What changes in read-only mode?
Tools that would modify the cluster are disabled and won’t be listed by the MCP client. Examples include:
Segment state changes (enableSegment, disableSegment)
Datasource deletion (killDatasource)
Retention rule edits (editRetentionRulesForDatasource)
Compaction config edits (editCompactionConfigForDatasource, deleteCompactionConfigForDatasource)
Lookup changes (createOrUpdateLookup, deleteLookup)
Supervisor control (suspendSupervisor, startSupervisor, terminateSupervisor)
Task control (killTask)
Multi-stage SQL task operations (queryDruidMultiStage, queryDruidMultiStageWithContext, getMultiStageQueryTaskStatus, cancelMultiStageQueryTask)
Ingestion spec submission and templates (createIngestionSpec, createBatchIngestionTemplate)
Read-only-safe tools remain available, including SQL queries (queryDruidSql), metadata and status lookups, health diagnostics, task and segment inspection, etc.
🐳 Druid Cluster Setup
Complete Docker Compose configuration for running a full Apache Druid cluster locally. Perfect for development, testing, and learning about Druid cluster architecture.
Features:
Full Druid cluster with all components (Coordinator, Broker, Historical, MiddleManager, Router)
PostgreSQL metadata storage and ZooKeeper coordination
Pre-configured with sample data and ingestion examples
Integrated Druid MCP Server for immediate testing
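Assuming the compose file lives at the repository root, the cluster can be started and stopped with the standard commands:

```shell
docker compose up -d
# later, to tear the cluster down and remove its volumes:
docker compose down -v
```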
Related Projects
This Druid MCP Server is part of a comprehensive ecosystem of Apache Druid tools and extensions developed by iunera. These complementary projects enhance different aspects of Druid cluster management and data ingestion:
🔧 Druid Cluster Configuration
Advanced configuration management and deployment tools for Apache Druid clusters. This project provides:
Automated Cluster Setup: Streamlined configuration templates for different deployment scenarios
Configuration Management: Best practices and templates for production Druid clusters
Deployment Automation: Tools and scripts for consistent cluster deployments
Environment-Specific Configs: Optimized configurations for development, staging, and production environments
Integration with Druid MCP Server: The cluster configurations provided by this project work seamlessly with the monitoring and management capabilities of the Druid MCP Server, enabling comprehensive cluster lifecycle management.
📊 Code Ingestion Druid Extension
A specialized Apache Druid extension for ingesting and analyzing code-related data and metrics. This extension enables:
Code Metrics Ingestion: Specialized parsers for code analysis data and software metrics
Developer Analytics: Tools for analyzing code quality, complexity, and development patterns
CI/CD Integration: Seamless integration with continuous integration and deployment pipelines
Custom Data Formats: Support for various code analysis tools and formats
Integration with Druid MCP Server: This extension expands the ingestion capabilities that can be managed through the MCP server's ingestion management tools, providing specialized support for code analytics use cases.
Why Use These Together?
Complete Ecosystem: From cluster setup to specialized data ingestion and management
Consistent Architecture: All projects follow similar design principles and integration patterns
Enhanced Capabilities: Each project extends different aspects of the Druid ecosystem
Production Ready: Battle-tested configurations and extensions for enterprise deployments
Roadmap
Authentication on SSE/HTTP Mode: Introduce Oauth Authentication
Druid Auto Compaction: Intelligent automatic compaction configuration
MCP Auto Completion: Enhanced autocomplete functionality with sampling using McpComplete
MCP Notifications: Real-time notifications for MCP operations
Proper Observability: Comprehensive metrics and tracing
Enhanced Monitoring: Advanced cluster monitoring and alerting capabilities
Advanced Analytics: Machine learning-powered insights and recommendations
Security Enhancements: Advanced authentication and authorization features
Kubernetes Support: Proper deployment on Kubernetes
About iunera
This Druid MCP Server is developed and maintained by iunera, a leading provider of advanced AI and data analytics solutions.
iunera specializes in:
AI-Powered Analytics: Cutting-edge artificial intelligence solutions for data analysis
Enterprise Data Platforms: Scalable data infrastructure and analytics platforms (Druid, Flink, Kubernetes, Kafka, Spring)
Model Context Protocol (MCP) Solutions: Advanced MCP server implementations for various data systems
Custom AI Development: Tailored AI solutions for enterprise needs
As veterans in Apache Druid, iunera has deployed and maintained a large number of Apache Druid based solutions in production-grade enterprise scenarios.
For more information about our services and solutions, visit www.iunera.com.
Contact & Support
Need help?
Website: https://www.iunera.com
Professional Services: Contact us through www.iunera.com or email for enterprise support and custom development
Open Source: This project is open source and community contributions are welcome
© 2024