A Model Context Protocol (MCP) server that provides seamless integration between AI assistants and Prometheus, enabling natural language interactions with your monitoring infrastructure. This server allows for effortless querying, discovery, and analysis of metrics through Visual Studio Code, Cursor, Windsurf, Claude Desktop, and other MCP clients.
Key Features
- Fast and lightweight. Direct API integration with Prometheus, no complex parsing needed.
- LLM-friendly. Structured JSON responses optimized for AI assistant consumption.
- Configurable capabilities. Enable/disable tool categories based on your security and operational requirements.
- Dual transport support. Works with both stdio and HTTP transports for maximum compatibility.
Requirements
- Node.js 20.19.0 or newer
- Access to a Prometheus server
- VS Code, Cursor, Windsurf, Claude Desktop or any other MCP client
Getting Started
First, install the Prometheus MCP server with your client. A typical configuration looks like this:
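The exact file name and top-level key differ by client, so treat the snippet below as a sketch: it assumes the common mcpServers shape, the npx prometheus-mcp command described later in this guide, and the PROMETHEUS_URL environment variable documented under Configuration.

```json
{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}
```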
After installation, the Prometheus MCP server will be available for use with your GitHub Copilot agent in VS Code.
Go to Cursor Settings → MCP → Add new MCP Server. Name it to your liking, use the command type with the command npx prometheus-mcp. You can also verify the config or add command arguments by clicking Edit.
Follow the Windsurf MCP documentation. Use the following configuration:
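A sketch of the Windsurf entry, assuming the same npx command and mcpServers key as the typical configuration shown above:

```json
{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}
```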
Claude Desktop supports two installation methods:
Option 1: DXT Extension
The easiest way to install is using the pre-built DXT extension:
- Download the latest .dxt file from the releases page
- Double-click the downloaded file to install automatically
- Configure your Prometheus URL in the extension settings
Option 2: Developer Settings
For advanced users or custom configurations, manually configure the MCP server:
- Open Claude Desktop settings
- Navigate to the Developer section
- Add the following MCP server configuration:
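A sketch of the corresponding entry for Claude Desktop's configuration file, again assuming the npx command and the PROMETHEUS_URL variable documented under Configuration:

```json
{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}
```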
Configuration
The Prometheus MCP server supports the following arguments. They can be provided in the JSON configuration above, as part of the "args" list:
Environment Variables
You can also configure the server using environment variables:
- PROMETHEUS_URL - Prometheus server URL
- ENABLE_DISCOVERY_TOOLS - Set to "false" to disable discovery tools (default: true)
- ENABLE_INFO_TOOLS - Set to "false" to disable info tools (default: true)
- ENABLE_QUERY_TOOLS - Set to "false" to disable query tools (default: true)
Standalone MCP Server
When running in server environments or when you need HTTP transport, run the MCP server with the http command. Then, in your MCP client config, set the url to the HTTP endpoint:
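A sketch of such a client entry; the host, port, and path below are placeholders and should be replaced with the endpoint your http command actually exposes:

```json
{
  "mcpServers": {
    "prometheus": {
      "url": "http://localhost:3000/mcp"
    }
  }
}
```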
Docker
Run the Prometheus MCP server using Docker:
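A sketch of a Docker-based client entry; the image name is a placeholder for whatever image this project publishes, and PROMETHEUS_URL must point at an address reachable from inside the container (host.docker.internal works on Docker Desktop):

```json
{
  "mcpServers": {
    "prometheus": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "PROMETHEUS_URL",
        "<prometheus-mcp-image>"
      ],
      "env": {
        "PROMETHEUS_URL": "http://host.docker.internal:9090"
      }
    }
  }
}
```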
Tools
The Prometheus MCP server provides 10 tools organized into three configurable categories:
Discovery Tools
Tools for exploring your Prometheus infrastructure:
prometheus_list_metrics
- Description: List all available Prometheus metrics
- Parameters: None
- Read-only: true
prometheus_metric_metadata
- Description: Get metadata for a specific Prometheus metric
- Parameters:
  - metric (string): Metric name to get metadata for
- Read-only: true
prometheus_list_labels
- Description: List all available Prometheus labels
- Parameters: None
- Read-only: true
prometheus_label_values
- Description: Get all values for a specific Prometheus label
- Parameters:
  - label (string): Label name to get values for
- Read-only: true
prometheus_list_targets
- Description: List all Prometheus scrape targets
- Parameters: None
- Read-only: true
prometheus_scrape_pool_targets
- Description: Get targets for a specific scrape pool
- Parameters:
  - scrapePool (string): Scrape pool name
- Read-only: true
Info Tools
Tools for accessing Prometheus server information:
prometheus_runtime_info
- Description: Get Prometheus runtime information
- Parameters: None
- Read-only: true
prometheus_build_info
- Description: Get Prometheus build information
- Parameters: None
- Read-only: true
Query Tools
Tools for executing Prometheus queries (an example call is sketched after this list):
prometheus_query
- Description: Execute an instant Prometheus query
- Parameters:
  - query (string): Prometheus query expression
  - time (string, optional): Time parameter for the query (RFC3339 format)
- Read-only: true
prometheus_query_range
- Description: Execute a Prometheus range query
- Parameters:
  - query (string): Prometheus query expression
  - start (string): Start timestamp (RFC3339 or unix timestamp)
  - end (string): End timestamp (RFC3339 or unix timestamp)
  - step (string): Query resolution step width
- Read-only: true
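As an illustration, when an assistant asks for request-rate data, the arguments it passes to prometheus_query_range might look like this (the PromQL expression and time window are arbitrary examples, not part of the tool contract):

```json
{
  "name": "prometheus_query_range",
  "arguments": {
    "query": "rate(http_requests_total[5m])",
    "start": "2024-01-01T00:00:00Z",
    "end": "2024-01-01T01:00:00Z",
    "step": "1m"
  }
}
```

The step value controls the resolution of the returned series; smaller steps return more samples and put more load on your Prometheus server.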
Example Usage
Here are some example interactions you can have with your AI assistant:
Basic Queries
- "Show me all available metrics in Prometheus"
- "What's the current CPU usage across all instances?"
- "Get the memory usage for the last hour"
Discovery and Exploration
- "List all scrape targets and their status"
- "What labels are available for the
http_requests_total
metric?" - "Show me all metrics related to 'cpu'"
Advanced Analysis
- "Compare CPU usage between production and staging environments"
- "Show me the top 10 services by memory consumption"
- "What's the error rate trend for the API service over the last 24 hours?"
Security Considerations
- Network Access: The server requires network access to your Prometheus instance
- Resource Usage: Range queries can be resource-intensive; monitor your Prometheus server load
Troubleshooting
Connection Issues
- Verify your Prometheus server is accessible at the configured URL
- Check firewall settings and network connectivity
- Ensure the Prometheus HTTP API is reachable (it listens on port 9090 by default)
Permission Errors
- Verify the MCP server has network access to Prometheus
- Check if authentication is required for your Prometheus setup
Tool Availability
- If certain tools are missing, check if they've been disabled via configuration
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
- GitHub Issues: Report bugs or request features
- Documentation: Model Context Protocol Documentation
- Prometheus: Prometheus Documentation
Built with ❤️ for the Prometheus and MCP communities