Grafana MCP Server

Provides tools for executing PromQL and Loki queries, retrieving dashboard configurations, fetching panel data, accessing label values, retrieving dashboard variables, listing all dashboards, accessing datasource configurations, and fetching folder metadata from a Grafana instance. PromQL queries run against the Prometheus datasources configured in Grafana, and time series responses are optimized to reduce token size.
Available Tools
The following tools are available via the MCP server:
- test_connection: Verify connectivity to your Grafana instance and configuration.
- grafana_promql_query: Execute PromQL queries against Grafana's Prometheus datasource. Fetches metrics data using PromQL expressions and optimizes time series responses to reduce token size.
- grafana_loki_query: Query Grafana Loki for log data. Fetches logs for a specified duration (e.g., '5m', '1h', '2d') and converts relative time to absolute timestamps.
- grafana_get_dashboard_config: Retrieves dashboard configuration details from the database. Queries the connectors_connectormetadatamodelstore table for dashboard metadata.
- grafana_query_dashboard_panels: Execute queries for specific dashboard panels. Can query up to 4 panels at once, supports template variables, and optimizes metrics data.
- grafana_fetch_label_values: Fetch label values for dashboard variables from Prometheus datasource. Retrieves available values for specific labels (e.g., 'instance', 'job').
- grafana_fetch_dashboard_variables: Fetch all variables and their values from a Grafana dashboard. Retrieves dashboard template variables and their current values.
- grafana_fetch_all_dashboards: Fetch all dashboards from Grafana with basic information like title, UID, folder, tags, etc.
- grafana_fetch_datasources: Fetch all datasources from Grafana with their configuration details.
- grafana_fetch_folders: Fetch all folders from Grafana with their metadata and permissions.
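For illustration, an MCP client invokes any of these tools with a standard `tools/call` request. The tool name below is real, but the argument names (`query`, `duration`) are assumptions for this sketch rather than the server's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "grafana_loki_query",
    "arguments": {
      "query": "{job=\"my-app\"} |= \"error\"",
      "duration": "1h"
    }
  }
}
```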
🚀 Usage & Requirements
1. Get Your Grafana API Endpoint & API Key
- Ensure you have a running Grafana instance (self-hosted or cloud).
- Generate an API key from your Grafana UI:
- Go to Configuration → API Keys
- Create a new API key with appropriate permissions (Admin role recommended for full access)
- Copy the API key (starts with `glsa_`)
2. Installation & Running Options
2A. Install & Run with uv (Recommended for Local Development)
2A.1. Install dependencies with uv
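A typical invocation, assuming dependencies are declared in a `pyproject.toml` (or `requirements.txt`) at the repository root:

```bash
# From the repository root, install dependencies into a local virtual environment
uv sync          # or: uv pip install -r requirements.txt
```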
2A.2. Run the server with uv
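A sketch, assuming the entrypoint is the `mcp_server.py` referenced in the notes below:

```bash
# Run the MCP server entrypoint; keep config.yaml next to mcp_server.py (see note below)
cd grafana-mcp-server/src/grafana_mcp_server
uv run python mcp_server.py
```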
- You can also use `uv` to run any other entrypoint scripts as needed.
- Make sure your `config.yaml` is in the same directory as `mcp_server.py`, or set the required environment variables (see the Configuration section).
2B. Run with Docker Compose (Recommended for Production/Containerized Environments)
- Edit `grafana-mcp-server/src/grafana_mcp_server/config.yaml` with your Grafana details (host, API key).
- Start the server:
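For example, assuming the repository includes a `docker-compose.yml` for this service:

```bash
# Build (if needed) and start the MCP server in the background
docker compose up -d
```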
- The server will run in HTTP (SSE) mode on port 8000 by default.
- You can override configuration with environment variables (see below).
2C. Run with Docker Image (Manual)
- Build the image:
- Run the container (YAML config fallback):
- Or run with environment variables (recommended for CI/Docker MCP clients):
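Hedged sketches of the three steps above; the image name `grafana-mcp-server` and the in-container config path `/app/config.yaml` are illustrative assumptions:

```bash
# Build the image from the repository's Dockerfile
docker build -t grafana-mcp-server .

# Run the container, mounting a config.yaml (YAML config fallback)
docker run -d -p 8000:8000 \
  -v $(pwd)/grafana-mcp-server/src/grafana_mcp_server/config.yaml:/app/config.yaml \
  grafana-mcp-server

# Or run with environment variables (recommended for CI/Docker MCP clients)
docker run -d -p 8000:8000 \
  -e GRAFANA_HOST=https://your-grafana-instance.com \
  -e GRAFANA_API_KEY=glsa_xxxxxxxx \
  -e GRAFANA_SSL_VERIFY=true \
  grafana-mcp-server
```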
3. Configuration
The server loads configuration in the following order of precedence:
- Environment Variables (recommended for Docker/CI):
  - `GRAFANA_HOST`: Grafana instance URL (e.g. `https://your-grafana-instance.com`)
  - `GRAFANA_API_KEY`: Grafana API key (required)
  - `GRAFANA_SSL_VERIFY`: `true` or `false` (default: `true`)
  - `MCP_SERVER_PORT`: Port to run the server on (default: `8000`)
  - `MCP_SERVER_DEBUG`: `true` or `false` (default: `true`)
- YAML file fallback (`config.yaml`):
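A minimal sketch of `config.yaml`; the exact key names are assumptions inferred from the environment variables above:

```yaml
# config.yaml (key names assumed; mirrors the environment variables above)
grafana:
  host: "https://your-grafana-instance.com"
  api_key: "glsa_xxxxxxxx"
  ssl_verify: true

server:
  port: 8000
  debug: true
```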
4. Integration with AI Assistants (e.g., Claude Desktop, Cursor)
You can integrate this MCP server with any tool that supports the MCP protocol. Here are the main options:
4A. Using Local Setup (with uv)
Before running the server locally, install dependencies and run with uv:
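For example, using the same commands as section 2A (same assumptions apply):

```bash
# From the repository root
uv sync
cd grafana-mcp-server/src/grafana_mcp_server
uv run python mcp_server.py
```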
Then add to your client configuration (e.g., `claude-desktop.json`):
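A sketch of a stdio-based entry. The path is a placeholder for wherever the repository is checked out, and the `-t stdio` flag (mentioned in section 4B) is included on the assumption that the entrypoint accepts it when run directly:

```json
{
  "mcpServers": {
    "grafana": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/grafana-mcp-server/src/grafana_mcp_server",
        "run",
        "python",
        "mcp_server.py",
        "-t",
        "stdio"
      ]
    }
  }
}
```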
- Ensure your `config.yaml` is in the same directory as `mcp_server.py`, or update the path accordingly.
4B. Using Docker Compose or Docker (with environment variables)
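A sketch of a Docker-based entry, reusing the assumed image name from section 2C; the environment variable names match the Configuration section, and `-t stdio` forces the stdio handshake as noted below:

```json
{
  "mcpServers": {
    "grafana": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "GRAFANA_HOST=https://your-grafana-instance.com",
        "-e", "GRAFANA_API_KEY=glsa_xxxxxxxx",
        "-e", "GRAFANA_SSL_VERIFY=true",
        "grafana-mcp-server",
        "-t", "stdio"
      ]
    }
  }
}
```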
- The `-t stdio` argument is supported for compatibility with Docker MCP clients (it forces stdio handshake mode).
- Adjust the volume path or environment variables as needed for your deployment.
4C. Connecting to an Already Running MCP Server (HTTP/SSE)
If you have an MCP server already running (e.g., on a remote host, cloud VM, or Kubernetes), you can connect your AI assistant or tool directly to its HTTP endpoint.
Example: Claude Desktop or Similar Tool
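A sketch of an HTTP-based entry; the `/mcp` path and port come from the notes below, everything else is a placeholder:

```json
{
  "mcpServers": {
    "grafana": {
      "url": "http://your-server-host:8000/mcp"
    }
  }
}
```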
- Replace `your-server-host` with the actual host where your MCP server is running.
- For local setup, use `localhost` as the server host (i.e., `http://localhost:8000/mcp`).
- Use `http` for local or unsecured deployments, and `https` for production or secured deployments.
- Make sure the server is accessible from your client machine (check firewall, security group, etc.).
Example: MCP Config YAML
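A sketch of the equivalent YAML form; the exact schema depends on your client, so treat the keys as illustrative:

```yaml
mcp:
  servers:
    grafana:
      url: "http://your-server-host:8000/mcp"
```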
- Replace `your-server-host` with the actual host where your MCP server is running.
- For local setup, use `localhost` as the server host (i.e., `http://localhost:8000/mcp`).
- Use `http` or `https` in the URL scheme depending on how you've deployed the MCP server.
- No need to specify `command` or `args`; just point to the HTTP endpoint.
- This works for any tool or assistant that supports MCP over HTTP.
- The server must be running in HTTP (SSE) mode (the default for this implementation).
Health Check
The server runs on port 8000 by default.
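A quick connectivity check from the host, assuming a conventional `/health` endpoint (the actual path may differ in this implementation):

```bash
curl -s http://localhost:8000/health
```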
5. Project Structure
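A partial sketch based only on the paths referenced in this document; the full layout contains additional files:

```
grafana-mcp-server/
└── src/
    └── grafana_mcp_server/
        ├── mcp_server.py    # MCP server entrypoint
        └── config.yaml      # YAML configuration fallback
```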
6. Troubleshooting
Common Issues
- Connection Failed:
- Verify your Grafana instance is running and accessible
- Check your API key has proper permissions
- Ensure SSL verification settings match your setup
- Authentication Errors:
- Verify your API key is correct and not expired
- Check if your Grafana instance requires additional authentication
- Query Failures:
- Ensure datasource UIDs are correct
- Verify PromQL/Loki query syntax
- Check if the datasource is accessible with your API key
Debug Mode
Enable debug mode to get more detailed logs:
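For example, via the environment variable documented in the Configuration section (the YAML key name is an assumption):

```bash
# Environment variable
export MCP_SERVER_DEBUG=true

# Or in config.yaml (assumed key)
# server:
#   debug: true
```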
7. Contributing
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
8. License
This project is licensed under the MIT License - see the LICENSE file for details.
9. Support
- Need help? Join our Slack community and message us in the #mcp channel.
- Want a 1-click MCP Server? Join the same community and let us know.
- For issues and questions, please open an issue on GitHub or contact the maintainers.