Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default

None (the server requires no environment variables)

Tools

Functions exposed to the LLM to take actions

Name | Description

prometheus_query_range
Query the Prometheus /query_range API endpoint and return an image of a plot of the time-series monitoring data. Use this tool whenever the user asks about the status of their compute infrastructure.

prometheus_alert_rules
Query the Prometheus /rules API. Returns a list of the alerting and recording rules that are currently loaded, along with the currently active alerts fired by the Prometheus instance for each alerting rule.
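
Both tools wrap Prometheus's HTTP API. As a rough sketch of the underlying requests, assuming a Prometheus instance at http://localhost:9090 and an example query of up over a one-hour window (neither the address, the query, nor the time range is specified by this listing):

# Hypothetical raw equivalents of the two tools; note the MCP server itself
# renders the query_range result as a plot image rather than returning JSON.
curl -G 'http://localhost:9090/api/v1/query_range' \
  --data-urlencode 'query=up' \
  --data-urlencode 'start=2024-01-01T00:00:00Z' \
  --data-urlencode 'end=2024-01-01T01:00:00Z' \
  --data-urlencode 'step=60s'

curl 'http://localhost:9090/api/v1/rules'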

Prompts

Interactive templates invoked by user choice

Name | Description

No prompts

Resources

Contextual data attached and managed by the client

Name | Description
file://promql_for_gcp_docs.md/


MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/etruong42/prometheus-mcp'
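
The endpoint returns JSON (the exact schema is not documented here); as a quick sketch, you can pretty-print the response with jq to inspect this server's entry:

curl -s 'https://glama.ai/api/mcp/v1/servers/etruong42/prometheus-mcp' | jq .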

If you have feedback or need assistance with the MCP directory API, please join our Discord server.