Glama › pab1it0

Prometheus MCP Server

List Available Metrics

list_metrics
Read-only · Idempotent

Retrieve the names of available Prometheus metrics, with optional pagination and substring filtering, to identify monitoring data to query.

Instructions

List all available metrics in Prometheus with optional pagination support

Input Schema

  Name            Required  Description  Default
  limit           No        —            —
  offset          No        —            —
  filter_pattern  No        —            —

(The schema supplies no parameter descriptions.)

Output Schema

No fields are declared in the output schema; the handler's docstring documents the keys of the returned dictionary (metrics, total_count, returned_count, offset, has_more).
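Under these schemas, a call and its result might look like the following sketch; the argument values and metric names are illustrative, not taken from a live Prometheus server:

```python
# Hypothetical arguments an agent might pass to list_metrics.
arguments = {"limit": 2, "offset": 0, "filter_pattern": "http"}

# Illustrative structured result, matching the fields documented
# in the handler's docstring below.
result = {
    "metrics": ["http_request_duration_seconds", "http_requests_total"],
    "total_count": 5,
    "returned_count": 2,
    "offset": 0,
    "has_more": True,
}
```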

Implementation Reference

  • The main handler function for the 'list_metrics' tool. It fetches the list of metric names from Prometheus via the label/__name__/values endpoint, applies optional pattern filtering and limit/offset pagination, supports progress reporting via the context, and returns a structured result with counts and pagination info.
    async def list_metrics(
        limit: Optional[int] = None,
        offset: int = 0,
        filter_pattern: Optional[str] = None,
        ctx: Context | None = None
    ) -> Dict[str, Any]:
        """Retrieve a list of all metric names available in Prometheus.
    
        Args:
            limit: Maximum number of metrics to return (default: all metrics)
            offset: Number of metrics to skip for pagination (default: 0)
            filter_pattern: Optional substring to filter metric names (case-insensitive)
    
        Returns:
            Dictionary containing:
            - metrics: List of metric names
            - total_count: Total number of metrics (before pagination)
            - returned_count: Number of metrics returned
            - offset: Current offset
            - has_more: Whether more metrics are available
        """
        logger.info("Listing available metrics", limit=limit, offset=offset, filter_pattern=filter_pattern)
    
        # Report progress if context available
        if ctx:
            await ctx.report_progress(progress=0, total=100, message="Fetching metrics list...")
    
        data = make_prometheus_request("label/__name__/values")
    
        if ctx:
            await ctx.report_progress(progress=50, total=100, message=f"Processing {len(data)} metrics...")
    
        # Apply filter if provided
        if filter_pattern:
            filtered_data = [m for m in data if filter_pattern.lower() in m.lower()]
            logger.debug("Applied filter", original_count=len(data), filtered_count=len(filtered_data), pattern=filter_pattern)
            data = filtered_data
    
        total_count = len(data)
    
        # Apply pagination
        start_idx = offset
        end_idx = offset + limit if limit is not None else len(data)
        paginated_data = data[start_idx:end_idx]
    
        result = {
            "metrics": paginated_data,
            "total_count": total_count,
            "returned_count": len(paginated_data),
            "offset": offset,
            "has_more": end_idx < total_count
        }
    
        if ctx:
            await ctx.report_progress(progress=100, total=100, message=f"Retrieved {len(paginated_data)} of {total_count} metrics")
    
        logger.info("Metrics list retrieved",
                    total_count=total_count,
                    returned_count=len(paginated_data),
                    offset=offset,
                    has_more=result["has_more"])
    
        return result
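The filtering and pagination logic above can be exercised in isolation. This sketch replays it against a small in-memory list of metric names (the sample names are made up for illustration):

```python
def paginate_metrics(data, limit=None, offset=0, filter_pattern=None):
    # Case-insensitive substring filter, as in the handler above.
    if filter_pattern:
        data = [m for m in data if filter_pattern.lower() in m.lower()]
    total_count = len(data)
    # With no limit, return everything from the offset onward.
    end_idx = offset + limit if limit is not None else total_count
    page = data[offset:end_idx]
    return {
        "metrics": page,
        "total_count": total_count,
        "returned_count": len(page),
        "offset": offset,
        "has_more": end_idx < total_count,
    }

metrics = [
    "up",
    "http_requests_total",
    "http_request_duration_seconds",
    "node_cpu_seconds_total",
]
first_page = paginate_metrics(metrics, limit=2, filter_pattern="http")
# Only two names match "http", so limit=2 returns both and has_more is False.
```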
  • The @mcp.tool() decorator registers the list_metrics function as an MCP tool, providing a description and annotations for UI hints and metadata.
    @mcp.tool(
        description="List all available metrics in Prometheus with optional pagination support",
        annotations={
            "title": "List Available Metrics",
            "icon": "📋",
            "readOnlyHint": True,
            "destructiveHint": False,
            "idempotentHint": True,
            "openWorldHint": True
        }
    )
Behavior 3/5

Does the description disclose side effects, auth requirements, rate limits, or destructive behavior?

Annotations provide readOnlyHint=true, destructiveHint=false, idempotentHint=true, and openWorldHint=true, covering safety and idempotency. The description adds 'optional pagination support', useful context beyond the annotations, but it doesn't detail rate limits, authentication needs, or behavioral specifics such as response format or error handling. Because the annotations already cover safety, the bar for the description is lower, and it adds some value.

Agents need to know what a tool does to the world before calling it. Descriptions should go beyond structured annotations to explain consequences.

Conciseness 5/5

Is the description appropriately sized, front-loaded, and free of redundancy?

The description is a single, efficient sentence that front-loads the core purpose ('List all available metrics in Prometheus') and adds a key feature ('with optional pagination support'). Every word earns its place, with no redundancy or unnecessary elaboration, making it highly concise and well-structured.

Shorter descriptions cost fewer tokens and are easier for agents to parse. Every sentence should earn its place.

Completeness 4/5

Given the tool's complexity, does the description cover enough for an agent to succeed on first attempt?

Given the tool's low complexity (a read-only list operation), rich annotations (covering safety and idempotency), and the presence of an output schema (which handles return values), the description is reasonably complete. It covers the main action and a key feature (pagination), though it could benefit from more usage guidance or parameter details to be fully comprehensive.

Complex tools with many parameters or behaviors need more documentation. Simple tools need less. This dimension scales expectations accordingly.

Parameters 3/5

Does the description clarify parameter syntax, constraints, interactions, or defaults beyond what the schema provides?

Schema description coverage is 0%: the schema provides no descriptions for its parameters. The tool description mentions 'optional pagination support', which hints at the 'limit' and 'offset' parameters, but it doesn't explain 'filter_pattern' or add semantics beyond the schema's property names. It partially compensates for the missing coverage but leaves gaps, consistent with the baseline score when schema coverage is low.

Input schemas describe structure but not intent. Descriptions should explain non-obvious parameter relationships and valid value ranges.

Purpose 4/5

Does the description clearly state what the tool does and how it differs from similar tools?

The description clearly states the verb 'List' and resource 'all available metrics in Prometheus', making the purpose evident. However, it doesn't differentiate from sibling tools like 'get_metric_metadata' or 'execute_query', which might also retrieve metric information, so it doesn't fully distinguish itself from alternatives.

Agents choose between tools based on descriptions. A clear purpose with a specific verb and resource helps agents select the right tool.

Usage Guidelines 2/5

Does the description explain when to use this tool, when not to, or what alternatives exist?

The description mentions 'optional pagination support', which implies usage for handling large datasets, but it doesn't provide explicit guidance on when to use this tool versus siblings like 'get_metric_metadata' or 'execute_query'. No alternatives, exclusions, or specific contexts are stated, leaving the agent with minimal direction.

Agents often have multiple tools that could apply. Explicit usage guidance like "use X instead of Y when Z" prevents misuse.
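Folding the review's suggestions back into the tool registration, a richer description might look like the following sketch. This wording is hypothetical, not the project's actual text; the sibling tool names (get_metric_metadata, execute_query) are the ones the review itself mentions:

```python
# Hypothetical rewrite of the description string passed to @mcp.tool(),
# addressing the Parameters and Usage Guidelines findings above.
IMPROVED_DESCRIPTION = (
    "List all metric names known to Prometheus. Supports optional "
    "pagination (limit, offset) and case-insensitive substring filtering "
    "(filter_pattern). Use this to discover metric names before calling "
    "execute_query; use get_metric_metadata for details about a single "
    "metric."
)
```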

