DataDog MCP Server

A Model Context Protocol (MCP) server that provides AI assistants with direct access to DataDog's observability platform through a standardized interface.

🎯 Purpose

This server bridges the gap between Large Language Models (LLMs) and DataDog's comprehensive observability platform, enabling AI assistants to:

  • Monitor Infrastructure: Query dashboards, metrics, and host status

  • Manage Events: Create and retrieve events for incident tracking

  • Analyze Data: Access logs, traces, and performance metrics

  • Automate Operations: Interact with monitors, downtimes, and alerts

🔧 What is MCP?

The Model Context Protocol (MCP) is a standardized way for AI assistants to interact with external tools and data sources. Instead of each AI system building custom integrations, MCP provides a common interface (sketched after the list below) that allows LLMs to:

  • Execute tools with structured inputs and outputs

  • Access real-time data from external systems

  • Maintain context across multiple tool calls

  • Provide consistent, reliable integrations
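
In practice, MCP messages are JSON-RPC 2.0, usually exchanged over stdio. As a rough, illustrative sketch of the wire format (using the server binary from the Quick Start below, and omitting the initialize handshake a real client performs first), a client can ask a server which tools it exposes:

# Minimal sketch: a real MCP client sends an "initialize" request and an
# "initialized" notification before listing tools.
echo '{"jsonrpc":"2.0","id":1,"method":"tools/list"}' | ./build/datadog-mcp-server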

📊 DataDog Platform

DataDog is a leading observability platform that provides:

  • Infrastructure Monitoring: Track server performance, resource usage, and health

  • Application Performance Monitoring (APM): Trace requests across services and track latency, errors, and throughput

  • Log Management: Centralized logging with powerful search and analysis

  • Real User Monitoring (RUM): Track user interactions and frontend performance

  • Security Monitoring: Detect threats and vulnerabilities across your infrastructure

🚀 Quick Start

  1. Build the server:

    make build
  2. Configure DataDog API:

    export DD_API_KEY="your-datadog-api-key"
    export DATADOG_APP_KEY="your-datadog-app-key"
    # Optional
    export DATADOG_SITE="datadoghq.eu" # or datadoghq.com
  3. Generate MCP configuration (an illustrative client config sketch follows these steps):

    make create-mcp-config
  4. Run the server:

    ./build/datadog-mcp-server
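
The exact file produced by make create-mcp-config depends on the project's templates, but a stdio MCP client entry for this server generally looks something like the sketch below. The mcpServers layout is the convention used by MCP clients such as Claude Desktop; the file name and values here are illustrative, not the generated output.

# Illustrative only; the configuration generated by `make create-mcp-config` may differ.
cat > mcp-config.json <<'EOF'
{
  "mcpServers": {
    "datadog": {
      "command": "./build/datadog-mcp-server",
      "env": {
        "DD_API_KEY": "your-datadog-api-key",
        "DATADOG_APP_KEY": "your-datadog-app-key",
        "DATADOG_SITE": "datadoghq.eu"
      }
    }
  }
}
EOF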

📚 Documentation

🛠️ Available Tools

Currently implemented tools include:

  • Dashboard Management (v1): v1_list_dashboards, v1_get_dashboard

  • Event Management (v1): v1_list_events, v1_create_event

  • Connection Testing (v1): v1_test_connection

  • Monitor Management (v1): (Coming soon)

  • Metrics & Logs (v1): (Coming soon)

All tools are prefixed with their API version (e.g., v1_, v2_) for clear segregation and future v2 API support.

See docs/tools.md for the complete list and implementation status.
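
To give a concrete sense of how these tools are invoked, the sketch below shows the shape of a tools/call request for v1_list_dashboards sent over stdio. It is illustrative only: a real MCP client performs the initialize handshake first, and the arguments each tool accepts are defined by its schema (see docs/tools.md).

# Shape of an MCP tool invocation (JSON-RPC 2.0 over stdio); handshake omitted.
echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"v1_list_dashboards","arguments":{}}}' \
  | ./build/datadog-mcp-server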

🔧 Development

# Install development tools
make install-dev-tools
# Run tests
make test
# Generate API client
make generate
# Split OpenAPI specifications
make split
# Lint OpenAPI specifications
make lint-openapi
# Build and test
make build

OpenAPI Management

The project includes comprehensive tools for managing OpenAPI specifications:

  • Split Specifications: Break down large OpenAPI files into smaller, manageable pieces

  • Spectral Linting: Validate OpenAPI specifications with custom rules and best practices

  • Code Generation: Generate Go client code from OpenAPI specifications

  • Version Support: Separate handling for DataDog API v1 and v2

See OpenAPI Splitting Guide and Spectral Linting Guide for detailed usage.
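
For orientation on the linting step: a Spectral ruleset is a small YAML file that extends a base ruleset and tunes individual rules. The project ships its own configuration, so the following is only a sketch of what such a setup can look like; the file name, rule choice, and paths are assumptions rather than the repository's actual settings.

# Illustrative Spectral setup, not the project's actual configuration.
cat > .spectral.yaml <<'EOF'
extends: ["spectral:oas"]
rules:
  operation-description: warn
EOF
# Lint the split specifications (path is illustrative):
spectral lint openapi/v1/*.yaml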

📚 Resources
