mcp-server-datadog

```dockerfile
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
FROM node:22.12-alpine AS builder

# Install pnpm globally
RUN npm install -g pnpm@10

WORKDIR /app

# Copy package files and install dependencies
COPY package.json pnpm-lock.yaml ./
RUN pnpm install --frozen-lockfile --ignore-scripts

# Copy the rest of the files
COPY . .

# Build the project
RUN pnpm build

FROM node:22.12-alpine AS installer

# Install pnpm globally
RUN npm install -g pnpm@10

WORKDIR /app

# Copy package files and install only production dependencies
COPY package.json pnpm-lock.yaml ./
RUN pnpm install --frozen-lockfile --ignore-scripts --prod

FROM node:22.12-alpine AS release

WORKDIR /app

COPY --from=builder /app/build /app/build
COPY --from=installer /app/node_modules /app/node_modules

# No port is exposed: the MCP server communicates over stdio
CMD ["node", "build/index.js"]
```
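A typical local build-and-run of this image might look like the sketch below. The image tag and the Datadog credential variable names (`DATADOG_API_KEY`, `DATADOG_APP_KEY`) are assumptions for illustration, not taken from this page:

```shell
# Build the release image from the Dockerfile above (tag is arbitrary)
docker build -t mcp-server-datadog .

# Run interactively: the server speaks MCP over stdio, so no port is published.
# The credential variable names below are assumed, not confirmed by this page.
docker run -i --rm \
  -e DATADOG_API_KEY="$DATADOG_API_KEY" \
  -e DATADOG_APP_KEY="$DATADOG_APP_KEY" \
  mcp-server-datadog
```

`docker run -i` keeps stdin open, which is what an MCP client needs to drive a stdio-based server.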

MCP directory API

We provide all of the information about listed MCP servers through our MCP directory API:

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/winor30/mcp-server-datadog'
```
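The endpoint follows a predictable `servers/<owner>/<server>` pattern, so the URL for any listed server can be assembled from its owner and name. A small sketch (the response schema is not assumed here):

```shell
# The directory API URL follows the pattern servers/<owner>/<server>,
# as seen in the curl example above.
BASE='https://glama.ai/api/mcp/v1/servers'
OWNER='winor30'
SERVER='mcp-server-datadog'
URL="$BASE/$OWNER/$SERVER"
echo "$URL"
```

Piping the fetched response through a JSON pretty-printer such as `jq` (if installed) makes the entry easier to inspect: `curl -s "$URL" | jq .`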

If you have feedback or need assistance with the MCP directory API, please join our Discord server.