MCP Server with OpenAI Integration

by code-wgl
Dockerfile (411 B)
# syntax=docker/dockerfile:1.7
FROM node:20-slim AS base
WORKDIR /app

FROM base AS deps
COPY package*.json ./
RUN npm install --omit=dev

FROM deps AS build
RUN npm install --include=dev
COPY . .
RUN npm run build

FROM base AS runtime
ENV NODE_ENV=production
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY --from=build /app/dist ./dist
COPY package*.json ./
CMD ["node", "dist/index.js"]
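The multi-stage build above installs production dependencies in one stage, compiles the TypeScript in another, and copies only the built output into the runtime image. A minimal sketch of building and running it follows; the image tag mcp-server-openai and the OPENAI_API_KEY environment variable are assumptions for illustration, not names taken from this repository.

```shell
# Build the image from the repository root (tag is an assumed example name)
docker build -t mcp-server-openai .

# Run the server; an OpenAI integration typically needs an API key at runtime
# (the exact variable name this server expects is an assumption)
docker run --rm -e OPENAI_API_KEY="$OPENAI_API_KEY" mcp-server-openai
```

Because the runtime stage copies node_modules from the deps stage (built with --omit=dev), the final image ships without dev dependencies such as the TypeScript compiler.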

MCP directory API

We provide all the information about MCP servers via our MCP API. For example, to fetch this server's entry:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/code-wgl/McpServer'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.