
any-chat-completions-mcp

by pyroprompts
Dockerfile
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile

# Use a Node.js image for building the TypeScript code
FROM node:18-alpine AS builder

# Set the working directory
WORKDIR /app

# Copy package.json and package-lock.json
COPY package.json package-lock.json ./

# Install dependencies
RUN npm install --ignore-scripts

# Copy the rest of the application code
COPY . .

# Build the TypeScript code
RUN npm run build

# Use a smaller Node.js image for the runtime
FROM node:18-alpine AS runner

# Set the working directory
WORKDIR /app

# Copy the built application from the builder stage
COPY --from=builder /app/build ./build
COPY --from=builder /app/package.json ./
COPY --from=builder /app/node_modules ./node_modules

# Set environment variables for the server
ENV AI_CHAT_KEY=your_api_key
ENV AI_CHAT_NAME=OpenAI
ENV AI_CHAT_MODEL=gpt-4o
ENV AI_CHAT_BASE_URL=https://api.openai.com/v1

# Expose the necessary port
EXPOSE 3000

# Start the application
CMD ["node", "build/index.js"]
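A minimal sketch of building and running this image, assuming the Dockerfile sits at the repository root; the image tag is arbitrary and the AI_CHAT_KEY value is a placeholder you would replace with your own API key. The environment variable names and the exposed port come from the Dockerfile above.

# Build the image from the repository root (tag name is an assumption)
docker build -t any-chat-completions-mcp .

# Run it, overriding the baked-in defaults with your own values
docker run --rm \
  -e AI_CHAT_KEY=your_api_key \
  -e AI_CHAT_NAME=OpenAI \
  -e AI_CHAT_MODEL=gpt-4o \
  -e AI_CHAT_BASE_URL=https://api.openai.com/v1 \
  -p 3000:3000 \
  any-chat-completions-mcp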

MCP directory API

We provide all information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/pyroprompts/any-chat-completions-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.