
Self-hosted LLM MCP Server

Dockerfile
```dockerfile
# Build stage
FROM node:20-alpine AS builder

# Set working directory
WORKDIR /app

# Install dependencies for native modules
RUN apk add --no-cache python3 make g++

# Copy package files
COPY package*.json ./

# Set environment to development to install all dependencies
ENV NODE_ENV=development

# Install all dependencies (including dev dependencies for build)
RUN npm ci

# Install TypeScript globally as backup
RUN npm install -g typescript

# Copy source code
COPY . .

# Build the application using npx
RUN npx tsc

# Production stage
FROM node:20-alpine AS production

# Set working directory
WORKDIR /app

# Install dependencies for native modules
RUN apk add --no-cache python3 make g++

# Copy package files
COPY package*.json ./

# Set environment to production
ENV NODE_ENV=production

# Install only production dependencies
RUN npm ci --only=production && npm cache clean --force

# Copy built application from builder stage
COPY --from=builder /app/dist ./dist

# Create non-root user
RUN addgroup -g 1001 -S nodejs
RUN adduser -S mcp -u 1001

# Change ownership of the app directory
RUN chown -R mcp:nodejs /app

USER mcp

# Expose port
EXPOSE 3000

# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
  CMD node -e "require('http').get('http://localhost:3000/health', (res) => { process.exit(res.statusCode === 200 ? 0 : 1) })"

# Start the application in HTTP mode
CMD ["npm", "run", "start:http"]
```
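A minimal sketch of building and running this image, assuming the Dockerfile sits at the repository root; the image tag `mcp-server` and the host port mapping are placeholders, not names from the project:

```shell
# Build the image from the repository root (both stages run; only the
# production stage ends up in the final image)
docker build -t mcp-server .

# Run detached, publishing the HTTP port declared by EXPOSE
docker run -d --name mcp-server -p 3000:3000 mcp-server

# The HEALTHCHECK probes /health every 30s; once the container is up,
# its status should move from "starting" to "healthy"
docker inspect --format '{{.State.Health.Status}}' mcp-server
```

Because the container runs as the non-root `mcp` user, any bind mounts you add must be readable by UID 1001.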

MCP directory API

We provide all the information about MCP servers through our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Krishnahuex28/MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.