
context-awesome

by bh-rat
Dockerfile (744 B)
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile

# ----- Build Stage -----
FROM node:lts-alpine AS builder
WORKDIR /app

# Copy package files
COPY package*.json ./
COPY tsconfig.json ./

# Install dependencies
RUN npm ci

# Copy source code
COPY src ./src

# Build the project
RUN npm run build

# ----- Production Stage -----
FROM node:lts-alpine
WORKDIR /app

# Copy built artifacts
COPY --from=builder /app/build ./build

# Copy package files for production
COPY package*.json ./

# Install only production dependencies
RUN npm ci --omit=dev

# Expose HTTP port
EXPOSE 8080

# Start the server with HTTP transport
CMD ["node", "build/index.js", "--transport", "http", "--host", "0.0.0.0", "--port", "8080"]
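
To try the image locally, a minimal sketch (the image tag context-awesome is an arbitrary choice here; the port mapping matches the EXPOSE 8080 line in the Dockerfile above):

# Build the image from the repository root (assumes the Dockerfile is at the root)
docker build -t context-awesome .

# Run the container and publish the HTTP transport port
docker run --rm -p 8080:8080 context-awesome

The server should then accept MCP requests over HTTP on port 8080 of the host.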

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bh-rat/context-awesome'
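
Assuming the endpoint returns JSON (not confirmed here), you can pretty-print the response by piping it through jq:

curl -s 'https://glama.ai/api/mcp/v1/servers/bh-rat/context-awesome' | jq .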

If you have feedback or need assistance with the MCP directory API, please join our Discord server.