Dockerfile
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
FROM node:lts-alpine

# Set working directory
WORKDIR /app

# Copy package files and install dependencies
COPY package*.json ./

# Install dependencies, ignoring scripts so as not to run prepare automatically
RUN npm install --ignore-scripts

# Build the project
COPY tsconfig.json ./
COPY src ./src
RUN npm run build

# Expose port if needed (the inspector uses port 3001 as configuration, but the MCP likely uses a different port).
# EXPOSE 3001

# Command to start the server
CMD [ "node", "build/index.js" ]
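
As a rough usage sketch (not confirmed by this page): the image tag below is arbitrary, and the -i flag assumes the server communicates over stdio, which is common for MCP servers started with a plain node command; if this server instead listens on a port, a -p mapping would be needed.

# Build the image from the repository root (tag name is an example)
docker build -t mcp-server-cambridge-dict .

# Run the container; -i keeps stdin open for stdio-based MCP transport
docker run -i --rm mcp-server-cambridge-dict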

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/meowrain/mcp-server-cambridge-dict'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.