Dockerfile
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
FROM node:lts-alpine

# Create app directory
WORKDIR /usr/src/app

# Install app dependencies
COPY package*.json ./
RUN npm install --ignore-scripts

# Copy the rest of the app source
COPY . .

# Build the project
RUN npm run build

# Expose the port if necessary (MCP servers may not need a port, but can expose one if using HTTP transport)
# EXPOSE 3000

# Start the MCP server
CMD [ "node", "build/index.js" ]
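To try the image locally, something like the following should work (the tag smart-e2b is just an example name, and any required API keys can be passed with -e; the -i flag keeps stdin open for stdio-based MCP transport):

docker build -t smart-e2b .
docker run -i --rm smart-e2b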

MCP directory API

We provide all the information about MCP servers in the directory via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Leghis/smart-e2b'
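The endpoint returns JSON metadata for the server; to inspect it you can pretty-print the response with jq (assuming jq is installed):

curl -s 'https://glama.ai/api/mcp/v1/servers/Leghis/smart-e2b' | jq .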

If you have feedback or need assistance with the MCP directory API, please join our Discord server.