iTerm MCP

by lite
Dockerfile (929 B)
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile

# Use an official Node runtime as a parent image
FROM node:18-alpine AS builder

# Set the working directory
WORKDIR /app

# Copy the package.json and yarn.lock
COPY package.json yarn.lock ./

# Install dependencies
RUN yarn install --frozen-lockfile

# Copy the rest of the application code
COPY . .

# Build the TypeScript application
RUN yarn run build

# Start a new stage from the official Node.js image
FROM node:18-alpine

# Set the working directory
WORKDIR /app

# Copy only the built files from the builder stage
COPY --from=builder /app/build ./build
COPY --from=builder /app/package.json ./
COPY --from=builder /app/yarn.lock ./

# Install production dependencies only
RUN yarn install --production --frozen-lockfile

# Expose port if needed (example: 3000)
# EXPOSE 3000

# Run the application
ENTRYPOINT ["node", "build/index.js"]
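The Dockerfile builds the TypeScript sources in one stage and copies only the compiled output and production dependencies into the final image, which then starts the server with Node. A minimal sketch of building and running it locally is shown below; the image tag iterm-mcp is an assumption (any tag works), and -i keeps stdin open because MCP servers of this kind typically exchange messages over stdio.

# Build the image (the tag "iterm-mcp" is assumed, not prescribed by the project)
docker build -t iterm-mcp .

# Run it with stdin kept open so an MCP client can communicate with it over stdio
docker run -i --rm iterm-mcp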

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/lite/iterm-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.