MCP Server Gemini

by gurr-i
Dockerfile • 828 B
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile

# Use a Node.js image
FROM node:18-alpine AS builder

# Set the working directory
WORKDIR /app

# Copy package.json and package-lock.json
COPY package.json package-lock.json ./

# Install dependencies
RUN npm install

# Copy the rest of the application code
COPY . .

# Build the application
RUN npm run build

# Use a smaller Node.js image for the release
FROM node:18-slim AS release

# Set the working directory
WORKDIR /app

# Copy the build from the builder stage
COPY --from=builder /app/dist /app/dist
COPY --from=builder /app/package.json /app/package-lock.json /app/

# Install production dependencies only
RUN npm ci --production

# Expose the necessary port
EXPOSE 3005

# Command to run the application
CMD ["node", "dist/index.js"]
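Assuming this Dockerfile sits at the root of the repository, the image can be built and run roughly as follows. The image tag and the GEMINI_API_KEY variable name are illustrative assumptions, not taken from this page; the port mapping matches the EXPOSE 3005 instruction above.

# Build the image from the repository root (tag name is arbitrary)
docker build -t mcp-server-gemini .

# Run the server, mapping the exposed port; the env var name is an assumption
docker run --rm -p 3005:3005 -e GEMINI_API_KEY=your-key mcp-server-gemini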

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gurr-i/mcp-server-gemini-pro'
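For example, piping the response through jq gives a readable view of the returned JSON; the exact fields in the response are not documented on this page.

curl -s 'https://glama.ai/api/mcp/v1/servers/gurr-i/mcp-server-gemini-pro' | jq .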

If you have feedback or need assistance with the MCP directory API, please join our Discord server.