MCP Gemini Server

by bsmi021
Dockerfile (476 B)
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
FROM node:lts-alpine

WORKDIR /app

# Copy package files and install dependencies
COPY package.json package-lock.json ./

# Install dependencies without running lifecycle scripts
RUN npm ci --ignore-scripts

# Copy the rest of the app
COPY . .

# Build the TypeScript code
RUN npm run build

# Expose port if needed (optional)
# EXPOSE 3000

# Start the server
CMD ["node", "dist/server.js"]
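
A minimal usage sketch for this Dockerfile: the image tag mcp-gemini-server is an arbitrary choice, and the GEMINI_API_KEY variable shown here is a placeholder for whatever API key variable the server's own documentation specifies.

# Build the image from the repository root (tag name is arbitrary)
docker build -t mcp-gemini-server .

# Run the server; -i keeps stdin open, which stdio-based MCP servers rely on.
# Replace GEMINI_API_KEY with the variable named in the server's documentation.
docker run --rm -i -e GEMINI_API_KEY="your-key-here" mcp-gemini-server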

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-gemini-server'
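
For quick inspection, the same request can be piped through jq, assuming the endpoint returns JSON and jq is installed locally:

curl -s 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-gemini-server' | jq .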

If you have feedback or need assistance with the MCP directory API, please join our Discord server.