
onyx-mcp-server

Dockerfile (575 B)
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
FROM node:lts-alpine

# Create app directory
WORKDIR /app

# Copy package files and tsconfig
COPY package*.json ./
COPY tsconfig.json ./

# Copy source code
COPY src ./src

# Install dependencies (including dev dependencies) without running scripts
RUN npm install --ignore-scripts

# Build the project
RUN npm run build

# Optionally expose a port; MCP usually uses stdio, but a port can be exposed for health checks if needed
EXPOSE 3000

# Start the MCP server
CMD [ "node", "build/index.js" ]
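The Dockerfile above can be built and run locally with the standard Docker CLI. A minimal sketch, assuming the image tag "onyx-mcp-server" (an arbitrary name, not mandated by the Dockerfile) and that you are in the repository root:

```shell
# Build the image from the repository root (where the Dockerfile lives).
# The tag "onyx-mcp-server" is an arbitrary local name.
docker build -t onyx-mcp-server .

# MCP servers usually communicate over stdio, so run interactively with -i.
# Mapping port 3000 is only needed if you use the optional health-check port
# exposed in the Dockerfile.
docker run -i --rm -p 3000:3000 onyx-mcp-server
```

Because the server speaks over stdio, an MCP client would normally launch this `docker run` command itself rather than you running it by hand.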

MCP directory API

We provide all of the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/lupuletic/onyx-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.