
Linear MCP Server

by gerbal
Dockerfile (903 B)
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile

# Use the official lightweight Node.js image
FROM node:20-alpine AS build

# Set the working directory
WORKDIR /app

# Copy the package.json and package-lock.json
COPY package*.json ./

# Install dependencies
RUN npm install

# Copy the source code
COPY . .

# Build the TypeScript code
RUN npm run build

# Use a new clean image for the final product
FROM node:20-alpine

# Set the working directory
WORKDIR /app

# Copy the build output from the previous stage
COPY --from=build /app/build /app/build

# Copy the node_modules from the previous stage
COPY --from=build /app/node_modules /app/node_modules

# Set environment variables
ENV LINEAR_API_KEY=your_api_key_here

# Expose the port the app runs on (if applicable, specify the correct port)
# EXPOSE 3000

# Run the server
ENTRYPOINT ["node", "build/index.js"]
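
The Dockerfile above builds the TypeScript source in one stage and copies the output into a clean runtime image. A minimal usage sketch follows, assuming the repository is checked out locally and the image is tagged linear-mcp-server (both the tag and the API key value are placeholders, not part of the original listing); the -e flag overrides the placeholder LINEAR_API_KEY set by the ENV instruction at runtime rather than baking a real key into the image.

# Build the image from the repository root
docker build -t linear-mcp-server .

# Run the server with a real Linear API key supplied at runtime;
# -i keeps stdin open, since MCP servers typically communicate over stdio
docker run -i --rm -e LINEAR_API_KEY=lin_api_xxxx linear-mcp-server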

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/gerbal/linear-mcp-server-1'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.