
File Context MCP

# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile

# Use an official Node.js runtime as a parent image
FROM node:18-slim AS builder

# Set the working directory
WORKDIR /app

# Copy the package.json and package-lock.json files to the working directory
COPY package*.json ./

# Install dependencies
RUN npm install

# Copy the rest of the application code to the working directory
COPY . .

# Build the application
RUN npm run build

# Use a lighter-weight Node.js runtime for the production environment
FROM node:18-slim

# Set the working directory
WORKDIR /app

# Copy the built application code and necessary files from the builder stage
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/package.json ./package.json
COPY --from=builder /app/package-lock.json ./package-lock.json

# Install production dependencies
RUN npm install --only=production

# Expose the application port
EXPOSE 3001

# Define environment variables
ENV PORT=3001

# Run the application
CMD ["node", "dist/server.js"]
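One way to build and run the image described by this Dockerfile locally is with a small Compose file. This is a minimal sketch, not part of the project itself; the service name and host port mapping are illustrative, and only the `3001` port and `PORT` variable come from the Dockerfile above.

```yaml
# Hypothetical docker-compose.yml for local use of the Dockerfile above.
services:
  file-context-mcp:        # illustrative service name
    build: .               # builds the multi-stage Dockerfile in this directory
    ports:
      - "3001:3001"        # matches EXPOSE 3001 in the Dockerfile
    environment:
      - PORT=3001          # matches ENV PORT=3001 in the Dockerfile
```

Because the build is multi-stage, the final image contains only `dist/`, the package manifests, and production dependencies, not the full source tree or devDependencies.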

MCP directory API

We provide all of the information about MCP servers through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/compiledwithproblems/file-context-mcp'
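The same endpoint can be called programmatically. The sketch below, in TypeScript for Node.js 18+ (which ships a global `fetch`), builds the directory URL for a server slug and fetches its entry; the helper names are illustrative, and the shape of the JSON response is not assumed here.

```typescript
// Base URL of the Glama MCP directory API, taken from the curl example above.
const GLAMA_API_BASE = "https://glama.ai/api/mcp/v1";

// Hypothetical helper: build the directory URL for a server slug
// such as "compiledwithproblems/file-context-mcp".
function serverUrl(slug: string): string {
  return `${GLAMA_API_BASE}/servers/${slug}`;
}

// Fetch a server's directory entry as parsed JSON.
async function fetchServerInfo(slug: string): Promise<unknown> {
  const res = await fetch(serverUrl(slug));
  if (!res.ok) {
    throw new Error(`Directory request failed with status ${res.status}`);
  }
  return res.json();
}
```

For example, `fetchServerInfo("compiledwithproblems/file-context-mcp")` issues the same GET request as the curl command shown above.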

If you have feedback or need assistance with the MCP directory API, please join our Discord server.