
MCP File Context Server

by bsmi021
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
FROM node:lts-alpine

WORKDIR /app

# Copy package files
COPY package*.json ./

# Install dependencies
RUN npm install

# Copy the rest of the project
COPY . .

# Build the project
RUN npm run build

# Set environment variable defaults
ENV MAX_CACHE_SIZE=1000
ENV CACHE_TTL=3600000
ENV MAX_FILE_SIZE=1048576

# Start the server
CMD ["node", "build/index.js"]
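
A minimal local usage sketch, assuming you tag the image mcp-file-context-server (the tag name is an arbitrary choice here, not something the project prescribes). The environment variables and their defaults come from the Dockerfile above and can be overridden at run time with -e:

# Build the image from the repository root
docker build -t mcp-file-context-server .

# Run the server, overriding the cache and file-size defaults
docker run --rm \
  -e MAX_CACHE_SIZE=2000 \
  -e CACHE_TTL=1800000 \
  -e MAX_FILE_SIZE=2097152 \
  mcp-file-context-server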

MCP directory API

We provide all the information about MCP servers via our MCP directory API. For example, to fetch this server's entry:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-file-context-server'
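
To inspect the response from the command line, you can pipe it through jq; this sketch assumes the endpoint returns JSON, whose exact shape is not shown here:

curl -s 'https://glama.ai/api/mcp/v1/servers/bsmi021/mcp-file-context-server' | jq .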

If you have feedback or need assistance with the MCP directory API, please join our Discord server.