
MCP LLMS.txt Explorer

```dockerfile
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
FROM node:lts-alpine

# Create app directory
WORKDIR /app

# Install pnpm
RUN npm install -g pnpm

# Copy package files
COPY package.json pnpm-lock.yaml ./

# Install dependencies without running prepare scripts (we run build explicitly)
RUN pnpm install --frozen-lockfile --ignore-scripts

# Copy the rest of the source
COPY . .

# Build the project
RUN pnpm run build

# No EXPOSE needed: MCP communicates over stdio, not a network port

# Set the entry point
CMD [ "node", "build/index.js" ]
```

MCP directory API

All information about MCP servers is available via our MCP directory API. For example, to fetch this server's entry:

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/thedaviddias/mcp-llms-txt-explorer'
```
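The endpoint above appears to follow the pattern `https://glama.ai/api/mcp/v1/servers/<owner>/<server>`; that generic pattern, and the helper name below, are assumptions beyond the single URL shown. A minimal Python sketch for building such URLs:

```python
import urllib.parse

# Base path taken from the curl example above.
BASE_URL = "https://glama.ai/api/mcp/v1/servers"

def server_url(owner: str, server: str) -> str:
    """Build an MCP directory API URL for a server slug.

    Assumes the <owner>/<server> path pattern generalizes beyond the
    one example URL in this page. Each segment is URL-quoted defensively.
    """
    return f"{BASE_URL}/{urllib.parse.quote(owner, safe='')}/{urllib.parse.quote(server, safe='')}"

print(server_url("thedaviddias", "mcp-llms-txt-explorer"))
# → https://glama.ai/api/mcp/v1/servers/thedaviddias/mcp-llms-txt-explorer
```

The resulting URL can then be fetched with any HTTP client (as in the `curl` example above); the shape of the JSON response is not documented here, so it is not assumed.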

If you have feedback or need assistance with the MCP directory API, please join our Discord server.