
Inkeep MCP Server

Official
by inkeep
Dockerfile

```dockerfile
# Use a Python image with uv pre-installed
FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim

# Install the project into `/app`
WORKDIR /app

# Enable bytecode compilation
ENV UV_COMPILE_BYTECODE=1

# Copy from the cache instead of linking since it's a mounted volume
ENV UV_LINK_MODE=copy

COPY pyproject.toml pyproject.toml
COPY uv.lock uv.lock

# Install the project's dependencies using the lockfile and settings
RUN uv sync --frozen --no-install-project --no-dev

# Then, add the rest of the project source code and install it
# Installing separately from its dependencies allows optimal layer caching
ADD . /app
RUN uv sync --frozen --no-dev

# Place executables in the environment at the front of the path
ENV PATH="/app/.venv/bin:$PATH"

# Reset the entrypoint, don't invoke `uv`
ENTRYPOINT []

# Run the FastAPI application with uvicorn by default
# Uses `--host 0.0.0.0` to allow access from outside the container
CMD ["uvicorn", "--host", "0.0.0.0", "--port", "8080", "inkeep_mcp_server.app:app"]
```
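To use the Dockerfile above, build the image and run it with the container's port 8080 published to the host. This is a minimal sketch; the image tag `inkeep-mcp-server` is an arbitrary name chosen here, not one defined by the project.

```shell
# Build the image from the repository root (tag name is an assumption)
docker build -t inkeep-mcp-server .

# Run the server and expose it on localhost:8080
docker run --rm -p 8080:8080 inkeep-mcp-server
```

Because the Dockerfile sets `--host 0.0.0.0`, the server inside the container accepts connections forwarded from the host via `-p 8080:8080`.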

MCP directory API

We provide all the information about MCP servers via our MCP directory API. For example, to fetch this server's entry:

curl -X GET 'https://glama.ai/api/mcp/v1/servers/inkeep/mcp-server-python'
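The same request can be made from Python with the standard library. This is a minimal sketch: the `server_url` and `fetch_server` helpers are illustrative names, and the only assumption about the response is that it is JSON (the schema is not documented here).

```python
import json
import urllib.request

API_BASE = "https://glama.ai/api/mcp/v1/servers"


def server_url(owner: str, repo: str) -> str:
    """Build the directory API URL for a given MCP server."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_server(owner: str, repo: str) -> dict:
    """Fetch a server's metadata as JSON (response schema not documented here)."""
    with urllib.request.urlopen(server_url(owner, repo)) as resp:
        return json.load(resp)


# Example (requires network access):
#   info = fetch_server("inkeep", "mcp-server-python")
#   print(json.dumps(info, indent=2))
```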

If you have feedback or need assistance with the MCP directory API, please join our Discord server.