
Turbify Store MCP Server

by benpeke
Dockerfile
# Use Python 3.12 slim image as base and copy uv from official image
FROM python:3.12-slim-bookworm
COPY --from=ghcr.io/astral-sh/uv:0.8.4 /uv /uvx /bin/

# Install the project into `/app`
WORKDIR /app

# Enable bytecode compilation
ENV UV_COMPILE_BYTECODE=1

# Copy from the cache instead of linking since it's a mounted volume
ENV UV_LINK_MODE=copy

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PYTHONPATH=/app

# Install the project's dependencies using the lockfile and settings
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --locked --no-install-project --no-dev

# Then, add the rest of the project source code and install it
# Installing separately from its dependencies allows optimal layer caching
COPY . /app
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --locked --no-dev

# Expose port for HTTP transport (when DEPLOYMENT_MODE=http)
EXPOSE 8000

# Place executables in the environment at the front of the path
ENV PATH="/app/.venv/bin:$PATH"

# Reset the entrypoint, don't invoke `uv`
ENTRYPOINT []

# Run the server directly from the venv
CMD ["uv", "run", "turbify-mcp"]
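
The image builds and runs like any other Docker image. Below is a minimal sketch: the image tag turbify-mcp is an arbitrary choice, and the stdio default for the transport is an assumption; only the DEPLOYMENT_MODE=http variable and port 8000 come from the Dockerfile comments above.

# Build the image (tag name is arbitrary)
docker build -t turbify-mcp .

# Run with the default transport (assumed to be stdio), keeping stdin open for the MCP client
docker run -i --rm turbify-mcp

# Run with the HTTP transport, publishing the port the Dockerfile exposes
docker run --rm -e DEPLOYMENT_MODE=http -p 8000:8000 turbify-mcp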

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/benpeke/turbify_store_mcp'
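
For scripted use, the JSON response can be pretty-printed or piped into further tooling. A small sketch (the response schema is not documented here, so no specific fields are assumed):

curl -s -X GET 'https://glama.ai/api/mcp/v1/servers/benpeke/turbify_store_mcp' | python3 -m json.tool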

If you have feedback or need assistance with the MCP directory API, please join our Discord server.