
Raccoon AI MCP Server

Official
by raccoonaihq
Dockerfile (759 B)

FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim AS uv
WORKDIR /app
ENV UV_COMPILE_BYTECODE=1
ENV UV_LINK_MODE=copy
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --frozen --no-install-project --no-dev --no-editable
ADD . /app
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-dev --no-editable

FROM python:3.12-slim-bookworm
WORKDIR /app
COPY --from=uv --chown=app:app /app/.venv /app/.venv
ENV PATH="/app/.venv/bin:$PATH"
RUN pip install uv
ENV RACCOON_SECRET_KEY=<raccoon_secret_key>
ENV RACCOON_PASSCODE=<raccoon_passcode>
COPY . /app
CMD ["uv", "run", "-m", "raccoonai_mcp_server"]
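The Dockerfile above bakes placeholder credentials into the image; in practice you would supply them at run time instead. A minimal usage sketch (the image tag and credential values here are illustrative, not part of the listing):

```shell
# Build the image (tag name is an assumption, choose your own)
docker build -t raccoonai-mcp-server .

# Run the server, overriding the placeholder credentials at run time
docker run --rm \
  -e RACCOON_SECRET_KEY=your_secret_key \
  -e RACCOON_PASSCODE=your_passcode \
  raccoonai-mcp-server
```

Passing secrets with `-e` at run time avoids leaving real credentials in image layers.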

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/raccoonaihq/raccoonai-mcp-server'
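The same endpoint can be queried from code. A minimal Python sketch using only the standard library; the response schema is not documented in this listing, so the helper returns the decoded JSON as-is rather than assuming particular keys:

```python
import json
from urllib.request import urlopen

# Endpoint taken from the curl example above.
URL = "https://glama.ai/api/mcp/v1/servers/raccoonaihq/raccoonai-mcp-server"

def fetch_server_info(url: str = URL) -> dict:
    """Fetch this server's MCP directory entry and decode the JSON body.

    The exact response fields are not shown in the listing, so callers
    should inspect the returned dict rather than rely on fixed keys.
    """
    with urlopen(url) as resp:
        return json.load(resp)
```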

If you have feedback or need assistance with the MCP directory API, please join our Discord server.