
BigBugAI MCP Server

by bigbugAi
Dockerfile (1.08 kB)
# BigBugAI MCP - Docker image
# Builds a containerized MCP server that runs over STDIO by default.

FROM python:3.11-slim AS runtime

ENV PYTHONUNBUFFERED=1 \
    PYTHONDONTWRITEBYTECODE=1 \
    PIP_NO_CACHE_DIR=1

# System deps (keep minimal)
RUN apt-get update \
    && apt-get install -y --no-install-recommends ca-certificates \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

# Copy only what's needed to install the package
COPY pyproject.toml README.md uv.lock ./
COPY src ./src

# Install the package (and runtime deps declared in pyproject)
RUN pip install --upgrade pip \
    && pip install .

# OCI labels
LABEL org.opencontainers.image.source="https://github.com/bigbugAi/bigbugai-mcp" \
    org.opencontainers.image.licenses="MIT" \
    org.opencontainers.image.title="BigBugAI MCP" \
    org.opencontainers.image.description="MCP server exposing BigBugAI tools for trending tokens and token analysis."

# Default envs
ENV BTUNIFIED_API="https://api.bigbug.ai"

# Default entrypoint: STDIO MCP server
ENTRYPOINT ["python", "-m", "bigbugai_mcp.server_stdio"]
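Since the entrypoint is a STDIO MCP server, the container must be run with stdin kept open. A minimal build-and-run sketch (the image tag `bigbugai-mcp` is illustrative, not from the source):

```shell
# Build the image from the repository root (tag name is an example)
docker build -t bigbugai-mcp .

# Run the STDIO MCP server; -i keeps stdin attached for the MCP transport,
# and the default BTUNIFIED_API can be overridden with -e if needed
docker run -i --rm \
  -e BTUNIFIED_API="https://api.bigbug.ai" \
  bigbugai-mcp
```

An MCP client (e.g. one configured to spawn a command-based server) would typically invoke the `docker run -i --rm …` command itself rather than the user running it interactively.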

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/bigbugAi/bigbugai-mcp'
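The same endpoint can be queried programmatically. A minimal Python sketch using only the standard library; the helper names (`server_info_url`, `fetch_server_info`) are illustrative, not part of the API:

```python
import json
import urllib.request

BASE = "https://glama.ai/api/mcp/v1/servers"

def server_info_url(owner: str, repo: str) -> str:
    """Build the MCP directory API URL for a given server."""
    return f"{BASE}/{owner}/{repo}"

def fetch_server_info(owner: str, repo: str) -> dict:
    """GET the server's metadata from the Glama MCP directory API."""
    with urllib.request.urlopen(server_info_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Requires network access; prints the server's directory entry as JSON
    info = fetch_server_info("bigbugAi", "bigbugai-mcp")
    print(json.dumps(info, indent=2))
```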

If you have feedback or need assistance with the MCP directory API, please join our Discord server.