
Aurora-MCP

by ndaniel
Dockerfile (1.04 kB)
# syntax=docker/dockerfile:1
FROM python:3.12-slim

WORKDIR /app

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PIP_NO_CACHE_DIR=1

# System deps: build tools only if you really need to compile wheels
RUN apt-get update && apt-get install -y --no-install-recommends \
    build-essential curl ca-certificates git-lfs \
    && rm -rf /var/lib/apt/lists/*

# Copy and install Python deps first (better caching)
COPY requirements.txt .
RUN pip install --upgrade pip && pip install -r requirements.txt

# Add the app
COPY . .

# Ensure Git LFS assets (e.g., Nordic COCONUT dataset) are present
RUN git lfs install && git lfs pull

# Expose for local runs (HF sets $PORT; expose is informational)
EXPOSE 7860

# Hugging Face sets $PORT at runtime; honor it
ENV PORT=7860

# A health check is useful for Spaces
HEALTHCHECK --interval=30s --timeout=5s --start-period=10s --retries=5 \
    CMD curl -fsS http://127.0.0.1:${PORT}/healthz || exit 1

# Start the MCP HTTP server
CMD ["uvicorn","mcp_server.server:app","--host","0.0.0.0","--port","7860"]
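One caveat: the final CMD pins the port to 7860 even though the comments say Hugging Face sets $PORT at runtime (the HEALTHCHECK already reads it). A shell-form variant that actually honors $PORT, falling back to 7860, would look like this (a suggested adjustment, not part of the published Dockerfile):

```dockerfile
# Shell form is required so ${PORT} is expanded; defaults to 7860 when unset
CMD ["sh", "-c", "uvicorn mcp_server.server:app --host 0.0.0.0 --port ${PORT:-7860}"]
```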

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ndaniel/aurora-mcp'
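The same endpoint can be queried from Python with the standard library; a minimal sketch, assuming only the URL pattern shown in the curl command above (the shape of the JSON response is not documented here):

```python
import json
import urllib.request


def server_url(owner: str, name: str) -> str:
    # Build the MCP directory API URL for a given server, mirroring
    # the curl example: /api/mcp/v1/servers/<owner>/<name>
    return f"https://glama.ai/api/mcp/v1/servers/{owner}/{name}"


def fetch_server(owner: str, name: str) -> dict:
    # Performs the GET request; requires network access.
    with urllib.request.urlopen(server_url(owner, name)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(server_url("ndaniel", "aurora-mcp"))
```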

If you have feedback or need assistance with the MCP directory API, please join our Discord server.