MCP Brain Service

by jomapps
Dockerfile (722 B)
# syntax=docker/dockerfile:1
FROM python:3.12-slim AS base

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PIP_NO_CACHE_DIR=1 \
    PORT=8002

WORKDIR /app

# System deps (add as needed for libs that require compilation)
RUN apt-get update -y && apt-get install -y --no-install-recommends \
    ca-certificates build-essential curl && \
    rm -rf /var/lib/apt/lists/*

# Install Python deps
COPY requirements.txt ./
RUN pip install --upgrade pip && pip install -r requirements.txt

# Copy app source
COPY src ./src
COPY pyproject.toml ./

EXPOSE 8002

# Default command (production-friendly); dev compose overrides with --reload
CMD ["sh", "-c", "uvicorn src.main:app --host 0.0.0.0 --port ${PORT:-8002}"]
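
The CMD above starts the service with uvicorn, pointing at an ASGI application exposed as app in src/main.py. The service code itself is not shown on this page; the sketch below only illustrates that module-path layout, assuming a FastAPI app and a hypothetical /health route (neither is confirmed by the source).

# Minimal sketch of what src/main.py could look like for the
# "uvicorn src.main:app" command in the Dockerfile above.
# FastAPI and the /health route are assumptions, not taken from the source.
from fastapi import FastAPI

app = FastAPI(title="MCP Brain Service")

@app.get("/health")
def health() -> dict:
    # Hypothetical liveness endpoint, e.g. for container health checks.
    return {"status": "ok"}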

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/jomapps/mcp-brain-service'
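
For scripted access, the same endpoint can be queried from Python. This is a minimal sketch assuming only the URL from the curl example above; the shape of the returned JSON is not documented here, so the response is simply printed as-is.

# Fetch this server's entry from the Glama MCP directory API.
# Only the URL is taken from the page; the response schema is an assumption.
import json
import urllib.request

URL = "https://glama.ai/api/mcp/v1/servers/jomapps/mcp-brain-service"

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

# Pretty-print whatever fields the API returns.
print(json.dumps(data, indent=2))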

If you have feedback or need assistance with the MCP directory API, please join our Discord server.