
Substrate

by ivan-saorin
Dockerfile (809 B)
# Multi-stage build for substrate MCP server
FROM python:3.11-slim AS builder
WORKDIR /build

# Copy requirements first for better caching
COPY requirements.txt .
RUN pip install --no-cache-dir --user -r requirements.txt

# Production stage
FROM python:3.11-slim
WORKDIR /app

# Copy Python dependencies
COPY --from=builder /root/.local /root/.local

# Make sure scripts in .local are usable
ENV PATH=/root/.local/bin:$PATH

# Copy application code
COPY src/ ./src/
COPY main.py .

# Create data directory
RUN mkdir -p /app/data/refs

# Environment variables for instance configuration
ENV PYTHONUNBUFFERED=1
ENV MCP_TRANSPORT=stdio
ENV INSTANCE_TYPE=substrate
ENV INSTANCE_DESCRIPTION="Cognitive manipulation substrate"
ENV DATA_DIR=/app/data
ENV LOG_LEVEL=INFO

# Run the server
CMD ["python", "main.py"]
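
A minimal sketch of building and running this image locally, assuming the commands are issued from the repository root; the substrate-mcp tag and the host data path are illustrative, and the -e flag simply overrides a default already declared in the Dockerfile. Because MCP_TRANSPORT=stdio, the server communicates over stdin/stdout, so the container is run with -i instead of exposing a port:

# Build the image (tag is hypothetical)
docker build -t substrate-mcp .

# Run the server on stdio, persisting /app/data to the host and raising log verbosity
docker run -i --rm \
  -v "$(pwd)/data:/app/data" \
  -e LOG_LEVEL=DEBUG \
  substrate-mcp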

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ivan-saorin/substrate'
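To make the response easier to read or save, the same request can be piped through jq (this assumes jq is installed and that the endpoint returns JSON):

curl -s 'https://glama.ai/api/mcp/v1/servers/ivan-saorin/substrate' | jq .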

If you have feedback or need assistance with the MCP directory API, please join our Discord server.