
GPT Image MCP Server

by lansespirit
Dockerfile (1.78 kB)
# Multi-stage build for optimized production image
FROM python:3.11-slim as builder

# Install system dependencies
RUN apt-get update && apt-get install -y \
    curl \
    build-essential \
    && rm -rf /var/lib/apt/lists/*

# Install UV package manager
RUN pip install uv

# Set working directory
WORKDIR /app

# Copy dependency files
COPY pyproject.toml uv.lock README.md ./

# Install dependencies only (do not install the project itself for better caching)
# This avoids editable build of the local package before sources are copied
RUN uv sync --frozen --no-install-project

# Production stage
FROM python:3.11-slim

# Install runtime dependencies
RUN apt-get update && apt-get install -y \
    curl \
    && rm -rf /var/lib/apt/lists/*

# Create non-root user for security
RUN groupadd -r appuser && useradd -r -g appuser appuser

# Set working directory
WORKDIR /app

# Copy virtual environment from builder
COPY --from=builder /app/.venv /app/.venv

# Copy application code
COPY . .

# Create required directories with proper permissions
RUN mkdir -p /app/storage/images /app/storage/cache /app/storage/logs && \
    chown -R appuser:appuser /app

# Set environment variables for virtual environment activation
ENV VIRTUAL_ENV="/app/.venv"
ENV PATH="/app/.venv/bin:$PATH"
ENV PYTHONPATH="/app"
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Switch to non-root user
USER appuser

# Health check - simple HTTP endpoint check for streamable-http transport
HEALTHCHECK --interval=30s --timeout=10s --start-period=30s --retries=3 \
    CMD python -c "import sys, urllib.request; sys.exit(0 if urllib.request.urlopen('http://localhost:3001').getcode() == 200 else 1)" || exit 1

# Expose port
EXPOSE 3001

# Start command
CMD ["python", "-m", "image_gen_mcp.server"]
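The image exposes port 3001 for the streamable-http transport, so a typical build-and-run sequence looks like the sketch below. The image tag (gpt-image-mcp), the OPENAI_API_KEY variable, and the host-side storage path are assumptions for illustration; substitute the names, secrets, and paths your deployment actually uses.

# Build the image from the repository root (tag name is arbitrary)
docker build -t gpt-image-mcp .

# Run it, publishing the port that EXPOSE and the HEALTHCHECK use (3001)
# and persisting the storage directories the Dockerfile creates.
# OPENAI_API_KEY is an assumed variable name; pass whatever credentials
# the server actually expects. Note the mounted host directory must be
# writable by the container's non-root appuser.
docker run -d \
  --name gpt-image-mcp \
  -p 3001:3001 \
  -e OPENAI_API_KEY="sk-..." \
  -v "$(pwd)/storage:/app/storage" \
  gpt-image-mcp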

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/lansespirit/gpt-image-mcp'
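The endpoint returns JSON; piping the response through jq is a quick way to inspect it (jq here is just an illustration, not part of the directory API):

curl -s 'https://glama.ai/api/mcp/v1/servers/lansespirit/gpt-image-mcp' | jq .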

If you have feedback or need assistance with the MCP directory API, please join our Discord server.