
n8n-asistans

Dockerfile (521 B)

# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
FROM python:3.11-alpine

# Install build dependencies
RUN apk add --no-cache gcc musl-dev libffi-dev

# Set work directory
WORKDIR /app

# Copy project files
COPY . /app

# Upgrade pip and install dependencies
RUN pip install --upgrade pip \
    && pip install beautifulsoup4 httpx 'mcp[cli]>=1.5.0' python-dotenv

# Expose port if needed (MCP runs via stdio, so not mandatory)

# Command to run the MCP server
CMD ["python", "main.py"]
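Because the server communicates over stdio rather than a network port, the container is typically run interactively. A minimal sketch of building and running the image; the image tag n8n-assistant is an assumed name, and main.py is the entry point named in the Dockerfile above:

# Build the image from the repository root (tag name is an assumption)
docker build -t n8n-assistant .

# Run interactively so the MCP client can talk to the server over stdin/stdout
docker run -i --rm n8n-assistant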

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/onurpolat05/n8n-Assistant'
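The endpoint returns JSON describing the listed server. As a sketch, assuming a local Python installation, the response can be pretty-printed by piping it through a JSON formatter:

# Fetch the server record and format the JSON response for reading
curl -s 'https://glama.ai/api/mcp/v1/servers/onurpolat05/n8n-Assistant' | python -m json.tool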

If you have feedback or need assistance with the MCP directory API, please join our Discord server.