
n8n-MCP

by czlonkowski
MIT License
docker-compose.test-n8n.yml

# docker-compose.test-n8n.yml - Simple test setup for n8n integration
# Run n8n in Docker, n8n-mcp locally for faster testing
version: '3.8'

services:
  n8n:
    image: n8nio/n8n:latest
    container_name: n8n-test
    ports:
      - "5678:5678"
    environment:
      - N8N_BASIC_AUTH_ACTIVE=false
      - N8N_HOST=localhost
      - N8N_PORT=5678
      - N8N_PROTOCOL=http
      - NODE_ENV=development
      - N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true
    volumes:
      - n8n_test_data:/home/node/.n8n
    network_mode: "host" # Use host network for easy local testing

volumes:
  n8n_test_data:
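A typical workflow with this file (a sketch, assuming Docker Compose v2 is installed and the file sits in the current directory) is to start the test n8n instance in the background, wait for it to answer on the configured port, then run n8n-mcp locally against it:

```shell
# Start the n8n test container defined above
docker compose -f docker-compose.test-n8n.yml up -d

# Wait until n8n responds on port 5678 (host networking, as configured)
curl --retry 10 --retry-connrefused --silent --output /dev/null http://localhost:5678/

# ...run n8n-mcp locally against http://localhost:5678 ...

# Tear down the container; the n8n_test_data volume is kept unless you add -v
docker compose -f docker-compose.test-n8n.yml down
```

Note that with `network_mode: "host"` the `ports:` mapping is effectively ignored; n8n binds directly to port 5678 on the host.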

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/czlonkowski/n8n-mcp'
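The endpoint follows an `author/server` pattern, so other directory entries can be fetched the same way. A minimal sketch (the `glama_server_url` helper is ours for illustration, not part of the API; the response-parsing step assumes `jq` is installed):

```shell
# Build the directory URL for a given author/server pair,
# following the pattern shown in the curl example above.
glama_server_url() {
  printf 'https://glama.ai/api/mcp/v1/servers/%s/%s' "$1" "$2"
}

# The n8n-MCP entry on this page:
glama_server_url czlonkowski n8n-mcp
# → https://glama.ai/api/mcp/v1/servers/czlonkowski/n8n-mcp

# Fetch and pretty-print the server metadata (requires network access and jq):
# curl -s "$(glama_server_url czlonkowski n8n-mcp)" | jq .
```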

If you have feedback or need assistance with the MCP directory API, please join our Discord server.