Reddit MCP Server

by ozipi
docker-compose.yml (560 B)
version: '3.8'
services:
  mcp-server-full:
    build:
      context: .
      args:
        PORT: ${PORT:-3000}
    ports:
      - "${PORT:-3000}:${PORT:-3000}"
    env_file:
      - .env
    environment:
      - NODE_ENV=development
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:${PORT:-3000}/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    restart: unless-stopped
    volumes:
      # Optional: mount logs directory for debugging
      - ./logs:/app/logs
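The healthcheck above expects the server to answer on http://localhost:${PORT}/health inside the container. The server implementation is not shown on this page, so the following is only a minimal sketch of what such an endpoint could look like, assuming an Express-based Node.js server; the framework choice and route handler are assumptions, not part of this repository.

// health.ts — hypothetical sketch of a /health endpoint matching the compose healthcheck
import express from "express";

const app = express();
const port = Number(process.env.PORT ?? 3000);

// The compose healthcheck runs `wget --spider http://localhost:PORT/health`,
// so any 2xx response from this route marks the container as healthy.
app.get("/health", (_req, res) => {
  res.status(200).json({ status: "ok" });
});

app.listen(port, () => {
  console.log(`server listening on port ${port}`);
});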

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ozipi/brainloop-mcp-server-v2'
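If you are calling the directory API from code rather than the shell, the same request can be made with Node's built-in fetch (Node 18+). This is only an illustrative sketch; the response schema is not documented on this page, so the body is logged as-is.

// query-directory.ts — sketch of the same GET request in TypeScript
const res = await fetch(
  "https://glama.ai/api/mcp/v1/servers/ozipi/brainloop-mcp-server-v2"
);

if (!res.ok) {
  throw new Error(`Directory API request failed: ${res.status}`);
}

// Log the server metadata returned by the directory API.
console.log(await res.json());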

If you have feedback or need assistance with the MCP directory API, please join our Discord server.