Spider MCP

by Bosegluon2
docker-compose.yml
version: '3.8'

services:
  spider-mcp:
    build: .
    container_name: spider-mcp
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=production
      - PORT=3000
      - LOG_LEVEL=info
      - MAX_CONCURRENT_REQUESTS=3
      - REQUEST_TIMEOUT=15000
    volumes:
      - ./logs:/app/logs
      - ./.env:/app/.env:ro
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "node", "-e", "require('http').get('http://localhost:3000/api/health', (res) => { process.exit(res.statusCode === 200 ? 0 : 1) })"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    networks:
      - spider-network

  # Optional: Redis cache service
  redis:
    image: redis:7-alpine
    container_name: spider-mcp-redis
    ports:
      - "6379:6379"
    volumes:
      - redis_data:/data
    restart: unless-stopped
    networks:
      - spider-network

networks:
  spider-network:
    driver: bridge

volumes:
  redis_data:
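
As a quick usage sketch, the stack can be built, started, and checked against the /api/health endpoint referenced in the healthcheck above. This assumes Docker Compose v2 and a .env file in the project root (it is mounted read-only into the container):

# Build the image and start both services in the background
docker compose up -d --build

# Follow the spider-mcp container logs
docker compose logs -f spider-mcp

# Verify the health endpoint on the mapped port
curl http://localhost:3000/api/health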

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Bosegluon2/spider-mcp'
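
For scripting, the same request can be piped through a JSON formatter; this is a minimal sketch that assumes the endpoint returns JSON and that jq is installed locally:

# Fetch the directory entry silently and pretty-print the response
curl -s -X GET 'https://glama.ai/api/mcp/v1/servers/Bosegluon2/spider-mcp' | jq .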

If you have feedback or need assistance with the MCP directory API, please join our Discord server.