MCP Grareco

by iuill
docker-compose.yml (733 B)
services:
  mcp_grareco:
    build:
      context: .
      dockerfile: Dockerfile
    container_name: mcp_grareco
    ports:
      - "${PORT:-3000}:${PORT:-3000}"
    environment:
      - PORT=${PORT:-3000}
      - HOST=${HOST:-0.0.0.0}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - ANTHROPIC_MODEL=${ANTHROPIC_MODEL:-claude-3-sonnet-20240229}
      - ANTHROPIC_MAX_OUTPUT_TOKENS=${ANTHROPIC_MAX_OUTPUT_TOKENS:-8192}
      - ANTHROPIC_MAX_CONTEXT_WINDOW=${ANTHROPIC_MAX_CONTEXT_WINDOW:-200000}
      - ANTHROPIC_MAX_INPUT_TOKENS=${ANTHROPIC_MAX_INPUT_TOKENS:-50000}
      - GRARECO_OUTPUT_DIR=/app/output
      - NODE_ENV=production
    volumes:
      - ./output:/app/output
    restart: unless-stopped
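Only ANTHROPIC_API_KEY has no default in the compose file; every other variable falls back to the value shown above. A minimal sketch of a .env file placed next to docker-compose.yml, with placeholder values you would replace for your own setup:

.env

# Required: docker-compose.yml provides no default for this key
ANTHROPIC_API_KEY=sk-ant-your-key-here
# Optional overrides (compose defaults apply if these stay commented out)
# PORT=3000
# HOST=0.0.0.0
# ANTHROPIC_MODEL=claude-3-sonnet-20240229

Then build and start the container, which publishes the chosen port and mounts ./output for generated files:

docker compose up -d --build
docker compose logs -f mcp_grareco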

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/iuill/mcp_grareco'
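The same request can be scripted for inspection; a small sketch that pretty-prints this server's directory entry, assuming the endpoint returns JSON (the curl example above does not show the response format):

curl -s 'https://glama.ai/api/mcp/v1/servers/iuill/mcp_grareco' | jq .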

If you have feedback or need assistance with the MCP directory API, please join our Discord server.