
Metabase AI Assistant

docker-compose.yml (807 B)
version: '3.8'

services:
  metabase-ai-mcp:
    build: .
    container_name: metabase-ai-mcp-server
    restart: unless-stopped
    environment:
      - NODE_ENV=production
      - LOG_LEVEL=info
      - METABASE_URL=${METABASE_URL}
      - METABASE_USERNAME=${METABASE_USERNAME}
      - METABASE_PASSWORD=${METABASE_PASSWORD}
      - METABASE_API_KEY=${METABASE_API_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    volumes:
      - ./logs:/app/logs
      - ./.env:/app/.env:ro
    networks:
      - mcp-network
    healthcheck:
      test: ["CMD", "node", "-e", "console.log('Health check')"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s

networks:
  mcp-network:
    driver: bridge

volumes:
  mcp-logs:
    driver: local
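The compose file takes its credentials from environment variables and also mounts a local .env file read-only into the container at /app/.env. A minimal sketch of that file, assuming the variables listed in the compose file are the only ones the server needs; the values below are placeholders, not real credentials:

# .env (placeholders, replace with your own values)
METABASE_URL=https://metabase.example.com
METABASE_USERNAME=replace-with-your-username
METABASE_PASSWORD=replace-with-your-password
METABASE_API_KEY=replace-with-your-metabase-api-key
ANTHROPIC_API_KEY=replace-with-your-anthropic-api-key
OPENAI_API_KEY=replace-with-your-openai-api-key

With the .env file placed next to docker-compose.yml, the stack can be built and started with:

docker compose up -d --build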

MCP directory API

We provide all the information about MCP servers via our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/enessari/metabase-ai-assistant'
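Assuming the endpoint returns JSON (typical for a REST API under /api/), the response can be pretty-printed with jq for easier reading:

curl -s 'https://glama.ai/api/mcp/v1/servers/enessari/metabase-ai-assistant' | jq .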

If you have feedback or need assistance with the MCP directory API, please join our Discord server.