
MetaMCP MCP Server

by metatool-ai
Example docker-compose.yml for running the server:

services:
  mcp-server:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    env_file:
      - .env.production.local
    entrypoint: ["/bin/bash"]
    command: ["-c", "uvx --version && echo 'uvx is working!' && tail -f /dev/null"]
    healthcheck:
      # Shell form is required here so the pipe is interpreted by a shell
      test: ["CMD-SHELL", "ps aux | grep tail"]
      interval: 30s
      timeout: 10s
      retries: 3
    environment:
      - NODE_ENV=production
    restart: unless-stopped
    # Add any additional environment variables or command arguments here
    # command: --metamcp-api-key your-api-key --metamcp-api-base-url your-base-url
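The compose file loads its configuration from .env.production.local. Below is a minimal sketch of that file and the startup commands, assuming the server reads METAMCP_API_KEY and METAMCP_API_BASE_URL from the environment; the variable names are inferred from the commented CLI flags above and are not confirmed by this listing.

# .env.production.local (hypothetical variable names)
METAMCP_API_KEY=your-api-key
METAMCP_API_BASE_URL=your-base-url

Build and start the container, then follow its logs:

docker compose up --build -d
docker compose logs -f mcp-server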

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/metatool-ai/mcp-server-metamcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.