
Blockscout MCP Server

Official
docker-compose.yml (898 B)
services:
  mcp-server:
    image: ghcr.io/blockscout/mcp-server:latest
    command: python -m blockscout_mcp_server --http --rest --http-host 0.0.0.0 --http-port 8080
    ports:
      - "8080:8080"
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "python -c \"import urllib.request; urllib.request.urlopen('http://localhost:8080/health')\""]
      interval: 60s
      timeout: 5s
      retries: 5
      start_period: 30s
  evaluation:
    image: us-docker.pkg.dev/gemini-code-dev/gemini-cli/sandbox:${GEMINI_CLI_DOCKER_IMAGE_VERSION:-0.2.0}
    command: gemini --yolo
    volumes:
      - ${HOME}/.gemini:/home/node/.gemini
      - .:/workspace/mcp-server-evals
    working_dir: /workspace/mcp-server-evals
    stdin_open: true
    tty: true
    depends_on:
      mcp-server:
        condition: service_healthy
    environment:
      - MCP_SERVER_URL=http://mcp-server:8080
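With the stack running (docker compose up), the mcp-server container publishes its HTTP/REST interface on host port 8080 via the "8080:8080" mapping above. The following is a minimal readiness-check sketch from the host, mirroring the compose healthcheck; it assumes the /health endpoint answers with HTTP 200 once the server is up, and the function name wait_for_server is illustrative only:

import sys
import time
import urllib.request

# Health endpoint exposed through the compose port mapping above.
HEALTH_URL = "http://localhost:8080/health"

def wait_for_server(url: str = HEALTH_URL, attempts: int = 5, delay: float = 5.0) -> bool:
    """Poll the health endpoint until it returns 200 or attempts run out."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except OSError:
            pass  # server not ready yet; retry after a short delay
        time.sleep(delay)
    return False

if __name__ == "__main__":
    sys.exit(0 if wait_for_server() else 1)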

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/blockscout/mcp-server'
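The same lookup can be done programmatically. Below is a minimal Python sketch equivalent to the curl call above; it assumes the endpoint returns a JSON document describing the server (the exact response schema is not documented here):

import json
import urllib.request

# Fetch the Blockscout MCP server record from the Glama MCP directory API.
URL = "https://glama.ai/api/mcp/v1/servers/blockscout/mcp-server"

with urllib.request.urlopen(URL, timeout=10) as resp:
    server_info = json.load(resp)

# Print the raw record for inspection.
print(json.dumps(server_info, indent=2))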

If you have feedback or need assistance with the MCP directory API, please join our Discord server.