
BCI-MCP Server

by enkhbold470
docker-compose.yml

```yaml
version: '3'

services:
  bci-mcp-server:
    build: .
    container_name: bci-mcp-server
    ports:
      - "8765:8765"
    volumes:
      - ./recordings:/app/recordings
    restart: unless-stopped
    command: python src/main.py --server
    environment:
      - PYTHONUNBUFFERED=1

  docs:
    build: .
    container_name: bci-mcp-docs
    ports:
      - "8000:8000"
    volumes:
      - .:/app
    command: mkdocs serve -a 0.0.0.0:8000
    depends_on:
      - bci-mcp-server
    environment:
      - PYTHONUNBUFFERED=1
```

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/enkhbold470/bci-mcp'
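The same lookup can be done from Python using only the standard library. This is a minimal sketch built around the endpoint shown in the curl command above; the structure of the JSON response is an assumption, since the directory page does not document the schema:

```python
import json
import urllib.request

BASE_URL = "https://glama.ai/api/mcp/v1"


def server_endpoint(owner: str, repo: str) -> str:
    """Build the MCP directory API URL for a server identified by owner/repo."""
    return f"{BASE_URL}/servers/{owner}/{repo}"


def fetch_server_info(owner: str, repo: str) -> dict:
    """Fetch the directory's record for a server.

    Assumes the endpoint returns a JSON object; treat the result as
    opaque until you have inspected a real response.
    """
    with urllib.request.urlopen(server_endpoint(owner, repo)) as resp:
        return json.load(resp)


# Equivalent to the curl example above:
# info = fetch_server_info("enkhbold470", "bci-mcp")
```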

If you have feedback or need assistance with the MCP directory API, please join our Discord server.