cognee-mcp

docker-compose-helm.yml

```yaml
services:
  cognee:
    image: cognee-backend:latest
    container_name: cognee-backend
    networks:
      - cognee-network
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - .:/app
      - /app/cognee-frontend/ # Ignore frontend code
    environment:
      - HOST=0.0.0.0
      - ENVIRONMENT=local
      - PYTHONPATH=.
    ports:
      - 8000:8000
      # - 5678:5678 # Debugging
    deploy:
      resources:
        limits:
          cpus: '4.0'
          memory: 8GB

  postgres:
    image: pgvector/pgvector:pg17
    container_name: postgres
    environment:
      POSTGRES_USER: cognee
      POSTGRES_PASSWORD: cognee
      POSTGRES_DB: cognee_db
    volumes:
      - postgres_data:/var/lib/postgresql/data
    ports:
      - 5432:5432
    networks:
      - cognee-network

networks:
  cognee-network:
    name: cognee-network

volumes:
  postgres_data:
```
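Assuming the file is saved as docker-compose-helm.yml in the repository root (next to the Dockerfile it builds from), the stack can be brought up with the standard Compose workflow; the service names below come from the file itself:

```shell
# Build the cognee-backend image and start both services in the background
docker compose -f docker-compose-helm.yml up --build -d

# Follow the backend logs (service name "cognee" from the compose file)
docker compose -f docker-compose-helm.yml logs -f cognee

# Stop and remove the containers; the postgres_data volume is kept
docker compose -f docker-compose-helm.yml down
```

The backend is then reachable on port 8000 and Postgres (with pgvector) on 5432, per the port mappings above.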

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/topoteretes/cognee'
```
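The endpoint above follows a /servers/{owner}/{name} pattern (inferred from the example, not from published API docs). A minimal sketch that builds the URL from that assumed pattern and fetches the metadata, failing gracefully when offline:

```shell
# Server slug, split into owner and repository name (pattern assumed from the example URL)
OWNER=topoteretes
NAME=cognee
URL="https://glama.ai/api/mcp/v1/servers/${OWNER}/${NAME}"
echo "$URL"

# Fetch the metadata: -s silences the progress bar, -f makes curl
# exit non-zero on HTTP errors instead of printing the error body.
curl -sf "$URL" || echo "request failed (network or API unavailable)" >&2
```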

If you have feedback or need assistance with the MCP directory API, please join our Discord server.