docker-compose.yml
```yaml
version: '3.8'

services:
  qdrant:
    image: qdrant/qdrant:latest
    ports:
      - "6333:6333"
      - "6334:6334"
    volumes:
      - qdrant_data:/qdrant/storage
    environment:
      - QDRANT__SERVICE__GRPC_PORT=6334
    restart: unless-stopped

  embeddings:
    image: ghcr.io/huggingface/text-embeddings-inference:cpu-1.5
    command: --model-id nomic-ai/nomic-embed-text-v1.5 --port 80
    ports:
      - "8080:80"
    volumes:
      - embeddings_cache:/data
    environment:
      - HUGGING_FACE_HUB_TOKEN=${HUGGING_FACE_HUB_TOKEN:-}
    restart: unless-stopped

volumes:
  qdrant_data:
  embeddings_cache:
```
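The compose file brings up two HTTP services on the host: the text-embeddings-inference server on port 8080 (its `POST /embed` endpoint takes a JSON body with an `inputs` field) and Qdrant's REST API on port 6333 (points are upserted via `PUT /collections/{name}/points`). A minimal sketch of the two request payloads, assuming the default host ports above; the `docs` collection name is a hypothetical example, not something the compose file creates:

```python
import json

# Endpoints exposed by the compose file above (host ports 8080 and 6333).
TEI_EMBED_URL = "http://localhost:8080/embed"
# "docs" is a hypothetical collection name for illustration.
QDRANT_UPSERT_URL = "http://localhost:6333/collections/docs/points"

def embed_request(texts):
    """Body for TEI's POST /embed endpoint: {"inputs": [...]}."""
    return {"inputs": list(texts)}

def qdrant_upsert_request(ids, vectors):
    """Body for Qdrant's PUT /collections/{name}/points endpoint."""
    return {"points": [{"id": i, "vector": v} for i, v in zip(ids, vectors)]}

print(json.dumps(embed_request(["hello world"])))
# → {"inputs": ["hello world"]}
```

TEI returns a JSON array of embedding vectors in the same order as `inputs`; those vectors can be passed straight into `qdrant_upsert_request` alongside your own point IDs.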

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/docleaai/doclea-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.