
Files-DB-MCP

by randomm
docker-compose.ci.yml (1.7 kB)
version: '3.8'

services:
  vector-db:
    image: qdrant/qdrant:latest
    volumes:
      - qdrant_data:/qdrant/storage
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:6333/readiness"]
      interval: 5s
      timeout: 5s
      retries: 5
      start_period: 5s

  files-db-mcp:
    build:
      context: .
      dockerfile: ./Dockerfile
    depends_on:
      vector-db:
        condition: service_healthy
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 30s
      retries: 10
      start_period: 600s  # Give time for model downloads
    environment:
      - VECTOR_DB_HOST=vector-db
      - VECTOR_DB_PORT=6333
      - EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2
      - QUANTIZATION=true
      - BINARY_EMBEDDINGS=false
      - DEBUG=false
      - PROJECT_PATH=/test-project
      - IGNORE_PATTERNS=.git,node_modules,__pycache__,venv,dist,build,*.pyc
      - PORT=8000
      - FAST_STARTUP=true
    ports:
      - "3000:8000"  # Map container port 8000 to host port 3000
    volumes:
      - ./tests/fixtures/project:/test-project:ro
      - app_data:/app/data
      - model_cache:/root/.cache/huggingface/hub

  tests:
    build:
      context: .
      dockerfile: ./Dockerfile.test
    depends_on:
      files-db-mcp:
        condition: service_healthy
    environment:
      - VECTOR_DB_HOST=vector-db
      - VECTOR_DB_PORT=6333
      - MCP_HOST=files-db-mcp
      - MCP_PORT=3000
    volumes:
      - ./tests:/app/tests:ro
      - ./coverage:/app/coverage
    command: ["pytest", "--cov=src", "--cov-report=xml:/app/coverage/coverage.xml"]

volumes:
  qdrant_data:
  app_data:
  model_cache:
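The IGNORE_PATTERNS variable above passes a comma-separated mix of directory names (node_modules, __pycache__) and globs (*.pyc) to the indexer. The server's actual matching logic isn't shown on this page, but a minimal sketch of how such a list might be applied, using Python's fnmatch against each path component (function name is hypothetical, not part of files-db-mcp's API), could look like:

```python
from fnmatch import fnmatch

def should_ignore(path: str, patterns: str) -> bool:
    """Return True if any path component matches an ignore pattern.

    `patterns` is a comma-separated list, as in the IGNORE_PATTERNS
    value from docker-compose.ci.yml. Plain names (e.g. node_modules)
    match a whole component exactly; globs (e.g. *.pyc) match via
    fnmatch wildcard rules.
    """
    pats = [p.strip() for p in patterns.split(",") if p.strip()]
    return any(fnmatch(part, pat)
               for part in path.split("/")
               for pat in pats)

IGNORE = ".git,node_modules,__pycache__,venv,dist,build,*.pyc"

print(should_ignore("src/app/main.py", IGNORE))                       # False
print(should_ignore("node_modules/lodash/index.js", IGNORE))          # True
print(should_ignore("src/__pycache__/main.cpython-311.pyc", IGNORE))  # True
```

Matching per path component (rather than against the full path) is one plausible reading of how a bare directory name like node_modules can exclude everything beneath it; the real server may differ.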
