
cognee-mcp

entrypoint.sh (2.08 kB)
#!/bin/bash
set -e  # Exit on error

echo "Debug mode: $DEBUG"
echo "Environment: $ENVIRONMENT"

# Set default ports if not specified
DEBUG_PORT=${DEBUG_PORT:-5678}
HTTP_PORT=${HTTP_PORT:-8000}

echo "Debug port: $DEBUG_PORT"
echo "HTTP port: $HTTP_PORT"

# Run Alembic migrations with proper error handling.
#
# Note on UserAlreadyExists error handling:
# During database migrations, we attempt to create a default user. If this user
# already exists (e.g., from a previous deployment or migration), it is not a
# critical error and should not prevent the application from starting. This is
# different from other migration errors, which could indicate database schema
# inconsistencies and should cause the startup to fail. This check allows for
# smooth redeployments and container restarts while maintaining data integrity.
echo "Running database migrations..."

# Guard the command substitution with `|| ...` so that `set -e` does not abort
# the script before the exit code can be inspected, and capture stderr as well,
# since Alembic writes its errors there.
MIGRATION_EXIT_CODE=0
MIGRATION_OUTPUT=$(alembic upgrade head 2>&1) || MIGRATION_EXIT_CODE=$?

if [[ $MIGRATION_EXIT_CODE -ne 0 ]]; then
  if [[ "$MIGRATION_OUTPUT" == *"UserAlreadyExists"* ]] || [[ "$MIGRATION_OUTPUT" == *"User default_user@example.com already exists"* ]]; then
    echo "Warning: Default user already exists, continuing startup..."
  else
    echo "Migration failed with unexpected error."
    echo "$MIGRATION_OUTPUT"
    exit 1
  fi
fi

echo "Database migrations done."
echo "Starting server..."

# Add startup delay to ensure DB is ready
sleep 2

# Gunicorn startup with environment-specific options
if [ "$ENVIRONMENT" = "dev" ] || [ "$ENVIRONMENT" = "local" ]; then
  if [ "$DEBUG" = "true" ]; then
    echo "Waiting for the debugger to attach..."
    debugpy --wait-for-client --listen 0.0.0.0:$DEBUG_PORT -m gunicorn -w 1 -k uvicorn.workers.UvicornWorker -t 30000 --bind=0.0.0.0:$HTTP_PORT --log-level debug --reload cognee.api.client:app
  else
    gunicorn -w 1 -k uvicorn.workers.UvicornWorker -t 30000 --bind=0.0.0.0:$HTTP_PORT --log-level debug --reload cognee.api.client:app
  fi
else
  gunicorn -w 1 -k uvicorn.workers.UvicornWorker -t 30000 --bind=0.0.0.0:$HTTP_PORT --log-level error cognee.api.client:app
fi
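A note on the migration error handling pattern above: under `set -e`, a plain `VAR=$(failing-command)` assignment aborts the whole script before its exit code can be examined, so the "tolerate UserAlreadyExists, fail on anything else" logic needs the assignment guarded. A minimal sketch of that pattern, using a hypothetical `fake_migrate` function in place of `alembic upgrade head`:

```shell
#!/bin/bash
set -e

# Hypothetical stand-in for a migration command that fails with a
# recognizable message on stderr.
fake_migrate() {
  echo "UserAlreadyExists: User default_user@example.com already exists" >&2
  return 1
}

# The `|| MIGRATION_EXIT_CODE=$?` guard keeps `set -e` from killing the
# script, and 2>&1 folds stderr into the captured output so the string
# match below can see the error message.
MIGRATION_EXIT_CODE=0
MIGRATION_OUTPUT=$(fake_migrate 2>&1) || MIGRATION_EXIT_CODE=$?

if [[ $MIGRATION_EXIT_CODE -ne 0 ]]; then
  if [[ "$MIGRATION_OUTPUT" == *"UserAlreadyExists"* ]]; then
    echo "Warning: Default user already exists, continuing startup..."
  else
    echo "Migration failed with unexpected error."
    exit 1
  fi
fi
echo "Continuing startup."
```

Note that `if ! VAR=$(cmd); then CODE=$?` would not work here: the `!` negates the pipeline status, so `$?` inside the branch would be 0, losing the real exit code.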

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/topoteretes/cognee'
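Judging from the shape of the example URL (a `/servers/{owner}/{name}` path, which is an assumption here, not documented on this page), the endpoint can be parameterized for other servers, e.g. with a small hypothetical helper:

```shell
# Build the MCP directory API URL for a given owner/server pair.
# The /servers/{owner}/{name} path shape is inferred from the curl
# example above; mcp_server_url is a hypothetical helper name.
mcp_server_url() {
  local owner="$1" name="$2"
  echo "https://glama.ai/api/mcp/v1/servers/${owner}/${name}"
}

url=$(mcp_server_url topoteretes cognee)
echo "$url"
```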

If you have feedback or need assistance with the MCP directory API, please join our Discord server