
Continuo Memory System

by GtOkAi
run_memory_server.sh (1.17 kB)
#!/bin/bash
# Script to run MCP Memory Server
# Usage: ./run_memory_server.sh [local|openai] [api-key] [db-path]

set -e

PROVIDER=${1:-local}
API_KEY=$2
DB_PATH=${3:-./cursor_memory_db}

echo "=================================================="
echo "🚀 Starting MCP Memory Server"
echo "=================================================="
echo "Provider: $PROVIDER"
echo "DB Path: $DB_PATH"
echo "=================================================="

# Locate venv Python
if [ -f "venv_memory/bin/python" ]; then
    PYTHON_BIN="venv_memory/bin/python"
elif [ -f "venv_memory/bin/python3" ]; then
    PYTHON_BIN="venv_memory/bin/python3"
else
    echo "❌ Virtual environment not found!"
    echo "Run: ./scripts/setup_memory.sh"
    exit 1
fi

echo "Python: $PYTHON_BIN"
echo ""

# Build command
CMD="$PYTHON_BIN continuo/mcp_memory_server.py --provider $PROVIDER --db-path $DB_PATH"

if [ "$PROVIDER" = "openai" ]; then
    if [ -z "$API_KEY" ]; then
        echo "❌ API key required for provider=openai"
        echo "Usage: $0 openai YOUR_API_KEY"
        exit 1
    fi
    CMD="$CMD --api-key $API_KEY"
fi

echo "Running: $CMD"
echo ""

# Run server
exec $CMD
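
A quick usage sketch, assuming the virtual environment has already been created by ./scripts/setup_memory.sh; the API key and database path below are placeholders:

# Local embedding provider, default database path (./cursor_memory_db)
./run_memory_server.sh local

# OpenAI embedding provider with a custom database path
# (YOUR_OPENAI_API_KEY and ./my_memory_db are placeholders)
./run_memory_server.sh openai YOUR_OPENAI_API_KEY ./my_memory_db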

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/GtOkAi/continuo-memory-mcp'
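
Assuming the endpoint returns JSON (not confirmed here), you can pretty-print the response or extract fields with jq, for example:

curl -s 'https://glama.ai/api/mcp/v1/servers/GtOkAi/continuo-memory-mcp' | jq .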

If you have feedback or need assistance with the MCP directory API, please join our Discord server.