
TxtAi Memory Vector Server

dockerstart.sh (682 B)
#!/bin/bash

# Get the project root from this script's location
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
PROJECT_ROOT="$(dirname "$SCRIPT_DIR")"

echo "Starting txtai-assistant in Docker..."

# Create necessary directories
mkdir -p "$PROJECT_ROOT/logs"
mkdir -p "$PROJECT_ROOT/data"

# Install Python dependencies (no venv)
pip install --upgrade pip
pip install -r "$PROJECT_ROOT/server/requirements.txt"

# Use .env.template if .env is missing
if [ ! -f "$PROJECT_ROOT/.env" ]; then
    echo "No .env file found — copying from template."
    cp "$PROJECT_ROOT/.env.template" "$PROJECT_ROOT/.env"
fi

# Launch the app
cd "$PROJECT_ROOT/server"
python main.py
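The script's first two lines resolve the project root relative to the script file itself, so it works no matter which directory you invoke it from. A minimal sketch of that idiom (the `/tmp/demo-project` path and `where.sh` script are illustrative, not part of the repository):

```shell
# Demonstrate the SCRIPT_DIR / PROJECT_ROOT resolution used in dockerstart.sh:
# SCRIPT_DIR is the directory containing the script; PROJECT_ROOT is its parent.
mkdir -p /tmp/demo-project/scripts
cat > /tmp/demo-project/scripts/where.sh <<'EOF'
#!/bin/bash
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
PROJECT_ROOT="$(dirname "$SCRIPT_DIR")"
echo "$PROJECT_ROOT"
EOF
chmod +x /tmp/demo-project/scripts/where.sh

# Invoking from anywhere still reports the project root.
cd /
/tmp/demo-project/scripts/where.sh   # prints /tmp/demo-project
```

Because `mkdir -p "$PROJECT_ROOT/logs"` and the `cp` of `.env.template` both build paths from `PROJECT_ROOT`, this resolution step is what keeps those paths correct when the script is run from outside the repo.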

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/rmtech1/txtai-assistant-mcp'
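The same lookup can be scripted and the response inspected with standard tools. A minimal sketch using the endpoint from the curl example above (the use of `jq` and the presence of any particular JSON fields in the response are assumptions):

```shell
# Fetch the directory entry for this server and pretty-print the JSON.
# Requires curl and jq; falls back to raw output if jq is unavailable.
URL="https://glama.ai/api/mcp/v1/servers/rmtech1/txtai-assistant-mcp"

if command -v jq >/dev/null 2>&1; then
    curl -sf "$URL" | jq .
else
    curl -sf "$URL"
fi
```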

If you have feedback or need assistance with the MCP directory API, please join our Discord server.