Server Configuration
Environment variables used to configure the server. None are strictly required for a basic local setup; defaults apply where shown. A sample configuration follows the table.
| Name | Required | Description | Default |
|---|---|---|---|
| MEMORA_DB_PATH | No | Local SQLite database path | ~/.local/share/memora/memories.db |
| MEMORA_STORAGE_URI | No | Cloud storage URI for S3/R2 (e.g., s3://bucket/memories.db) | |
| MEMORA_CACHE_DIR | No | Local cache directory for the cloud-synced database | |
| MEMORA_CLOUD_ENCRYPT | No | Encrypt the database before uploading to cloud storage (true/false) | |
| MEMORA_CLOUD_COMPRESS | No | Compress the database before uploading to cloud storage (true/false) | |
| AWS_PROFILE | No | AWS credentials profile from ~/.aws/credentials (useful for R2) | |
| AWS_ENDPOINT_URL | No | S3-compatible endpoint for R2/MinIO | |
| R2_PUBLIC_DOMAIN | No | Public domain for R2 image URLs | |
| MEMORA_EMBEDDING_MODEL | No | Embedding backend: tfidf, sentence-transformers, or openai | tfidf |
| OPENAI_API_KEY | No | API key for OpenAI embeddings (required when using the openai backend) | |
| OPENAI_EMBEDDING_MODEL | No | OpenAI embedding model | text-embedding-3-small |
| SENTENCE_TRANSFORMERS_MODEL | No | Model name for the sentence-transformers backend | all-MiniLM-L6-v2 |
| MEMORA_TAGS | No | Comma-separated list of allowed tags | |
| MEMORA_TAG_FILE | No | Path to a file containing allowed tags (one per line) | |
| MEMORA_ALLOW_ANY_TAG | No | Allow any tag without validating against the allowlist (set to 1 to enable) | |
| MEMORA_GRAPH_PORT | No | Port for the knowledge graph visualization server | 8765 |
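As a rough sketch of how these variables fit together, the Python snippet below assembles an environment for a cloud-synced R2 setup with OpenAI embeddings and a tag allowlist, then launches the server as a subprocess. Only the variable names come from the table above; the bucket, endpoint, profile name, and the `memora` launch command are placeholders assumed for illustration.

```python
# Sketch: configure a cloud-synced R2 deployment with OpenAI embeddings.
# All values below are illustrative placeholders, not documented defaults.
import os
import subprocess

env = {
    **os.environ,
    # Cloud sync: keep the SQLite database in an R2 bucket, cache it locally
    "MEMORA_STORAGE_URI": "s3://my-memora-bucket/memories.db",
    "MEMORA_CACHE_DIR": os.path.expanduser("~/.cache/memora"),
    "AWS_PROFILE": "r2",
    "AWS_ENDPOINT_URL": "https://<account-id>.r2.cloudflarestorage.com",
    "MEMORA_CLOUD_COMPRESS": "true",
    # Embeddings: switch from the default tfidf backend to OpenAI
    "MEMORA_EMBEDDING_MODEL": "openai",
    # Tags: restrict memories to an explicit allowlist
    "MEMORA_TAGS": "work,personal,ideas",
}

# The openai backend needs an API key; fail fast if it is missing.
if "OPENAI_API_KEY" not in env:
    raise RuntimeError("OPENAI_API_KEY must be set when MEMORA_EMBEDDING_MODEL=openai")

# Hypothetical entry point; replace with the actual command that starts the server.
subprocess.run(["memora"], env=env, check=True)
```

With no variables set at all, the server falls back to a local SQLite database at ~/.local/share/memora/memories.db and the tfidf embedding backend.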
Tools
Functions exposed to the LLM to take actions.
This server exposes no tools.
Prompts
Interactive templates invoked by user choice.
This server provides no prompts.
Resources
Contextual data attached and managed by the client.
This server provides no resources.