
Server Configuration

Describes the environment variables used to configure the server; all are optional.

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| AWS_PROFILE | No | AWS credentials profile from ~/.aws/credentials (useful for R2) | |
| MEMORA_TAGS | No | Comma-separated list of allowed tags | |
| MEMORA_DB_PATH | No | Local SQLite database path | ~/.local/share/memora/memories.db |
| OPENAI_API_KEY | No | API key for OpenAI embeddings (required when using the openai backend) | |
| MEMORA_TAG_FILE | No | Path to a file containing allowed tags (one per line) | |
| AWS_ENDPOINT_URL | No | S3-compatible endpoint for R2/MinIO | |
| MEMORA_CACHE_DIR | No | Local cache directory for the cloud-synced database | |
| R2_PUBLIC_DOMAIN | No | Public domain for R2 image URLs | |
| MEMORA_GRAPH_PORT | No | Port for the knowledge graph visualization server | 8765 |
| MEMORA_STORAGE_URI | No | Cloud storage URI for S3/R2 (e.g., s3://bucket/memories.db) | |
| MEMORA_ALLOW_ANY_TAG | No | Allow any tag without validation against the allowlist (1 to enable) | |
| MEMORA_CLOUD_ENCRYPT | No | Encrypt the database before uploading to cloud (true/false) | |
| MEMORA_CLOUD_COMPRESS | No | Compress the database before uploading to cloud (true/false) | |
| MEMORA_EMBEDDING_MODEL | No | Embedding backend: tfidf, sentence-transformers, or openai | tfidf |
| OPENAI_EMBEDDING_MODEL | No | OpenAI embedding model | text-embedding-3-small |
| SENTENCE_TRANSFORMERS_MODEL | No | Model for sentence-transformers | all-MiniLM-L6-v2 |
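As a sketch of how these variables combine, the following configures a cloud-synced setup with the openai embedding backend. The bucket name, profile name, endpoint URL, and API key are placeholders, not values from this listing:

```shell
# Hypothetical Memora configuration: R2-backed storage + OpenAI embeddings.
export MEMORA_STORAGE_URI='s3://example-bucket/memories.db'         # placeholder bucket
export AWS_PROFILE='r2'                                             # profile in ~/.aws/credentials
export AWS_ENDPOINT_URL='https://example.r2.cloudflarestorage.com'  # placeholder endpoint
export MEMORA_CACHE_DIR="$HOME/.cache/memora"                       # local cache for the synced DB
export MEMORA_CLOUD_COMPRESS='true'                                 # compress before upload
export MEMORA_EMBEDDING_MODEL='openai'                              # switch off the tfidf default
export OPENAI_API_KEY='sk-placeholder'                              # a real key is required at runtime
export OPENAI_EMBEDDING_MODEL='text-embedding-3-small'              # the documented default
```

Leaving MEMORA_STORAGE_URI unset keeps the database local at MEMORA_DB_PATH (default ~/.local/share/memora/memories.db), in which case none of the AWS variables are needed.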

Tools

Functions exposed to the LLM to take actions

No tools

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/spokV/memora'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.