Server Configuration
Describes the environment variables used to configure the server. None are strictly required, although OPENAI_API_KEY must be set when the OpenAI embedding provider is selected. A configuration-loading sketch follows the table.
| Name | Required | Description | Default |
|---|---|---|---|
| CHUNK_SIZE | No | Text chunk size | 4096 |
| CHROMA_PATH | No | Vector database storage path | chroma_db |
| EMBED_MODEL | No | Ollama embedding model name | nomic-embed-text |
| DOWNLOAD_DIR | No | Directory for downloaded files | downloads |
| CHUNK_OVERLAP | No | Overlap between chunks | 409 |
| OPENAI_API_KEY | No | OpenAI API key for vector embeddings (required when EMBEDDING_PROVIDER is `openai`) | |
| COLLECTION_NAME | No | ChromaDB collection name | documents |
| OLLAMA_BASE_URL | No | Ollama base URL | http://localhost:11434 |
| EMBEDDING_PROVIDER | No | Embedding provider (`openai` or `ollama`) | openai |
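The exact loading mechanism used by the server is not shown here; as a minimal, illustrative sketch (assuming a Python process reading plain environment variables), the settings above could be consumed like this, with the names and defaults taken directly from the table. The `CONFIG` dictionary and the final validation check are assumptions for illustration, not the server's actual code.

```python
import os

# Illustrative only: read each variable with the default listed in the table.
CONFIG = {
    "EMBEDDING_PROVIDER": os.getenv("EMBEDDING_PROVIDER", "openai"),
    "OPENAI_API_KEY": os.getenv("OPENAI_API_KEY", ""),
    "OLLAMA_BASE_URL": os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
    "EMBED_MODEL": os.getenv("EMBED_MODEL", "nomic-embed-text"),
    "CHROMA_PATH": os.getenv("CHROMA_PATH", "chroma_db"),
    "COLLECTION_NAME": os.getenv("COLLECTION_NAME", "documents"),
    "DOWNLOAD_DIR": os.getenv("DOWNLOAD_DIR", "downloads"),
    "CHUNK_SIZE": int(os.getenv("CHUNK_SIZE", "4096")),
    "CHUNK_OVERLAP": int(os.getenv("CHUNK_OVERLAP", "409")),
}

# Per the table, an OpenAI key is only needed when the OpenAI provider is used.
if CONFIG["EMBEDDING_PROVIDER"] == "openai" and not CONFIG["OPENAI_API_KEY"]:
    raise RuntimeError("OPENAI_API_KEY must be set when EMBEDDING_PROVIDER is 'openai'")
```

With `EMBEDDING_PROVIDER=ollama`, embeddings would instead come from the Ollama instance at `OLLAMA_BASE_URL` using `EMBED_MODEL`, and `OPENAI_API_KEY` can stay unset.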
Tools
Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
No tools | |
Prompts
Interactive templates invoked by user choice.
| Name | Description |
|---|---|
No prompts | |
Resources
Contextual data attached and managed by the client.
| Name | Description |
|---|---|
No resources | |