# Server Configuration

This section describes the environment variables used to configure the server. Only `WATCH_FOLDERS` is required; every other variable has a default.
| Name | Required | Description | Default |
|---|---|---|---|
| WATCH_FOLDERS | Yes | Comma-separated list of folders to monitor for documents | (none) |
| LLM_MODEL | No | Ollama model used for summarization | llama3.2:3b |
| EMBEDDING_MODEL | No | Embedding model (sentence-transformers) | all-MiniLM-L6-v2 |
| OLLAMA_BASE_URL | No | Base URL of the Ollama API | http://localhost:11434 |
| LANCEDB_PATH | No | Path to the LanceDB storage directory | ./vector_index |
| CHUNK_SIZE | No | Size of each text chunk produced during indexing | 1000 |
| CHUNK_OVERLAP | No | Overlap between consecutive text chunks | 200 |
| FILE_EXTENSIONS | No | Comma-separated list of file extensions to index | .pdf,.docx,.doc,.txt,.md,.rtf |
| MAX_FILE_SIZE_MB | No | Maximum size of a file to index, in MB | 100 |
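A minimal example configuration covering the variables above. The folder paths are illustrative, and the optional variables simply restate their documented defaults, so in practice only `WATCH_FOLDERS` needs to be set:

```shell
# Required: folders to monitor (comma-separated; example paths)
export WATCH_FOLDERS="/home/user/Documents,/home/user/Notes"

# Optional: these exports repeat the documented defaults
export LLM_MODEL="llama3.2:3b"
export EMBEDDING_MODEL="all-MiniLM-L6-v2"
export OLLAMA_BASE_URL="http://localhost:11434"
export LANCEDB_PATH="./vector_index"
export CHUNK_SIZE="1000"
export CHUNK_OVERLAP="200"
export FILE_EXTENSIONS=".pdf,.docx,.doc,.txt,.md,.rtf"
export MAX_FILE_SIZE_MB="100"
```

The same key=value pairs can go in a `.env` file if your process manager loads one.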
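To illustrate how `CHUNK_SIZE` and `CHUNK_OVERLAP` interact, here is a sliding-window sketch. It assumes character-based splitting with a fixed overlap — a common strategy, not necessarily the server's actual implementation:

```python
def chunk_text(text: str, chunk_size: int = 1000, chunk_overlap: int = 200) -> list[str]:
    """Split text into chunks of up to chunk_size characters, where each
    chunk repeats the last chunk_overlap characters of the previous one.

    Illustrative sketch only; the server's real chunker may differ.
    """
    if chunk_overlap >= chunk_size:
        raise ValueError("CHUNK_OVERLAP must be smaller than CHUNK_SIZE")
    step = chunk_size - chunk_overlap  # window advances by this many characters
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - chunk_overlap, 1), step)]
```

With the defaults, each chunk shares its first 200 characters with the tail of the previous chunk, so sentences cut at a boundary still appear whole in at least one chunk.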
# Schema
## Prompts

Interactive templates invoked by user choice.

| Name | Description |
|---|---|
| No prompts | |
## Resources

Contextual data attached and managed by the client.

| Name | Description |
|---|---|
| No resources | |
## Tools

Functions exposed to the LLM to take actions.

| Name | Description |
|---|---|
| No tools | |