Server Configuration
The server is configured through the following environment variables. Only NEO4J_PASSWORD is required; all other variables fall back to the defaults listed below.
| Name | Required | Description | Default |
|---|---|---|---|
| NEO4J_URI | No | Neo4j database connection URI | bolt://localhost:7687 |
| NEO4J_USER | No | Neo4j database username | neo4j |
| NEO4J_PASSWORD | Yes | Neo4j database password | |
| NEO4J_DATABASE | No | Neo4j database name | neo4j |
| LLM_PROVIDER | No | LLM provider (ollama, openai, gemini, openrouter) | ollama |
| OLLAMA_HOST | No | Ollama server host URL | http://localhost:11434 |
| OLLAMA_MODEL | No | Ollama model to use | llama3.2 |
| EMBEDDING_PROVIDER | No | Embedding provider (ollama, openai, gemini, openrouter) | ollama |
| OLLAMA_EMBEDDING_MODEL | No | Ollama embedding model to use | nomic-embed-text |
| VECTOR_DIMENSION | No | Vector dimension for embeddings | 384 |
| CHUNK_SIZE | No | Chunk size for document processing | 512 |
| CHUNK_OVERLAP | No | Chunk overlap for document processing | 50 |
| TOP_K | No | Number of top results to return | 5 |
| MCP_PORT | No | Port for MCP SSE service | 8000 |
| WEB_UI_PORT | No | Port for Web UI and REST API | 8080 |
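For example, a minimal local setup only needs to export the required password and any defaults you want to override. The password value below is a placeholder; the other values shown simply restate the documented defaults:

```bash
# Required: Neo4j credentials (placeholder value; use your own password)
export NEO4J_PASSWORD="change-me"

# Optional overrides; these match the defaults listed above
export NEO4J_URI="bolt://localhost:7687"
export NEO4J_USER="neo4j"
export LLM_PROVIDER="ollama"
export OLLAMA_MODEL="llama3.2"
export MCP_PORT=8000
export WEB_UI_PORT=8080
```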
Capabilities
Server capabilities have not been inspected yet.
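If the server is running, its capabilities can be listed with a short client script. Below is a minimal sketch using the official `mcp` Python SDK; it assumes the SSE endpoint is served at `/sse` on the MCP_PORT default, so adjust the URL to your deployment:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # SSE endpoint path is assumed to be /sse; port matches the MCP_PORT default.
    async with sse_client("http://localhost:8000/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # list_prompts() and list_resources() work the same way.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


asyncio.run(main())
```

The tables below can be filled in once the server has been inspected this way.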
Tools
Functions exposed to the LLM to take actions
| Name | Description |
|---|---|
| No tools | |
Prompts
Interactive templates invoked by user choice
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client
| Name | Description |
|---|---|
| No resources | |