Server Configuration
Describes the environment variables used to configure the server. Only LLM_MODEL_KEY is required; all other variables fall back to the defaults listed below.
| Name | Required | Description | Default |
|---|---|---|---|
| GROUP_ID | No | Tenant identifier. | default |
| NEO4J_URI | No | Connection URI for the Neo4j database. | bolt://localhost:7687 |
| NEO4J_USER | No | Username for the Neo4j database. | neo4j |
| NEO4J_PASSWORD | No | Password for the Neo4j database. | password |
| LLM_MODEL_KEY | Yes | API key for the LLM model. | |
| LLM_MODEL_URL | No | Base URL for the OpenAI-compatible LLM API. | https://api.openai.com/v1 |
| LLM_MODEL_NAME | No | Model name used for graph search and general processing. | gpt-4o-mini |
| RERANK_MODEL_NAME | No | Model name used for reranking search results. | gpt-4.1-nano |
| EMBEDDING_MODEL_KEY | No | API key for the embedding model. | dummy |
| EMBEDDING_MODEL_URL | No | Base URL for the OpenAI-compatible Embedding API. | http://host.docker.internal:11434/v1 |
| EMBEDDING_MODEL_NAME | No | Model name for generating embeddings. | kun432/cl-nagoya-ruri-large:latest |
| CHUNK_SIZE_MIN | No | Minimum size for document chunks. | 200 |
| CHUNK_SIZE_MAX | No | Maximum size for document chunks. | 2000 |
| CHUNK_OVERLAP | No | Overlap size between document chunks. | 0 |
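As a quick reference, the snippet below is a minimal sketch of how these variables could be supplied, assuming the server reads them from its process environment (for example via a .env file or container environment). The values are illustrative placeholders taken from the defaults above; only LLM_MODEL_KEY must be set, and the API key shown is not a real key.

```
# Required: API key for the OpenAI-compatible LLM endpoint (placeholder value)
LLM_MODEL_KEY=sk-your-api-key

# Optional overrides; omit any of these to use the documented defaults
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=password
LLM_MODEL_NAME=gpt-4o-mini
RERANK_MODEL_NAME=gpt-4.1-nano
EMBEDDING_MODEL_URL=http://host.docker.internal:11434/v1
EMBEDDING_MODEL_NAME=kun432/cl-nagoya-ruri-large:latest
CHUNK_SIZE_MIN=200
CHUNK_SIZE_MAX=2000
CHUNK_OVERLAP=0
```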
Capabilities
Server capabilities have not been inspected yet.
Tools
Functions exposed to the LLM to take actions.
| Name | Description |
|---|---|
| No tools | |
Prompts
Interactive templates invoked by user choice.
| Name | Description |
|---|---|
| No prompts | |
Resources
Contextual data attached and managed by the client.
| Name | Description |
|---|---|
| No resources | |