## Server Configuration

Describes the environment variables used to configure the server; a sketch of the endpoint-resolution logic follows the table.

| Name | Required | Description | Default |
|---|---|---|---|
| OLLAMA_BASE_URL | No | The Ollama endpoint URL | http://localhost:11434 |
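
OLLAMA_BASE_URL is optional, so the server presumably falls back to the documented default when the variable is unset. A minimal sketch of that resolution step, assuming a Python implementation (illustrative only, not the server's actual code):

```python
import os

# Resolve the Ollama endpoint; fall back to the documented default
# when OLLAMA_BASE_URL is not set in the environment.
OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
```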

## Schema

### Prompts

Interactive templates invoked by user choice

| Name | Description |
|---|---|
| No prompts | |

### Resources

Contextual data attached and managed by the client

| Name | Description |
|---|---|
| No resources | |

### Tools

Functions exposed to the LLM to take actions; illustrative sketches of each tool follow the table

| Name | Description |
|---|---|
| consult_ollama | Send a prompt to an Ollama model and return its response, providing a second viewpoint for reasoning. |
| list_ollama_models | List all available Ollama models on the local instance. |
| compare_ollama_models | Run the same prompt against multiple Ollama models and return their outputs side-by-side for comparison. |
| remember_consult | Store the result of a consult into a local memory store (or configured memory service). |
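
These tools map naturally onto Ollama's public REST API: `GET /api/tags` enumerates installed models, and `POST /api/generate` produces a single completion. The sketch below is not this server's actual implementation; it only illustrates what each tool plausibly does, using Python and `requests`. The JSON-lines memory file is an assumption standing in for the "local memory store (or configured memory service)" that remember_consult targets.

```python
import json
import os
from pathlib import Path

import requests

BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
MEMORY_PATH = Path("consult_memory.jsonl")  # hypothetical local memory store

def list_ollama_models() -> list[str]:
    """Return the names of all models installed on the local Ollama instance."""
    resp = requests.get(f"{BASE_URL}/api/tags", timeout=30)
    resp.raise_for_status()
    return [m["name"] for m in resp.json()["models"]]

def consult_ollama(model: str, prompt: str) -> str:
    """Send a prompt to one Ollama model and return its full, non-streamed reply."""
    resp = requests.post(
        f"{BASE_URL}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

def compare_ollama_models(models: list[str], prompt: str) -> dict[str, str]:
    """Run one prompt against several models; map each model name to its reply."""
    return {model: consult_ollama(model, prompt) for model in models}

def remember_consult(model: str, prompt: str, response: str) -> None:
    """Append a consult result to a local JSON-lines file (illustrative memory store)."""
    record = {"model": model, "prompt": prompt, "response": response}
    with MEMORY_PATH.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```

A typical flow would call list_ollama_models first, then consult_ollama or compare_ollama_models, and finally remember_consult to persist any result worth keeping.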