## Server Configuration

Describes the environment variables required to run the server.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| No arguments | | | |
## Schema

### Prompts

Interactive templates invoked by user choice.

| Name | Description |
|------|-------------|
| No prompts | |
### Resources

Contextual data attached to and managed by the client.

| Name | Description |
|------|-------------|
| No resources | |
### Tools

Functions exposed to the LLM so it can take actions.

| Name | Description |
|------|-------------|
| `query_local_ai` | Query a local AI model via Ollama for reasoning assistance |
| `reasoning_assist` | Structured reasoning assistance for complex problems |
| `model_list` | List available local AI models |
| `hybrid_analysis` | Hybrid local+cloud analysis of complex data |
| `token_efficient_reasoning` | Delegate heavy reasoning to local AI to conserve cloud tokens |
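Tools like these are invoked over MCP's JSON-RPC 2.0 transport with the `tools/call` method. As a minimal sketch, the snippet below constructs such a request for `query_local_ai`; the `prompt` argument name is an assumption here — the actual parameter names are defined by this server's tool schema, which a client can discover via `tools/list`.

```python
import json


def build_tool_call(tool_name, arguments, request_id=1):
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }


# "prompt" is a hypothetical argument name; check the server's tool
# schema (via tools/list) for the real input shape.
request = build_tool_call(
    "query_local_ai",
    {"prompt": "Summarize the trade-offs of local vs. cloud inference."},
)
print(json.dumps(request, indent=2))
```

In practice an MCP client library sends this message over the configured transport (stdio or HTTP) and returns the tool's result; the sketch only shows the wire-level shape of the call.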