# query_local_ai
Query local AI models via Ollama for help with reasoning tasks. Send a prompt, optionally tune the model and temperature, and receive a response geared toward architecture-focused problem-solving.
## Instructions
Query local AI model via Ollama for reasoning assistance
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | No | Model name | `architecture-reasoning:latest` |
| prompt | Yes | The reasoning prompt to send to the local AI model | |
| temperature | No | Sampling temperature (0.1-1.0) | 0.6 |
## Input Schema (JSON Schema)
```json
{
  "properties": {
    "model": {
      "default": "architecture-reasoning:latest",
      "description": "Model name (default: architecture-reasoning:latest)",
      "type": "string"
    },
    "prompt": {
      "description": "The reasoning prompt to send to local AI",
      "type": "string"
    },
    "temperature": {
      "default": 0.6,
      "description": "Temperature for response (0.1-1.0)",
      "type": "number"
    }
  },
  "required": [
    "prompt"
  ],
  "type": "object"
}
```
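The schema above maps directly onto Ollama's HTTP API. The following is a minimal sketch of how a client might invoke this tool, assuming a local Ollama server on its default port (`http://localhost:11434`) and the standard `/api/generate` endpoint; the helper names `build_payload` and `query_local_ai` are illustrative, not part of the tool itself:

```python
import json
import urllib.request

# Default Ollama endpoint (assumption: server running locally on port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="architecture-reasoning:latest", temperature=0.6):
    """Build a request body matching the tool's input schema."""
    if not prompt:
        raise ValueError("'prompt' is required")
    if not 0.1 <= temperature <= 1.0:
        raise ValueError("temperature must be in the range 0.1-1.0")
    return {
        "model": model,
        "prompt": prompt,
        "options": {"temperature": temperature},
        "stream": False,  # request one complete response instead of chunks
    }

def query_local_ai(prompt, **kwargs):
    """POST the payload to the local Ollama server and return the response text."""
    body = json.dumps(build_payload(prompt, **kwargs)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `"stream": false`, Ollama returns a single JSON object whose `response` field holds the generated text, so the helper can extract it directly instead of reassembling chunks.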