query_local_ai
Query a local AI model through Ollama for reasoning assistance: send a prompt, optionally tune model parameters, and receive contextual output for architecture-focused problem solving.
Instructions
Query a local AI model via Ollama for reasoning assistance.
Input Schema
| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model | No | Ollama model name | architecture-reasoning:latest |
| prompt | Yes | The reasoning prompt to send to the local AI | |
| temperature | No | Sampling temperature (0.1-1.0) | |
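A minimal client sketch for this schema, assuming the tool talks to Ollama's standard HTTP API (`POST /api/generate` on the default port 11434). The helper names `build_payload` and `query_local_ai` are illustrative, not part of the tool's actual implementation; the defaults and range check mirror the table above.

```python
"""Hypothetical client for a query_local_ai-style tool backed by Ollama."""
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_payload(prompt, model="architecture-reasoning:latest", temperature=None):
    """Assemble the request body from the schema fields; temperature is
    optional and checked against the documented 0.1-1.0 range."""
    if not prompt:
        raise ValueError("prompt is required")
    payload = {"model": model, "prompt": prompt, "stream": False}
    if temperature is not None:
        if not 0.1 <= temperature <= 1.0:
            raise ValueError("temperature must be in 0.1-1.0")
        payload["options"] = {"temperature": temperature}
    return payload

def query_local_ai(prompt, model="architecture-reasoning:latest", temperature=None):
    """POST the prompt to the local Ollama server and return the response text."""
    body = json.dumps(build_payload(prompt, model, temperature)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Omitting `temperature` leaves the `options` key out entirely, so the model falls back to its server-side default rather than an arbitrary client-side value.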