consult_ollama
Consult local Ollama models to generate responses from alternative viewpoints by providing prompts and optional system instructions.
Instructions
Consult an Ollama model with a prompt (and an optional system prompt) and get its response back, bringing in reasoning from another model's viewpoint.
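If this tool is exposed through an MCP server, which the input-schema layout suggests, a client would invoke it with a standard `tools/call` request along the lines of the sketch below. Since JSON does not allow comments, note here that the request `id`, the model name `llama3`, and the prompt text are illustrative assumptions, not values the tool requires.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consult_ollama",
    "arguments": {
      "model": "llama3",
      "prompt": "Summarize the trade-offs of running a local model for code review."
    }
  }
}
```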
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | Yes | Name of the local Ollama model to consult. | |
| prompt | Yes | Prompt to send to the model. | |
| system_prompt | No | Optional system instructions for the model. | |
Input Schema (JSON Schema)
{
  "properties": {
    "model": {
      "type": "string"
    },
    "prompt": {
      "type": "string"
    },
    "system_prompt": {
      "type": "string"
    }
  },
  "required": [
    "model",
    "prompt"
  ],
  "type": "object"
}
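Per the schema, only `model` and `prompt` are required, while `system_prompt` is optional. A minimal example of a valid `arguments` object follows; the model name, prompt, and system prompt wording are purely illustrative.

```json
{
  "model": "llama3",
  "prompt": "Review this plan and point out risks I may have missed.",
  "system_prompt": "You are a skeptical reviewer who challenges assumptions."
}
```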