# local_llm_chat
Chat with local Ollama AI models using natural language queries for private, offline conversations and tasks.
## Instructions
Chat with a local Ollama model
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| message | Yes | Message to send to the model | |
| model | No | Model name; uses the first available model if not specified | |
| temperature | No | Generation temperature, 0.0-1.0 | 0.7 |
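
The sketch below shows one way a request matching this schema might map onto Ollama's local REST API. It is a minimal illustration, not the tool's actual implementation: the `local_llm_chat` function name is hypothetical, and it assumes an Ollama server running on the default endpoint `http://localhost:11434`.

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama endpoint (assumption)


def local_llm_chat(message: str, model: str | None = None, temperature: float = 0.7) -> str:
    """Send a single chat message to a local Ollama model and return the reply."""
    if model is None:
        # Mirror the schema's default: pick the first locally available model.
        tags = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10).json()
        models = tags.get("models", [])
        if not models:
            raise RuntimeError("No local models found; pull one with `ollama pull <name>`.")
        model = models[0]["name"]

    # Non-streaming chat request; temperature is passed through Ollama's options.
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": message}],
            "options": {"temperature": temperature},
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


if __name__ == "__main__":
    print(local_llm_chat("Summarize why local inference helps privacy.", temperature=0.2))
```

Because everything runs against the local Ollama server, no message content leaves the machine, which is what makes the tool suitable for private, offline use.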