# start_ollama_server
Starts the Ollama server, enabling local LLM management and interaction, when it is not already running.
## Instructions

Attempts to start the Ollama server if it is not already running.
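The check-then-start behavior described above can be sketched in Python. This is a minimal sketch, not the tool's actual implementation: it assumes the server listens on Ollama's default API port (11434) and that the `ollama` binary is on `PATH`.

```python
import socket
import subprocess

OLLAMA_HOST = "127.0.0.1"
OLLAMA_PORT = 11434  # Ollama's default API port (assumption)


def is_ollama_running(timeout: float = 1.0) -> bool:
    """Return True if something is listening on the Ollama API port."""
    try:
        with socket.create_connection((OLLAMA_HOST, OLLAMA_PORT), timeout=timeout):
            return True
    except OSError:
        return False


def start_ollama_server() -> bool:
    """Start `ollama serve` in the background if it is not already running.

    Returns True if the server is running or was just launched,
    False if the `ollama` binary could not be found.
    """
    if is_ollama_running():
        return True
    try:
        # Launch detached so the server outlives this process.
        subprocess.Popen(
            ["ollama", "serve"],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
    except FileNotFoundError:
        return False  # `ollama` is not installed or not on PATH
    return True
```

A real implementation would likely also poll the port after launching to confirm the server came up before reporting success.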
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| *No arguments* | | | |