switch_backend
Change the active LLM backend for AI task routing. Specify a backend ID to switch between backends such as Ollama, llama.cpp, or Gemini for processing tasks.
Instructions
Switch the active LLM backend.
Args:
  backend_id: ID of the backend to switch to (from settings.json)
Returns:
  Confirmation message with current status
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| backend_id | Yes | ID of the backend to switch to (from settings.json) | |
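
Example
A minimal sketch of calling this tool from an MCP client using the official Python SDK. The server launch command (`python server.py`) and the backend ID `"ollama"` are assumptions for illustration; substitute the command that starts your server and a backend ID defined in your settings.json.

```python
# Sketch: invoke switch_backend over stdio with the MCP Python SDK.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Placeholder launch command; point this at your actual server entrypoint.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "ollama" is a hypothetical backend_id from settings.json.
            result = await session.call_tool(
                "switch_backend", {"backend_id": "ollama"}
            )
            # The tool returns a confirmation message with the current status.
            print(result.content)


asyncio.run(main())
```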