switch_backend

Change the active LLM backend used for AI task routing. Specify a backend ID to switch between configured backends, whether local runtimes such as Ollama and llama.cpp or remote providers such as Gemini.

Instructions

Switch the active LLM backend.

Args:
    backend_id: ID of the backend to switch to (from settings.json)

Returns:
    Confirmation message with current status

Input Schema

| Name       | Required | Description                                          | Default |
| ---------- | -------- | ---------------------------------------------------- | ------- |
| backend_id | Yes      | ID of the backend to switch to (from settings.json)  |         |
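Below is a minimal sketch of invoking this tool from an MCP client using the official MCP Python SDK. The server launch command (`delia`) and the backend ID (`ollama`) are assumptions for illustration only; substitute your actual launch command and an ID defined in your settings.json.

```python
# Sketch: call switch_backend over stdio via the MCP Python SDK.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Hypothetical launch command for the delia MCP server.
    params = StdioServerParameters(command="delia", args=[])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # The tool takes a single required argument, backend_id.
            result = await session.call_tool(
                "switch_backend",
                arguments={"backend_id": "ollama"},  # assumed ID; use one from settings.json
            )
            # The tool returns a confirmation message with the current status.
            print(result.content)


asyncio.run(main())
```

On success, the confirmation message reporting the now-active backend arrives in `result.content`.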
