ollama_list_models

Retrieve available Ollama AI models for integration with macOS applications through natural language commands.

Instructions

Gets list of available Ollama models

Input Schema


No arguments
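
Since the tool takes no parameters, clients invoke it with an empty `arguments` object. A minimal sketch of the JSON-RPC `tools/call` request an MCP client would send — the `id` and framing here are illustrative, not taken from this server's documentation:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ollama_list_models",
    "arguments": {}
  }
}
```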

Implementation Reference

  • TypeScript handler function that fetches the list of available Ollama models from the /api/tags endpoint, formats them with their sizes, and returns a structured response. Handles connection errors and empty model lists. (A sample of the raw /api/tags payload appears after this list.)
```typescript
private async ollamaListModels() {
  try {
    const response = await fetch(`${OLLAMA_API_URL}/api/tags`, {
      method: "GET",
      headers: {
        "Content-Type": "application/json",
      },
    });

    if (!response.ok) {
      const errorText = await response.text();
      throw new Error(`Ollama API error: ${response.status} ${errorText}`);
    }

    const data = (await response.json()) as {
      models?: Array<{ name: string; size: number }>;
    };
    const models = data.models || [];

    if (models.length === 0) {
      return {
        content: [
          {
            type: "text",
            text: "No available models. Load a model: ollama pull llama3.2",
          },
        ],
      };
    }

    const modelList = models
      .map((m) => `- ${m.name} (${(m.size / 1024 / 1024 / 1024).toFixed(2)} GB)`)
      .join("\n");

    return {
      content: [
        {
          type: "text",
          text: `Available Ollama models:\n${modelList}`,
        },
      ],
    };
  } catch (error) {
    if (error instanceof TypeError && error.message.includes("fetch")) {
      throw new Error(
        `Failed to connect to Ollama server (${OLLAMA_API_URL}). Make sure Ollama is running: ollama serve`
      );
    }
    throw new Error(
      `Error getting list of models: ${error instanceof Error ? error.message : String(error)}`
    );
  }
}
```
  • Python handler function that fetches the list of available Ollama models from the /api/tags endpoint using requests, formats them with their sizes, and returns a formatted string. Handles connection and other errors. (A usage sketch for this function follows the list.)
```python
def ollama_list_models() -> str:
    """Gets list of Ollama models"""
    try:
        response = requests.get(f"{OLLAMA_API_URL}/api/tags", timeout=10)
        response.raise_for_status()

        data = response.json()
        models = data.get("models", [])

        if not models:
            return "No available models. Load a model: ollama pull llama3.2"

        model_list = "\n".join(
            [
                f"- {model['name']} ({(model.get('size', 0) / 1024 / 1024 / 1024):.2f} GB)"
                for model in models
            ]
        )
        return f"Available Ollama models:\n{model_list}"
    except requests.exceptions.ConnectionError:
        raise Exception(
            f"Failed to connect to Ollama server ({OLLAMA_API_URL}). "
            "Make sure Ollama is running: ollama serve"
        )
    except Exception as e:
        raise Exception(f"Error getting list of models: {str(e)}")
```
  • Input schema definition for the ollama_list_models tool in the tools list, indicating no input parameters required.
    name: "ollama_list_models", description: "Получает список доступных моделей Ollama", inputSchema: { type: "object", properties: {}, }, },
  • Input schema definition for the ollama_list_models tool in the get_tools() list, indicating no input parameters required.
    "name": "ollama_list_models", "description": "Gets list of available Ollama models", "inputSchema": { "type": "object", "properties": {}, },
  • src/index.ts:335-336 (registration)
    Dispatch/registration in the switch statement that routes tool calls to ollama_list_models by invoking the handler method. (A hypothetical Python-side equivalent is sketched after this list.)
    case "ollama_list_models": return await this.ollamaListModels();
