ollama_list_models
Retrieve available Ollama models to identify and select AI models for use with the MCP Mac Apps Server, which controls macOS applications through natural language commands.
Instructions
Gets the list of available Ollama models.
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| *No arguments* | | | |
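Because the input schema declares no properties, a `tools/call` request for this tool carries an empty `arguments` object. A minimal sketch of the JSON-RPC payload an MCP client would send (the `id` value is arbitrary):

```python
import json

# Minimal MCP tools/call request for this tool; the empty "arguments"
# object matches the empty input schema above.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "ollama_list_models", "arguments": {}},
}
print(json.dumps(request))
```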
Implementation Reference
- src/index.ts:561-613 (handler) — Handler that lists available Ollama models by querying the `/api/tags` endpoint, formatting model names and sizes, and handling connection errors. (User-facing strings translated from Russian.)

```typescript
private async ollamaListModels() {
  try {
    const response = await fetch(`${OLLAMA_API_URL}/api/tags`, {
      method: "GET",
      headers: {
        "Content-Type": "application/json",
      },
    });

    if (!response.ok) {
      const errorText = await response.text();
      throw new Error(`Ollama API error: ${response.status} ${errorText}`);
    }

    const data = (await response.json()) as {
      models?: Array<{ name: string; size: number }>;
    };
    const models = data.models || [];

    if (models.length === 0) {
      return {
        content: [
          {
            type: "text",
            text: "No models available. Pull a model: ollama pull llama3.2",
          },
        ],
      };
    }

    const modelList = models
      .map((m) => `- ${m.name} (${(m.size / 1024 / 1024 / 1024).toFixed(2)} GB)`)
      .join("\n");

    return {
      content: [
        {
          type: "text",
          text: `Available Ollama models:\n${modelList}`,
        },
      ],
    };
  } catch (error) {
    if (error instanceof TypeError && error.message.includes("fetch")) {
      throw new Error(
        `Failed to connect to Ollama server (${OLLAMA_API_URL}). Make sure Ollama is running: ollama serve`
      );
    }
    throw new Error(
      `Error getting list of models: ${error instanceof Error ? error.message : String(error)}`
    );
  }
}
```
- src/server.py:387-412 (handler) — Handler that lists available Ollama models by querying the `/api/tags` endpoint with `requests`, formatting model names and sizes, and handling errors.

```python
def ollama_list_models() -> str:
    """Gets list of Ollama models"""
    try:
        response = requests.get(f"{OLLAMA_API_URL}/api/tags", timeout=10)
        response.raise_for_status()

        data = response.json()
        models = data.get("models", [])

        if not models:
            return "No available models. Load a model: ollama pull llama3.2"

        model_list = "\n".join(
            [
                f"- {model['name']} ({(model.get('size', 0) / 1024 / 1024 / 1024):.2f} GB)"
                for model in models
            ]
        )

        return f"Available Ollama models:\n{model_list}"
    except requests.exceptions.ConnectionError:
        raise Exception(
            f"Failed to connect to Ollama server ({OLLAMA_API_URL}). "
            "Make sure Ollama is running: ollama serve"
        )
    except Exception as e:
        raise Exception(f"Error getting list of models: {str(e)}")
```
- src/index.ts:149-156 (schema) — Tool schema definition: name, description, and an empty input schema (no parameters required). (Description string translated from Russian.)

```typescript
{
  name: "ollama_list_models",
  description: "Gets list of available Ollama models",
  inputSchema: {
    type: "object",
    properties: {},
  },
},
```
- src/server.py:136-143 (schema) — Tool schema definition: name, description, and an empty input schema (no parameters required).

```python
{
    "name": "ollama_list_models",
    "description": "Gets list of available Ollama models",
    "inputSchema": {
        "type": "object",
        "properties": {},
    },
},
```
- src/index.ts:335-337 (registration) — Case in the tool dispatch `switch` statement mapping the tool name to the handler call.

```typescript
case "ollama_list_models":
  return await this.ollamaListModels();
```
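Both handlers convert the raw `size` field from `/api/tags` (reported in bytes) into gigabytes with two decimal places. A minimal sketch of that per-model formatting, isolated from the Ollama call itself:

```python
def format_model_line(name: str, size_bytes: int) -> str:
    # Mirrors the handlers above: divide the byte count by 1024^3
    # and render with two decimal places, e.g. "- llama3.2 (1.88 GB)".
    gb = size_bytes / 1024 / 1024 / 1024
    return f"- {name} ({gb:.2f} GB)"

print(format_model_line("llama3.2", 2_019_393_189))
```

Note this uses binary (1024-based) units while labeling them "GB", matching the source; a stricter label would be "GiB".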