Atomic-Germ

MCP Ollama Consult Server

list_ollama_models

Retrieve available Ollama models from your local instance to select models for AI consultation and reasoning tasks.

Instructions

List all available Ollama models on the local instance.

Input Schema

Name | Required | Description | Default

No arguments — the input schema is an empty object.
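Because the tool takes no arguments, a client's call needs only the tool name. A minimal sketch of the JSON-RPC 2.0 request an MCP client would send (field names follow the MCP tools/call convention; the `id` value is arbitrary):

```typescript
// Minimal JSON-RPC 2.0 request for calling list_ollama_models.
// The tool accepts no arguments, so `arguments` is an empty object.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "list_ollama_models",
    arguments: {},
  },
};

console.log(JSON.stringify(request, null, 2));
```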

Implementation Reference

  • The main handler for the 'list_ollama_models' tool. It retrieves available models via ModelValidator, handles the empty-list and error cases, and returns a structured JSON response containing the model list, a count, and a note.

    ```typescript
    case 'list_ollama_models': {
      try {
        const available = await this.modelValidator.getAvailableModels();
        if (available.length === 0) {
          return {
            content: [
              {
                type: 'text',
                text: 'No models available. Please install a model locally or use a cloud-based model (e.g., qwen2.5-coder:7b-cloud)',
              },
            ],
            isError: true,
          };
        }
        const modelNames = available.map((m) => m.name);
        return {
          content: [
            {
              type: 'text',
              text: JSON.stringify(
                {
                  models: modelNames,
                  count: modelNames.length,
                  note: 'These are available models (installed locally or cloud-based)',
                },
                null,
                2
              ),
            },
          ],
        };
      } catch (error) {
        const message = error instanceof Error ? error.message : 'Failed to list models';
        return {
          content: [
            {
              type: 'text',
              text: `Error: ${message}`,
            },
          ],
          isError: true,
        };
      }
    }
    ```
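On success, the handler above returns a single text content item whose `text` field is a JSON string. A client can recover the model list like this — the payload shape mirrors the handler, but the sample model names here are illustrative, not real output:

```typescript
// Sketch: parse the JSON text payload produced by list_ollama_models.
// The interface mirrors the object the handler serializes; the model
// names below are made up for illustration.
interface ModelListPayload {
  models: string[];
  count: number;
  note: string;
}

const sampleText = JSON.stringify({
  models: ["llama3.2:3b", "qwen2.5-coder:7b-cloud"],
  count: 2,
  note: "These are available models (installed locally or cloud-based)",
});

const payload = JSON.parse(sampleText) as ModelListPayload;
console.log(payload.models.join(", "));
```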
  • Core utility that queries Ollama's /api/tags endpoint, filters to safe models (cloud-based or locally installed), and returns the structured AvailableModel array consumed by the handler.

    ```typescript
    async getAvailableModels(): Promise<AvailableModel[]> {
      try {
        const url = this.config.getApiUrl('/api/tags');
        const response = await fetch(url, {
          method: 'GET',
          signal: AbortSignal.timeout(this.config.getTimeout()),
        });
        if (!response.ok) {
          throw new OllamaError(
            `Failed to fetch models: ${response.statusText}`,
            'LIST_MODELS_FAILED'
          );
        }
        const text = await response.text();
        const data = JSON.parse(text) as { models?: any[] };
        const models = data.models || [];
        const available = models
          .filter((m) => this.modelIsSafe(m))
          .map((m) => ({
            name: m.name,
            installed: !this.looksLikeCloudModel(m.name),
            isCloud: this.looksLikeCloudModel(m.name),
          }));
        return available;
      } catch (error) {
        if (error instanceof OllamaError) throw error;
        throw new OllamaError(
          `Failed to fetch available models: ${error instanceof Error ? error.message : 'Unknown error'}`,
          'CONNECTION_FAILED'
        );
      }
    }
    ```
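The helpers `modelIsSafe` and `looksLikeCloudModel` are referenced above but not shown. A plausible sketch of the cloud-model heuristic, assuming cloud variants are tagged with a `-cloud` suffix as in the example name `qwen2.5-coder:7b-cloud` — this is an assumption, not the server's verified implementation:

```typescript
// Hypothetical heuristic: treat a model as cloud-based when its name
// carries a "-cloud" marker (e.g. "qwen2.5-coder:7b-cloud").
// This is a sketch; the actual server logic may differ.
function looksLikeCloudModel(name: string): boolean {
  return name.endsWith("-cloud") || name.includes("-cloud:");
}

console.log(looksLikeCloudModel("qwen2.5-coder:7b-cloud")); // true
console.log(looksLikeCloudModel("llama3.2:3b")); // false
```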
  • Tool registration entry in the ListToolsHandler's listTools method, including name, description, and an empty input schema (no arguments required).

    ```typescript
    {
      name: 'list_ollama_models',
      description: 'List all available Ollama models on the local system (installed or cloud-based)',
      inputSchema: {
        type: 'object',
        properties: {},
        required: [],
      },
    },
    ```
  • Input schema definition for list_ollama_models: an empty object with no required properties.

    ```typescript
    inputSchema: {
      type: 'object',
      properties: {},
      required: [],
    }
    ```
  • Legacy handler implementation in src/handlers.ts (imported as ./handlers.js in src/index.ts): a simple axios call to /api/tags that returns a comma-separated model list.

    ```typescript
    case "list_ollama_models": {
      try {
        const response = await axios.get(`${OLLAMA_BASE_URL}/api/tags`);
        const models = (response.data.models || [])
          .map((m: any) => m.name)
          .join(", ");
        return {
          content: [
            {
              type: "text",
              text: `Available models: ${models}`,
            },
          ],
        };
      } catch (error) {
        const message = error instanceof Error ? error.message : "Unknown error";
        return {
          content: [
            {
              type: "text",
              text: `Error listing models: ${message}`,
            },
          ],
          isError: true,
        };
      }
    }
    ```


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Atomic-Germ/mcp-consult'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.