# list-available-models

Lists all Ollama models available for querying through the Multi-Model Advisor, so users can select the appropriate AI models for diverse insights.
## Instructions

List all available models in Ollama that can be used with `query-models`.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| _No arguments_ | | | |
## Implementation Reference
- **src/index.ts:71-127 (handler)** — Handler function that lists available Ollama models by fetching from `/api/tags`, formats each model's details (name, size, parameter count, quantization level), checks which of the configured default models are available, and returns a Markdown-formatted text response (or an error response on failure):

  ```typescript
  async () => {
    try {
      const response = await fetch(`${OLLAMA_API_URL}/api/tags`);
      if (!response.ok) {
        throw new Error(`HTTP error! status: ${response.status}`);
      }

      const data = await response.json() as { models: OllamaModel[] };

      if (!data.models || !Array.isArray(data.models)) {
        return {
          content: [
            {
              type: "text",
              text: "No models found or unexpected response format from Ollama API."
            }
          ]
        };
      }

      // Format model information
      const modelInfo = data.models.map(model => {
        const size = (model.size / (1024 * 1024 * 1024)).toFixed(2); // Convert to GB
        const paramSize = model.details?.parameter_size || "Unknown";
        const quantLevel = model.details?.quantization_level || "Unknown";
        return `- **${model.name}**: ${paramSize} parameters, ${size} GB, ${quantLevel} quantization`;
      }).join("\n");

      // Show which models are currently configured as defaults
      const defaultModelsInfo = DEFAULT_MODELS.map(model => {
        const isAvailable = data.models.some(m => m.name === model);
        return `- **${model}**: ${isAvailable ? "✓ Available" : "⚠️ Not available"}`;
      }).join("\n");

      return {
        content: [
          {
            type: "text",
            text: `# Available Ollama Models\n\n${modelInfo}\n\n## Current Default Models\n\n${defaultModelsInfo}\n\nYou can use any of the available models with the query-models tool by specifying them in the 'models' parameter.`
          }
        ]
      };
    } catch (error) {
      console.error("Error listing models:", error);
      return {
        isError: true,
        content: [
          {
            type: "text",
            text: `Error listing models: ${error instanceof Error ? error.message : String(error)}\n\nMake sure Ollama is running and accessible at ${OLLAMA_API_URL}.`
          }
        ]
      };
    }
  }
  ```
- **src/index.ts:67-128 (registration)** — Registers the `list-available-models` tool with `McpServer.tool()`, providing the name, description, an empty input schema, and the handler function (body identical to the handler listed above):

  ```typescript
  server.tool(
    "list-available-models",
    "List all available models in Ollama that can be used with query-models",
    {},
    async () => {
      // ... handler body as shown above ...
    }
  );
  ```
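The handler's per-model summary line (byte-to-GB conversion plus "Unknown" fallbacks) can be exercised in isolation. The sketch below is a minimal stand-alone version of that formatting step; `SampleModel` and `formatModelLine` are hypothetical names introduced here, assumed to match the fields the handler actually reads from the Ollama response:

```typescript
// Hypothetical stand-in for the OllamaModel shape the handler reads.
interface SampleModel {
  name: string;
  size: number; // size in bytes, as reported by /api/tags
  details?: { parameter_size?: string; quantization_level?: string };
}

// Mirrors the handler's formatting: bytes -> GB with two decimals,
// falling back to "Unknown" when detail fields are missing.
function formatModelLine(model: SampleModel): string {
  const size = (model.size / (1024 * 1024 * 1024)).toFixed(2);
  const paramSize = model.details?.parameter_size || "Unknown";
  const quantLevel = model.details?.quantization_level || "Unknown";
  return `- **${model.name}**: ${paramSize} parameters, ${size} GB, ${quantLevel} quantization`;
}

// Example with a model entry of ~4.34 GB:
const line = formatModelLine({
  name: "llama3",
  size: 4_661_224_676,
  details: { parameter_size: "8B", quantization_level: "Q4_0" },
});
console.log(line);
// -> - **llama3**: 8B parameters, 4.34 GB, Q4_0 quantization
```

Note the conversion uses binary gigabytes (1024³ bytes), matching the handler, and that missing `details` fields degrade gracefully to "Unknown" rather than throwing.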