
list-available-models

Discover which AI models are accessible for querying through the Multi-Model Advisor to obtain diverse perspectives on your questions.

Instructions

List all available models in Ollama that can be used with query-models

Input Schema

No arguments.
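The tool takes no input and returns its answer as a single text content item (see the handler under Implementation Reference). A minimal sketch of pulling the markdown text out of such a result; the result object and its text are hypothetical illustration data, and the `firstText` helper is not part of the server:

```typescript
// Shapes mirroring what the handler returns (hypothetical sample data).
type TextContent = { type: "text"; text: string };
type ToolResult = { isError?: boolean; content: TextContent[] };

const result: ToolResult = {
  content: [{ type: "text", text: "# Available Ollama Models\n\n- **llama3.2**: ..." }],
};

// Extract the first text block, surfacing tool-reported errors.
function firstText(res: ToolResult): string {
  if (res.isError) throw new Error(res.content[0]?.text ?? "tool error");
  return res.content.find(c => c.type === "text")?.text ?? "";
}

console.log(firstText(result).split("\n")[0]); // "# Available Ollama Models"
```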

Implementation Reference

  • The handler fetches the list of installed models from the Ollama API (`GET /api/tags`), formats each model's size, parameter count, and quantization level, and flags which of the configured default models are available:

    ```typescript
    try {
      const response = await fetch(`${OLLAMA_API_URL}/api/tags`);
      if (!response.ok) {
        throw new Error(`HTTP error! status: ${response.status}`);
      }

      const data = await response.json() as { models: OllamaModel[] };

      if (!data.models || !Array.isArray(data.models)) {
        return {
          content: [
            { type: "text", text: "No models found or unexpected response format from Ollama API." }
          ]
        };
      }

      // Format model information
      const modelInfo = data.models.map(model => {
        const size = (model.size / (1024 * 1024 * 1024)).toFixed(2); // Convert to GB
        const paramSize = model.details?.parameter_size || "Unknown";
        const quantLevel = model.details?.quantization_level || "Unknown";
        return `- **${model.name}**: ${paramSize} parameters, ${size} GB, ${quantLevel} quantization`;
      }).join("\n");

      // Show which models are currently configured as defaults
      const defaultModelsInfo = DEFAULT_MODELS.map(model => {
        const isAvailable = data.models.some(m => m.name === model);
        return `- **${model}**: ${isAvailable ? "✓ Available" : "⚠️ Not available"}`;
      }).join("\n");

      return {
        content: [
          {
            type: "text",
            text: `# Available Ollama Models\n\n${modelInfo}\n\n## Current Default Models\n\n${defaultModelsInfo}\n\nYou can use any of the available models with the query-models tool by specifying them in the 'models' parameter.`
          }
        ]
      };
    } catch (error) {
      console.error("Error listing models:", error);
      return {
        isError: true,
        content: [
          {
            type: "text",
            text: `Error listing models: ${error instanceof Error ? error.message : String(error)}\n\nMake sure Ollama is running and accessible at ${OLLAMA_API_URL}.`
          }
        ]
      };
    }
    ```
  • src/index.ts:68-129 (registration)
    The 'list-available-models' tool is registered via the McpServer.tool method, passing the name, description, an empty input schema, and the inline handler shown above (the snippet assumes the server instance is named `server`):

    ```typescript
    server.tool(
      "list-available-models",
      "List all available models in Ollama that can be used with query-models",
      {},
      async () => {
        // ... handler body as listed above ...
      }
    );
    ```
  • Empty input schema (a plain empty object, `{}`, passed as the Zod raw shape), indicating no parameters are required:

    ```typescript
    {},
    ```
  • TypeScript interface defining the structure of an Ollama model object, used in the handler to type the API response and access model details:

    ```typescript
    interface OllamaModel {
      name: string;
      modified_at: string;
      size: number;
      digest: string;
      details: {
        format: string;
        family: string;
        families: string[];
        parameter_size: string;
        quantization_level: string;
      };
    }
    ```
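The handler's two formatting steps (byte-to-GB conversion and the default-model availability check) can be exercised in isolation. The sketch below reuses the handler's logic; the model entries and the DEFAULT_MODELS list are hypothetical sample data, not real Ollama output or the server's actual defaults:

```typescript
interface OllamaModel {
  name: string;
  size: number; // bytes
  details?: { parameter_size?: string; quantization_level?: string };
}

// Hypothetical installed models; real data comes from GET /api/tags.
const models: OllamaModel[] = [
  { name: "llama3.2:1b", size: 1_321_098_329, details: { parameter_size: "1.2B", quantization_level: "Q8_0" } },
  { name: "qwen2.5:0.5b", size: 397_821_319, details: { parameter_size: "494M", quantization_level: "Q4_K_M" } },
];

// Same per-model line format as the handler.
const modelInfo = models.map(m => {
  const size = (m.size / (1024 * 1024 * 1024)).toFixed(2); // bytes -> GB
  const paramSize = m.details?.parameter_size || "Unknown";
  const quantLevel = m.details?.quantization_level || "Unknown";
  return `- **${m.name}**: ${paramSize} parameters, ${size} GB, ${quantLevel} quantization`;
}).join("\n");

// Same availability check as the handler; DEFAULT_MODELS here is illustrative.
const DEFAULT_MODELS = ["llama3.2:1b", "gemma3:1b"];
const defaultModelsInfo = DEFAULT_MODELS.map(model => {
  const isAvailable = models.some(m => m.name === model);
  return `- **${model}**: ${isAvailable ? "✓ Available" : "⚠️ Not available"}`;
}).join("\n");

console.log(modelInfo);
console.log(defaultModelsInfo);
```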
