# model_list

Lists the local AI models available through the Ollama server, so users can select and manage models within their AI workflows.

## Instructions

List available local AI models

## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| *(no arguments)* | — | This tool takes no arguments. | — |
## Input Schema (JSON Schema)

```json
{
  "properties": {},
  "required": [],
  "type": "object"
}
```
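Because the schema is empty, a client invokes the tool with an empty `arguments` object. A minimal sketch of an MCP `tools/call` request (the exact envelope and `id` depend on your client and transport):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "model_list",
    "arguments": {}
  }
}
```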
## Implementation Reference
- **local-ai-server.js:81-89 (registration)** — Tool registration in the `ListToolsRequestHandler`, defining the name, description, and empty input schema for `model_list`.

  ```javascript
  {
    name: 'model_list',
    description: 'List available local AI models',
    inputSchema: { type: 'object', properties: {}, required: [] }
  },
  ```
- **local-ai-server.js:152-153 (dispatch)** — Dispatcher in the `CallToolRequestHandler` that routes `model_list` calls to the `getModelList` handler.

  ```javascript
  case 'model_list':
    return await this.getModelList();
  ```
- **local-ai-server.js:233-259 (handler)** — Main handler for the `model_list` tool: fetches the model list from Ollama's `/api/tags` endpoint, formats each entry's size (via `formatBytes`) and modification time, and returns the result as a formatted JSON text response.

  ```javascript
  async getModelList() {
    try {
      const response = await fetch(`${this.ollamaUrl}/api/tags`);
      if (!response.ok) {
        throw new Error(`Ollama API error: ${response.status}`);
      }
      const data = await response.json();
      const modelList = data.models.map(model => ({
        name: model.name,
        size: this.formatBytes(model.size),
        modified: model.modified_at
      }));
      return {
        content: [
          {
            type: 'text',
            text: `Available Local Models:\n\n${JSON.stringify(modelList, null, 2)}`
          }
        ]
      };
    } catch (error) {
      throw new Error(`Failed to get model list: ${error.message}`);
    }
  }
  ```
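The mapping step above can be sketched in isolation. The sample payload below is hypothetical; its field names (`models`, `name`, `size`, `modified_at`) are inferred from how the handler reads the `/api/tags` response, and the model names are made up for illustration:

```javascript
// Hypothetical /api/tags payload, shaped after the fields the handler reads.
const sampleTags = {
  models: [
    { name: 'llama3:8b', size: 4661224676, modified_at: '2024-05-01T12:00:00Z' },
    { name: 'mistral:7b', size: 4109865159, modified_at: '2024-04-20T08:30:00Z' }
  ]
};

// The same mapping the handler performs (sizes left raw here; the server
// passes them through formatBytes before responding).
const modelList = sampleTags.models.map(model => ({
  name: model.name,
  size: model.size,
  modified: model.modified_at
}));

console.log(modelList.map(m => m.name).join(', ')); // → llama3:8b, mistral:7b
```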
- **local-ai-server.js:340-346 (helper)** — Helper used in `getModelList` to format model file sizes in human-readable form (Bytes, KB, MB, etc.).

  ```javascript
  formatBytes(bytes) {
    if (bytes === 0) return '0 Bytes';
    const k = 1024;
    const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
    const i = Math.floor(Math.log(bytes) / Math.log(k));
    return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
  }
  ```
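To illustrate the helper's unit selection and rounding, here is the same logic as a standalone function with a few sample values:

```javascript
// Standalone copy of the formatBytes helper, for demonstration only.
function formatBytes(bytes) {
  if (bytes === 0) return '0 Bytes';
  const k = 1024;
  const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
  // Pick the largest unit whose divisor fits, then round to two decimals.
  const i = Math.floor(Math.log(bytes) / Math.log(k));
  return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
}

console.log(formatBytes(0));          // → 0 Bytes
console.log(formatBytes(1536));       // → 1.5 KB
console.log(formatBytes(4661224676)); // → 4.34 GB
```

Note that `parseFloat` drops trailing zeros from `toFixed(2)`, so 1536 bytes renders as `1.5 KB` rather than `1.50 KB`.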