Enhanced Architecture MCP

model_list

Lists the AI models available on the local Ollama instance so that users can select, manage, and integrate them into their AI workflows.

Instructions

List available local AI models

Input Schema

This tool takes no arguments.

Input Schema (JSON Schema)

{ "properties": {}, "required": [], "type": "object" }

Implementation Reference

  • Tool registration in ListToolsRequestHandler, defining the name, description, and empty input schema for model_list.
    {
      name: 'model_list',
      description: 'List available local AI models',
      inputSchema: { type: 'object', properties: {}, required: [] }
    },
  • Dispatcher in CallToolRequestHandler that routes model_list calls to the getModelList handler; a sketch of the surrounding handler appears after this list.
    case 'model_list': return await this.getModelList();
  • Main handler function for the model_list tool: it fetches the model list from Ollama's /api/tags endpoint, formats each entry's size and modification time with formatBytes, and returns the result as a formatted JSON text response; an illustrative run appears after this list.
    async getModelList() {
      try {
        const response = await fetch(`${this.ollamaUrl}/api/tags`);
        if (!response.ok) {
          throw new Error(`Ollama API error: ${response.status}`);
        }
        const data = await response.json();
        const modelList = data.models.map(model => ({
          name: model.name,
          size: this.formatBytes(model.size),
          modified: model.modified_at
        }));
        return {
          content: [
            {
              type: 'text',
              text: `Available Local Models:\n\n${JSON.stringify(modelList, null, 2)}`
            }
          ]
        };
      } catch (error) {
        throw new Error(`Failed to get model list: ${error.message}`);
      }
    }
  • Helper function used in getModelList to format model file sizes in human-readable form (Bytes, KB, MB, GB, TB).
    formatBytes(bytes) {
      if (bytes === 0) return '0 Bytes';
      const k = 1024;
      const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
      const i = Math.floor(Math.log(bytes) / Math.log(k));
      return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
    }
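
For orientation, here is a minimal sketch of how the dispatcher case above plausibly sits inside the server's CallToolRequestHandler. The setRequestHandler pattern follows the MCP SDK, but the other case labels and the default branch are assumptions, not confirmed code:

    // CallToolRequestSchema is exported by @modelcontextprotocol/sdk/types.js.
    this.server.setRequestHandler(CallToolRequestSchema, async (request) => {
      switch (request.params.name) {
        case 'model_list':
          // Route model_list to the handler shown above.
          return await this.getModelList();
        // ...cases for the server's other tools...
        default:
          throw new Error(`Unknown tool: ${request.params.name}`);
      }
    });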

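To make the response formatting concrete, the following self-contained Node.js sketch reproduces getModelList's transformation on a hypothetical /api/tags payload; the model name, size, and timestamp are invented for illustration:

    // Standalone copy of the formatBytes helper shown above.
    const formatBytes = (bytes) => {
      if (bytes === 0) return '0 Bytes';
      const k = 1024;
      const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
      const i = Math.floor(Math.log(bytes) / Math.log(k));
      return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
    };

    // Hypothetical body returned by GET <ollamaUrl>/api/tags.
    const data = {
      models: [
        { name: 'llama3:latest', size: 4661224676, modified_at: '2024-05-01T12:00:00Z' }
      ]
    };

    // The same mapping getModelList applies before serializing the result.
    const modelList = data.models.map(model => ({
      name: model.name,
      size: formatBytes(model.size), // 4661224676 -> '4.34 GB'
      modified: model.modified_at
    }));

    console.log(`Available Local Models:\n\n${JSON.stringify(modelList, null, 2)}`);

Running this prints the same "Available Local Models" text block the tool returns, with the raw byte count rendered as '4.34 GB'.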
