
list-ai-models

Retrieve all available AI models and their configuration status from the Ultra MCP server, giving a single view of which AI providers are configured.

Instructions

List all available AI models and their configuration status

Input Schema

No arguments
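
Since list-ai-models takes no arguments, a client simply calls the tool with an empty arguments object. The sketch below is a minimal, hypothetical example using the @modelcontextprotocol/sdk TypeScript client over stdio; the npx command used to launch Ultra MCP and the client name are illustrative assumptions, not part of this server's documentation.

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    async function main() {
      // Assumption: Ultra MCP is launched via npx; adjust command/args to
      // however you actually start the server in your environment.
      const transport = new StdioClientTransport({
        command: "npx",
        args: ["-y", "ultra-mcp"],
      });

      const client = new Client({ name: "example-client", version: "1.0.0" });
      await client.connect(transport);

      // list-ai-models takes no input, so `arguments` is an empty object.
      const result = await client.callTool({
        name: "list-ai-models",
        arguments: {},
      });

      // The tool responds with a single text content block containing markdown.
      console.log(result.content);

      await client.close();
    }

    main().catch(console.error);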

Implementation Reference

  • Implementation of the list-ai-models tool handler in the AIToolHandlers class. It reads the available models from ProviderManager, checks each provider's configuration status, and formats a markdown response that also lists the default models. An illustrative sample of the rendered response follows the code.
    async handleListModels() {
      const availableModels = this.providerManager.getAvailableModels();
      const configuredProviders = await this.providerManager.getConfiguredProviders();

      let response = "## Available AI Models\n\n";

      for (const [provider, models] of Object.entries(availableModels)) {
        const isConfigured = configuredProviders.includes(provider);
        response += `### ${provider.charAt(0).toUpperCase() + provider.slice(1)} ${isConfigured ? "✅" : "❌"}\n`;
        if (models.length > 0) {
          response += models.map(model => `- ${model}`).join("\n");
        } else {
          response += "- Not configured";
        }
        response += "\n\n";
      }

      response += `\n## Default Models\n`;
      response += `- OpenAI/Azure: gpt-5 (optimized for reasoning)\n`;
      response += `- Gemini: gemini-2.5-pro (with Google Search enabled)\n`;
      response += `- Grok: grok-4 (latest xAI model with reasoning support)\n`;

      return {
        content: [
          {
            type: "text",
            text: response,
          },
        ],
      };
    }
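
    For illustration only, the markdown this handler returns might render roughly as follows, assuming the provider keys are "openai" and "google", that only the OpenAI key is configured, and that gpt-5 is among the available OpenAI models (the actual provider names and model lists come from ProviderManager):

    ## Available AI Models

    ### Openai ✅
    - gpt-5

    ### Google ❌
    - Not configured

    ## Default Models
    - OpenAI/Azure: gpt-5 (optimized for reasoning)
    - Gemini: gemini-2.5-pro (with Google Search enabled)
    - Grok: grok-4 (latest xAI model with reasoning support)
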
  • src/server.ts:274-282 (registration)
    MCP server registration of the 'list-ai-models' tool, defining its metadata and delegating execution to AIToolHandlers.handleListModels() via getHandlers().
    // Register list-ai-models tool
    server.registerTool("list-ai-models", {
      title: "List AI Models",
      description: "List all available AI models and their configuration status",
      inputSchema: {},
    }, async () => {
      const aiHandlers = await getHandlers();
      return await aiHandlers.handleListModels();
    });
  • getHandlers() function that lazily initializes and returns the AIToolHandlers instance used by the list-ai-models tool, loading config and ProviderManager.
    async function getHandlers() {
      if (!handlers) {
        const { ConfigManager } = require("./config/manager");
        const { ProviderManager } = require("./providers/manager");
        const { AIToolHandlers } = require("./handlers/ai-tools");

        const configManager = new ConfigManager();

        // Load config and set environment variables
        const config = await configManager.getConfig();
        if (config.openai?.apiKey) {
          process.env.OPENAI_API_KEY = config.openai.apiKey;
        }
        if (config.openai?.baseURL) {
          process.env.OPENAI_BASE_URL = config.openai.baseURL;
        }
        if (config.google?.apiKey) {
          process.env.GOOGLE_API_KEY = config.google.apiKey;
        }
        if (config.google?.baseURL) {
          process.env.GOOGLE_BASE_URL = config.google.baseURL;
        }
        if (config.azure?.apiKey) {
          process.env.AZURE_API_KEY = config.azure.apiKey;
        }
        if (config.azure?.baseURL) {
          process.env.AZURE_BASE_URL = config.azure.baseURL;
        }
        if (config.xai?.apiKey) {
          process.env.XAI_API_KEY = config.xai.apiKey;
        }
        if (config.xai?.baseURL) {
          process.env.XAI_BASE_URL = config.xai.baseURL;
        }

        providerManager = new ProviderManager(configManager);
        handlers = new AIToolHandlers(providerManager);
      }
      return handlers;
    }
