
llm_get_models

Retrieves available language models from OpenAI-compatible LLM servers to identify options for testing, benchmarking, and chat operations.

Instructions

Retrieves the list of models available on the LLM server (compatible with the OpenAI API: LM Studio, Ollama, vLLM, OpenAI, etc.)

Input Schema

Name     Required  Description                                                                                     Default
baseURL  No        URL of the OpenAI-compatible server (e.g. http://localhost:1234/v1, http://localhost:11434/v1)
apiKey   No        API key (required for OpenAI/Azure, optional for local servers)
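
Since both fields are optional, an empty object is a valid argument set; a call against a local LM Studio instance (an illustrative value, not a default defined by this server) might pass arguments like:

```json
{
  "baseURL": "http://localhost:1234/v1"
}
```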

Implementation Reference

  • The core handler function for the 'llm_get_models' tool. It creates an LLMClient instance, lists available models from the server, and returns a JSON-formatted response with model IDs, owners, count, and baseURL.
    async llm_get_models(args: z.infer<typeof GetModelsSchema> = {}) {
      const client = getClient(args);
      const models = await client.listModels();
      return {
        content: [
          {
            type: "text" as const,
            text: JSON.stringify(
              {
                models: models.map(m => ({
                  id: m.id,
                  owned_by: m.owned_by,
                })),
                count: models.length,
                baseURL: args.baseURL || defaultConfig.baseURL,
              },
              null,
              2,
            ),
          },
        ],
      };
    },
  • MCP tool registration entry defining the name, description, and input schema (connection properties like baseURL and apiKey) for 'llm_get_models'.
    {
      name: "llm_get_models",
      description:
        "Retrieves the list of models available on the LLM server (compatible with the OpenAI API: LM Studio, Ollama, vLLM, OpenAI, etc.)",
      inputSchema: {
        type: "object" as const,
        properties: {
          ...connectionProperties,
        },
        required: [],
      },
    },
  • Zod schema for input validation of llm_get_models arguments, extending the base ConnectionConfigSchema.
    export const GetModelsSchema = ConnectionConfigSchema.extend({});
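    Since `ConnectionConfigSchema` is not shown on this page, the following is a hypothetical sketch of what it plausibly contains, inferred from the input schema table above (two optional string fields, `required: []`):

    ```typescript
    import { z } from "zod";

    // Hypothetical reconstruction of the base connection schema; the real
    // definition lives elsewhere in the repository and may differ.
    const ConnectionConfigSchema = z.object({
      baseURL: z.string().url().optional(), // e.g. http://localhost:1234/v1
      apiKey: z.string().optional(),        // required only for OpenAI/Azure
    });

    // As in the source: llm_get_models adds nothing beyond the connection settings.
    const GetModelsSchema = ConnectionConfigSchema.extend({});

    console.log(GetModelsSchema.parse({ baseURL: "http://localhost:1234/v1" }));
    ```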
  • src/index.ts:52-53 (registration)
    Registration in the MCP server's CallToolRequest handler that dispatches to the llm_get_models tool handler based on the tool name.
    case "llm_get_models":
      return await toolHandlers.llm_get_models(args as any);
  • src/index.ts:42-44 (registration)
    MCP server handler for listing tools, which returns the tools array including 'llm_get_models'.
    server.setRequestHandler(ListToolsRequestSchema, async () => {
      return { tools };
    });
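
The JSON shaping done inside the handler above can be isolated as a pure function, which makes the response format easy to see. This is a sketch with hypothetical names (`Model`, `shapeModelsResponse`), assuming the server returns model entries with `id` and `owned_by` fields as in OpenAI's `/v1/models` format:

```typescript
// Minimal model shape as used by the handler.
interface Model {
  id: string;
  owned_by: string;
}

// Build the same JSON payload the handler returns: the trimmed model
// list, a count, and the baseURL the client was pointed at.
function shapeModelsResponse(models: Model[], baseURL: string): string {
  return JSON.stringify(
    {
      models: models.map(m => ({ id: m.id, owned_by: m.owned_by })),
      count: models.length,
      baseURL,
    },
    null,
    2,
  );
}

console.log(
  shapeModelsResponse(
    [{ id: "llama-3.1-8b", owned_by: "organization_owner" }],
    "http://localhost:1234/v1",
  ),
);
```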


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/ramgeart/llm-mcp-bridge'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.