
list_providers

Retrieve the LLM providers available in the ThinkingCap MCP server, which runs complex research queries across multiple agents in parallel.

Input Schema

This tool takes no arguments.

Implementation Reference

  • The asynchronous handler for the `list_providers` tool. It returns a formatted list of available providers and the currently configured agents.

    ```typescript
    async () => {
      const providerList = Object.entries(PROVIDERS)
        .map(
          ([key, config]) =>
            `- ${key}: ${config.name} (env: ${config.envKey}, default model: ${config.defaultModel})`
        )
        .join("\n");
      const currentAgents = agentSpecs.join(", ");
      return {
        content: [
          {
            type: "text" as const,
            text: `Available Providers:\n${providerList}\n\nCurrently configured agents: ${currentAgents}\n\nTo change agents, restart the server with different arguments:\nnpx not-enough-kimis provider1:model1 provider2:model2 ...`,
          },
        ],
      };
    }
    ```
  • src/index.ts:87-109 (registration)

    Registers the `list_providers` tool with the MCP server via `server.tool`, with an empty input schema.

    ```typescript
    server.tool("list_providers", {}, async () => {
      const providerList = Object.entries(PROVIDERS)
        .map(
          ([key, config]) =>
            `- ${key}: ${config.name} (env: ${config.envKey}, default model: ${config.defaultModel})`
        )
        .join("\n");
      const currentAgents = agentSpecs.join(", ");
      return {
        content: [
          {
            type: "text" as const,
            text: `Available Providers:\n${providerList}\n\nCurrently configured agents: ${currentAgents}\n\nTo change agents, restart the server with different arguments:\nnpx not-enough-kimis provider1:model1 provider2:model2 ...`,
          },
        ],
      };
    });
    ```
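The provider-list formatting used by the handler can be exercised in isolation. A minimal, self-contained sketch, using a trimmed two-entry stand-in for the server's `PROVIDERS` table (the real table is shown below):

```typescript
// Trimmed stand-in for the server's PROVIDERS table (two entries only;
// ProviderConfig here carries just the fields the formatter reads).
interface ProviderConfig {
  name: string;
  envKey: string;
  defaultModel: string;
}

const PROVIDERS: Record<string, ProviderConfig> = {
  openai: { name: "OpenAI", envKey: "OPENAI_API_KEY", defaultModel: "gpt-5.1" },
  groq: {
    name: "Groq",
    envKey: "GROQ_API_KEY",
    defaultModel: "moonshotai/kimi-k2-instruct-0905",
  },
};

// Same mapping the handler uses: one
// "- key: Name (env: ..., default model: ...)" line per provider.
const providerList = Object.entries(PROVIDERS)
  .map(
    ([key, config]) =>
      `- ${key}: ${config.name} (env: ${config.envKey}, default model: ${config.defaultModel})`
  )
  .join("\n");

console.log(providerList);
```

Note that `Object.entries` preserves string-key insertion order, so providers appear in the order they are declared in the table.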
  • The `PROVIDERS` constant lists all supported LLM providers; the `list_providers` handler uses it to generate the provider list.

    ```typescript
    export const PROVIDERS: Record<string, ProviderConfig> = {
      openai: {
        name: "OpenAI",
        baseUrl: "https://api.openai.com/v1",
        envKey: "OPENAI_API_KEY",
        isOpenAICompatible: true,
        defaultModel: "gpt-5.1",
      },
      openrouter: {
        name: "OpenRouter",
        baseUrl: "https://openrouter.ai/api/v1",
        envKey: "OPENROUTER_API_KEY",
        isOpenAICompatible: true,
        defaultModel: "moonshotai/kimi-k2-thinking",
      },
      groq: {
        name: "Groq",
        baseUrl: "https://api.groq.com/openai/v1",
        envKey: "GROQ_API_KEY",
        isOpenAICompatible: true,
        defaultModel: "moonshotai/kimi-k2-instruct-0905",
      },
      cerebras: {
        name: "Cerebras",
        baseUrl: "https://api.cerebras.ai/v1",
        envKey: "CEREBRAS_API_KEY",
        isOpenAICompatible: true,
        defaultModel: "zai-glm-4.6",
      },
      xai: {
        name: "xAI",
        baseUrl: "https://api.x.ai/v1",
        envKey: "XAI_API_KEY",
        isOpenAICompatible: true,
        defaultModel: "grok-4-fast",
      },
      anthropic: {
        name: "Anthropic",
        baseUrl: "https://api.anthropic.com/v1",
        envKey: "ANTHROPIC_API_KEY",
        isOpenAICompatible: false,
        defaultModel: "claude-opus-4-5",
      },
      google: {
        name: "Google",
        baseUrl: "https://generativelanguage.googleapis.com/v1beta",
        envKey: "GOOGLE_API_KEY",
        isOpenAICompatible: false,
        defaultModel: "gemini-3-pro-preview",
      },
    };
    ```
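The handler's help text says agents are configured as `provider1:model1 provider2:model2 ...` arguments at startup. The server's actual parsing code is not shown on this page; the following is a hypothetical sketch of how such a spec could be resolved against the `PROVIDERS` table, with the `resolveAgentSpec` helper and the trimmed provider table being illustrative assumptions, not the server's implementation:

```typescript
interface ProviderConfig {
  name: string;
  envKey: string;
  defaultModel: string;
}

// Trimmed stand-in for the server's PROVIDERS table.
const PROVIDERS: Record<string, ProviderConfig> = {
  openrouter: {
    name: "OpenRouter",
    envKey: "OPENROUTER_API_KEY",
    defaultModel: "moonshotai/kimi-k2-thinking",
  },
  xai: { name: "xAI", envKey: "XAI_API_KEY", defaultModel: "grok-4-fast" },
};

// Hypothetical resolver: splits "provider:model" (or a bare "provider"),
// validates the provider key, and falls back to the provider's default model
// when no model is given.
function resolveAgentSpec(spec: string): { provider: string; model: string } {
  const [provider, model] = spec.split(":");
  const config = PROVIDERS[provider];
  if (!config) {
    throw new Error(`Unknown provider: ${provider}`);
  }
  return { provider, model: model ?? config.defaultModel };
}

console.log(resolveAgentSpec("xai:grok-4-fast"));
console.log(resolveAgentSpec("openrouter")); // falls back to the default model
```

The default-model fallback matches what `list_providers` advertises: each provider entry carries a `defaultModel` precisely so a bare provider name is a valid agent spec.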
