
list_available_providers

Discover available AI model providers and their configurations to select the right multimodal models for processing images and media formats.

Instructions

List available model providers and their configurations.

Returns: JSON string of available providers and their models

Input Schema


No arguments

Implementation Reference

  • The handler function for the 'list_available_providers' tool. It is decorated with @self.server.tool() for registration and returns a JSON string containing information about available providers (e.g., openai, dashscope), including their supported models, default models, max_tokens, and temperature settings.
    def list_available_providers() -> str:
        """List available model providers and their configurations.

        Returns: JSON string of available providers and their models
        """
        providers_info = {}
        for provider_name, provider in self.providers.items():
            # Find the provider config to get the default model
            provider_config = None
            for config in self.config.get("providers", []):
                if config.get("provider_type") == provider_name:
                    provider_config = config
                    break
            if isinstance(provider, OpenAIProvider):
                providers_info[provider_name] = {
                    "type": "openai",
                    "default_model": provider_config.get("default_model", "gpt-4o") if provider_config else "gpt-4o",
                    "supported_models": provider.supported_models,
                    "max_tokens": provider_config.get("max_tokens", 4000) if provider_config else 4000,
                    "temperature": provider_config.get("temperature", 0.7) if provider_config else 0.7,
                }
            elif isinstance(provider, DashscopeProvider):
                providers_info[provider_name] = {
                    "type": "dashscope",
                    "default_model": provider_config.get("default_model", "qwen-vl-plus") if provider_config else "qwen-vl-plus",
                    "supported_models": provider.supported_models,
                    "max_tokens": provider_config.get("max_tokens", 4000) if provider_config else 4000,
                    "temperature": provider_config.get("temperature", 0.7) if provider_config else 0.7,
                }
        return json.dumps(providers_info, indent=2)
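The shape of the JSON string this handler returns can be sketched as follows. The provider entry and model names below are hypothetical examples mirroring the defaults in the handler above; the actual content depends on the server's configured providers.

```python
import json

# Hypothetical output of list_available_providers, mirroring the
# defaults in the handler above (one "openai" entry shown; the real
# values vary with the server's configuration).
example_output = json.dumps({
    "openai": {
        "type": "openai",
        "default_model": "gpt-4o",
        "supported_models": ["gpt-4o", "gpt-4o-mini"],  # illustrative list
        "max_tokens": 4000,
        "temperature": 0.7,
    }
}, indent=2)

# The tool returns a JSON *string*, so a client parses it before use:
providers = json.loads(example_output)
print(providers["openai"]["default_model"])  # → gpt-4o
```

Because the return value is a string rather than a structured object, clients should always run it through a JSON parser before inspecting provider settings.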

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/StanleyChanH/vllm-mcp'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.