
get_model_info

Retrieve detailed information about AI models from multiple providers to understand capabilities, specifications, and integration requirements before implementation.

Instructions

Get information about a specific model.

Args:
    model: Model name to get info for

Returns:
    Dictionary with model information

Input Schema

Name     Required   Description                   Default
model    Yes        Model name to get info for    –
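
The tool takes a single argument object and returns a dictionary whose keys mirror the implementation shown under Implementation Reference below. The values here are placeholders for illustration; actual entries depend on the models configured in the server.

    # Hypothetical arguments for a call to get_model_info (the model name is a placeholder)
    {"model": "gpt-4"}

    # Illustrative response shape, mirroring the keys returned by AIClient.get_model_info
    {
        "model_name": "gpt-4",
        "provider_model": "openai/gpt-4",
        "configured_params": ["model", "api_key"],
        "system_prompt": None,
        "global_system_prompt": None,
    }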

Implementation Reference

  • MCP tool handler for get_model_info: decorated with @mcp.tool() for registration, validates AI client initialization, and delegates to AIClient.get_model_info(model).
    @mcp.tool()
    async def get_model_info(model: str) -> dict[str, Any]:
        """Get information about a specific model.

        Args:
            model: Model name to get info for

        Returns:
            Dictionary with model information
        """
        global ai_client
        if ai_client is None:
            raise RuntimeError("AI client not initialized")
        return ai_client.get_model_info(model)
  • Core logic for retrieving model information from configuration, returning details like model name, provider model, parameters, and system prompts.
    def get_model_info(self, model_name: str) -> dict[str, Any]:
        """Get information about a specific model."""
        model_config = self.config.get_model_config(model_name)
        if not model_config:
            raise ValueError(f"Model '{model_name}' not found in configuration.")
        return {
            "model_name": model_config.model_name,
            "provider_model": model_config.litellm_params.get("model"),
            "configured_params": list(model_config.litellm_params.keys()),
            "system_prompt": model_config.system_prompt,
            "global_system_prompt": self.config.global_system_prompt,
        }
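
A minimal client-side sketch of invoking this tool over stdio with the MCP Python SDK is shown below. The launch command for the server ("mcp-ai-hub") and the model name "gpt-4" are assumptions for illustration; adjust both to match your installation and configuration.

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Assumed launch command for the server; change to however mcp-ai-hub is started locally.
    server_params = StdioServerParameters(command="mcp-ai-hub", args=[])

    async def main() -> None:
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # "gpt-4" is a placeholder; use a model name defined in your configuration.
                result = await session.call_tool("get_model_info", arguments={"model": "gpt-4"})
                print(result.content)

    if __name__ == "__main__":
        asyncio.run(main())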

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/feiskyer/mcp-ai-hub'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.