get_model_info
Retrieve configuration details for any model registered with AI Hub, including the underlying provider model, the configured parameters, and system prompts, so you can verify a model's setup before routing requests to it.
Instructions
Get information about a specific model.
Args:
- model: Model name to get info for

Returns:
- Dictionary with model information
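
For illustration, a successful call returns a dictionary with the keys produced by AIClient.get_model_info (see the implementation reference below); the values here are placeholders and depend on your configuration:

```python
# Illustrative return value; keys mirror AIClient.get_model_info, values are placeholders.
{
    "model_name": "gpt-4",
    "provider_model": "openai/gpt-4",
    "configured_params": ["model", "api_key", "max_tokens"],
    "system_prompt": "You are a helpful assistant.",
    "global_system_prompt": None,
}
```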
Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| model | Yes | Model name to get info for | (none) |
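
As an illustration, an MCP client invokes this tool with a tools/call request carrying a single argument; the model name below is a placeholder:

```python
# Illustrative tools/call payload (shown as a Python dict); "gpt-4" stands in for
# any model name present in your AI Hub configuration.
request = {
    "method": "tools/call",
    "params": {
        "name": "get_model_info",
        "arguments": {"model": "gpt-4"},
    },
}
```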
Implementation Reference
- src/mcp_ai_hub/server.py:261-277 (handler): MCP tool handler for get_model_info. It is decorated with @mcp.tool() for registration, validates that the AI client has been initialized, and delegates to AIClient.get_model_info(model).

```python
@mcp.tool()
async def get_model_info(model: str) -> dict[str, Any]:
    """Get information about a specific model.

    Args:
        model: Model name to get info for

    Returns:
        Dictionary with model information
    """
    global ai_client
    if ai_client is None:
        raise RuntimeError("AI client not initialized")
    return ai_client.get_model_info(model)
```
- src/mcp_ai_hub/ai_client.py:273-285 (helper): Core logic for retrieving model information from the configuration, returning the model name, the underlying provider model, the configured parameter names, and the per-model and global system prompts.

```python
def get_model_info(self, model_name: str) -> dict[str, Any]:
    """Get information about a specific model."""
    model_config = self.config.get_model_config(model_name)
    if not model_config:
        raise ValueError(f"Model '{model_name}' not found in configuration.")
    return {
        "model_name": model_config.model_name,
        "provider_model": model_config.litellm_params.get("model"),
        "configured_params": list(model_config.litellm_params.keys()),
        "system_prompt": model_config.system_prompt,
        "global_system_prompt": self.config.global_system_prompt,
    }
```
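
A minimal sketch of exercising the helper directly, assuming `client` is an already initialized AIClient (construction is not shown in the excerpt above):

```python
# `client` is assumed to be an initialized AIClient; how it is constructed is not
# shown in this reference.
info = client.get_model_info("gpt-4")
print(info["provider_model"])       # provider-prefixed model id, e.g. "openai/gpt-4"
print(info["configured_params"])    # names of the keys set in litellm_params

try:
    client.get_model_info("no-such-model")
except ValueError as exc:
    # Raised when the name is absent from the configuration (see ai_client.py above).
    print(exc)
```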