get_model_info

Retrieve detailed specifications and configuration data for a specific vLLM model to understand its capabilities and requirements.

Instructions

Get detailed information about a specific model

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| model_id | Yes | The ID of the model to get info for | (none) |

Implementation Reference

  • Main handler function get_model_info that validates model_id argument, calls VLLMClient to fetch model information, and formats the response as TextContent with JSON-formatted model details.
    async def get_model_info(arguments: dict[str, Any]) -> list[TextContent]:
        """
        Get detailed information about a specific model.
    
        Args:
            arguments: Dictionary containing:
                - model_id: The ID of the model to get info for
    
        Returns:
            List of TextContent with detailed model information.
        """
        model_id = arguments.get("model_id")
        if not model_id:
            return [TextContent(type="text", text="Error: No model_id provided")]
    
        try:
            async with VLLMClient() as client:
                model_info = await client.get_model_info(model_id)
    
                if not model_info:
                    return [
                        TextContent(type="text", text=f"Model '{model_id}' not found on the server.")
                    ]
    
                # Format model info
                result = f"## Model: {model_id}\n\n"
                result += "```json\n"
                result += json.dumps(model_info, indent=2)
                result += "\n```"
    
                return [TextContent(type="text", text=result)]
    
        except VLLMClientError as e:
            return [TextContent(type="text", text=f"Error getting model info: {str(e)}")]
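On the success path, the handler renders the model dict as a Markdown heading followed by a fenced JSON block. A self-contained sketch of that formatting, using a made-up model entry (`format_model_info` is an illustrative helper):

```python
import json

def format_model_info(model_id: str, model_info: dict) -> str:
    """Render model info as the Markdown text the handler returns."""
    fence = "`" * 3  # built programmatically to avoid literal backticks here
    return (
        f"## Model: {model_id}\n\n"
        f"{fence}json\n{json.dumps(model_info, indent=2)}\n{fence}"
    )

text = format_model_info("example-model", {"id": "example-model", "owned_by": "vllm"})
print(text.splitlines()[0])  # prints "## Model: example-model"
```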
  • VLLMClient.get_model_info helper method that retrieves model information by listing all models and finding the one matching the given model_id.
    async def get_model_info(self, model_id: str) -> Optional[dict[str, Any]]:
        """Get information about a specific model."""
        models = await self.list_models()
        for model in models:
            if model.get("id") == model_id:
                return model
        return None
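This lookup is a linear scan over the server's model listing. A synchronous sketch of the same logic with stubbed model entries (the IDs and fields are hypothetical stand-ins for the `/v1/models` payload):

```python
from typing import Any, Optional

def find_model(models: list[dict[str, Any]], model_id: str) -> Optional[dict[str, Any]]:
    """Return the first model dict whose "id" matches, else None."""
    for model in models:
        if model.get("id") == model_id:
            return model
    return None

models = [{"id": "model-a"}, {"id": "model-b", "max_model_len": 8192}]
find_model(models, "model-b")  # returns the second dict
find_model(models, "missing")  # returns None
```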
  • Tool registration defining get_model_info with its name, description, and inputSchema that requires a model_id string parameter.
    Tool(
        name="get_model_info",
        description="Get detailed information about a specific model",
        inputSchema={
            "type": "object",
            "properties": {
                "model_id": {
                    "type": "string",
                    "description": "The ID of the model to get info for",
                },
            },
            "required": ["model_id"],
        },
    )
  • Handler mapping in call_tool function that routes 'get_model_info' tool calls to the get_model_info handler function.
    elif name == "get_model_info":
        return await get_model_info(arguments)
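The routing above is a straightforward name-to-handler dispatch inside `call_tool`. A minimal runnable sketch of the pattern (the handler body is simplified to plain strings rather than `TextContent` objects for illustration):

```python
import asyncio

async def get_model_info(arguments: dict) -> list[str]:
    # Stand-in for the real handler, which returns TextContent objects.
    return [f"info for {arguments.get('model_id')}"]

async def call_tool(name: str, arguments: dict):
    # Route the tool name to its async handler; unknown names are an error.
    if name == "get_model_info":
        return await get_model_info(arguments)
    raise ValueError(f"Unknown tool: {name}")

result = asyncio.run(call_tool("get_model_info", {"model_id": "example-model"}))
# result == ["info for example-model"]
```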
