# show_model
Retrieve detailed information about a specific AI model by providing its name. Use this tool to understand model configurations and capabilities on the MCP Ollama Server.
## Instructions
Get detailed information about a specific model
Args:
- `name`: Name of the model to show information about
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| name | Yes | Name of the model to show information about | (none) |
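The "JSON Schema" view of this table did not survive extraction. A plausible reconstruction from the function signature and docstring (an illustrative sketch, not the server's verbatim schema) would be:

```json
{
  "type": "object",
  "properties": {
    "name": {
      "type": "string",
      "description": "Name of the model to show information about"
    }
  },
  "required": ["name"]
}
```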
## Implementation Reference
- `src/mcp_ollama/server.py:40-71` (handler): The handler function for the `show_model` tool. It takes a model name, uses `ollama.show()` to fetch details such as license, format, parameter size, quantization level, system prompt, and template, then formats and returns them as a string.

  ```python
  @mcp.tool()
  async def show_model(name: str) -> str:
      """Get detailed information about a specific model

      Args:
          name: Name of the model to show information about
      """
      try:
          model_info = ollama.show(name)
          if not model_info:
              return f"No information found for model {name}"

          # Format the model information
          details = [
              f"Model: {name}",
              f"License: {model_info.get('license', 'Unknown')}",
              f"Format: {model_info.get('format', 'Unknown')}",
              f"Parameter Size: {model_info.get('parameter_size', 'Unknown')}",
              f"Quantization Level: {model_info.get('quantization_level', 'Unknown')}"
          ]

          # Add system prompt if available
          if model_info.get('system'):
              details.append(f"\nSystem Prompt:\n{model_info['system']}")

          # Add template if available
          if model_info.get('template'):
              details.append(f"\nTemplate:\n{model_info['template']}")

          return "\n".join(details)
      except Exception as e:
          return f"Error getting model information: {str(e)}"
  ```
- `src/mcp_ollama/server.py:40-40` (registration): The `@mcp.tool()` decorator registers the `show_model` function as an MCP tool.

  ```python
  @mcp.tool()
  ```
- `src/mcp_ollama/server.py:41-46` (schema): The function signature defines the input `name: str` and the output `str`; the docstring documents the parameter.

  ```python
  async def show_model(name: str) -> str:
      """Get detailed information about a specific model

      Args:
          name: Name of the model to show information about
      """
  ```
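The handler's formatting logic can be exercised in isolation without a running Ollama server. Below is a minimal sketch: `format_model_details` is a hypothetical helper (not part of the source) that mirrors the handler's string building, and the `info` dict is an invented stand-in for fields an `ollama.show()` response might carry.

```python
def format_model_details(name: str, model_info: dict) -> str:
    """Mirror the handler's formatting: missing fields fall back to 'Unknown'."""
    details = [
        f"Model: {name}",
        f"License: {model_info.get('license', 'Unknown')}",
        f"Format: {model_info.get('format', 'Unknown')}",
        f"Parameter Size: {model_info.get('parameter_size', 'Unknown')}",
        f"Quantization Level: {model_info.get('quantization_level', 'Unknown')}",
    ]
    # Optional sections appear only when present, as in the handler
    if model_info.get('system'):
        details.append(f"\nSystem Prompt:\n{model_info['system']}")
    if model_info.get('template'):
        details.append(f"\nTemplate:\n{model_info['template']}")
    return "\n".join(details)

# Hypothetical response fields, for illustration only
info = {"license": "Apache-2.0", "parameter_size": "8B"}
print(format_model_details("llama3", info))
```

Because every field is read with `.get(..., 'Unknown')`, a sparse response still yields a complete, readable report rather than a `KeyError`.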