
show_model

Retrieve detailed specifications and configuration information for a specific Ollama model to understand its capabilities and parameters before use.

Instructions

Get detailed information about a specific model

Args:
    name: Name of the model to show information about

Input Schema

Name    Required    Description                                    Default
name    Yes         Name of the model to show information about
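
A minimal tool-call payload matching this schema might look like the following (the model name is illustrative, not a value from this page):

    {
      "name": "llama3.2"
    }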

Implementation Reference

  • The 'show_model' tool handler: an async function decorated with @mcp.tool() that fetches and formats detailed model information (license, format, size, system prompt, template) using ollama.show(name).

        @mcp.tool()
        async def show_model(name: str) -> str:
            """Get detailed information about a specific model

            Args:
                name: Name of the model to show information about
            """
            try:
                model_info = ollama.show(name)
                if not model_info:
                    return f"No information found for model {name}"
                # Format the model information
                details = [
                    f"Model: {name}",
                    f"License: {model_info.get('license', 'Unknown')}",
                    f"Format: {model_info.get('format', 'Unknown')}",
                    f"Parameter Size: {model_info.get('parameter_size', 'Unknown')}",
                    f"Quantization Level: {model_info.get('quantization_level', 'Unknown')}"
                ]
                # Add system prompt if available
                if model_info.get('system'):
                    details.append(f"\nSystem Prompt:\n{model_info['system']}")
                # Add template if available
                if model_info.get('template'):
                    details.append(f"\nTemplate:\n{model_info['template']}")
                return "\n".join(details)
            except Exception as e:
                return f"Error getting model information: {str(e)}"
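The formatting step in the handler can be exercised without a running Ollama server by factoring it into a pure helper. The sketch below does exactly that; `format_model_info` and the stubbed response dict are illustrative names introduced here, not part of the server's code.

```python
def format_model_info(name: str, model_info: dict) -> str:
    """Replicate the handler's formatting for an already-fetched info dict."""
    details = [
        f"Model: {name}",
        f"License: {model_info.get('license', 'Unknown')}",
        f"Format: {model_info.get('format', 'Unknown')}",
        f"Parameter Size: {model_info.get('parameter_size', 'Unknown')}",
        f"Quantization Level: {model_info.get('quantization_level', 'Unknown')}",
    ]
    # Optional fields are appended only when present, as in the handler
    if model_info.get('system'):
        details.append(f"\nSystem Prompt:\n{model_info['system']}")
    if model_info.get('template'):
        details.append(f"\nTemplate:\n{model_info['template']}")
    return "\n".join(details)

# Stubbed response (field values are illustrative); missing keys fall back
# to "Unknown" rather than raising.
sample = {"license": "MIT", "format": "gguf", "parameter_size": "7B"}
print(format_model_info("example-model", sample))
```

Separating the formatting from the `ollama.show(name)` call also makes the fallback behavior easy to check: any key absent from the response renders as "Unknown", and the system prompt and template sections are omitted entirely when not set.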

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.