show_model

Retrieve detailed information about a specific AI model by providing its name. Use this tool to understand model configurations and capabilities on the MCP Ollama Server.

Instructions

Get detailed information about a specific model

Args:
    name: Name of the model to show information about

Input Schema

| Name | Required | Description | Default |
| ---- | -------- | ----------- | ------- |
| name | Yes | Name of the model to show information about | |
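The table above maps to a JSON Schema roughly like the following. This is a sketch inferred from the table, not copied from the server's published schema:

```json
{
  "type": "object",
  "properties": {
    "name": {
      "type": "string",
      "description": "Name of the model to show information about"
    }
  },
  "required": ["name"]
}
```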

Implementation Reference

  • The handler function for the 'show_model' tool. It takes a model name, uses ollama.show() to fetch details like license, format, parameter size, quantization, system prompt, and template, then formats and returns them.
    @mcp.tool()
    async def show_model(name: str) -> str:
        """Get detailed information about a specific model

        Args:
            name: Name of the model to show information about
        """
        try:
            model_info = ollama.show(name)
            if not model_info:
                return f"No information found for model {name}"

            # Format the model information
            details = [
                f"Model: {name}",
                f"License: {model_info.get('license', 'Unknown')}",
                f"Format: {model_info.get('format', 'Unknown')}",
                f"Parameter Size: {model_info.get('parameter_size', 'Unknown')}",
                f"Quantization Level: {model_info.get('quantization_level', 'Unknown')}"
            ]

            # Add system prompt if available
            if model_info.get('system'):
                details.append(f"\nSystem Prompt:\n{model_info['system']}")

            # Add template if available
            if model_info.get('template'):
                details.append(f"\nTemplate:\n{model_info['template']}")

            return "\n".join(details)
        except Exception as e:
            return f"Error getting model information: {str(e)}"
  • The @mcp.tool() decorator registers the show_model function as an MCP tool.
    @mcp.tool()
  • Function signature defines input 'name: str' and output 'str', with docstring describing the parameter.
    async def show_model(name: str) -> str:
        """Get detailed information about a specific model

        Args:
            name: Name of the model to show information about
        """
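The formatting step of the handler can be exercised offline without a running Ollama server. The sketch below mirrors the handler's formatting logic as a standalone function; `sample_info` is hypothetical illustrative data, not real `ollama.show()` output:

```python
def format_model_info(name: str, model_info: dict) -> str:
    """Standalone mirror of the show_model handler's formatting logic (sketch)."""
    details = [
        f"Model: {name}",
        f"License: {model_info.get('license', 'Unknown')}",
        f"Format: {model_info.get('format', 'Unknown')}",
        f"Parameter Size: {model_info.get('parameter_size', 'Unknown')}",
        f"Quantization Level: {model_info.get('quantization_level', 'Unknown')}",
    ]
    # System prompt and template are appended only when present,
    # matching the conditional blocks in the handler
    if model_info.get('system'):
        details.append(f"\nSystem Prompt:\n{model_info['system']}")
    if model_info.get('template'):
        details.append(f"\nTemplate:\n{model_info['template']}")
    return "\n".join(details)

# Hypothetical sample data for illustration only
sample_info = {
    "license": "MIT",
    "format": "gguf",
    "parameter_size": "7B",
    "quantization_level": "Q4_0",
}
print(format_model_info("example-model", sample_info))
```

Missing keys fall back to "Unknown" via `dict.get`, so a sparse response still produces a complete summary rather than a `KeyError`.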

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/emgeee/mcp-ollama'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.