Glama

list_models

Retrieve available models by type (ollama, openai, huggingface, ggml) for use as attack or target models in Garak-MCP's LLM vulnerability scanner.

Instructions

List all available models for a given model type. These models can be used as the attack or target models.

Args:
    model_type (str): The type of model to list (ollama, openai, huggingface, ggml)

Returns:
    list[str]: A list of available models.
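A caller can mirror the server-side contract with a small client-side check before invoking the tool. This is a minimal sketch; the set of valid types comes from the tool description above, and the function name is illustrative, not part of Garak-MCP:

```python
# Valid model types, per the list_models tool description.
VALID_MODEL_TYPES = {"ollama", "openai", "huggingface", "ggml"}

def validate_model_type(model_type: str) -> str:
    """Raise ValueError for unsupported types, mirroring the server's check."""
    if model_type not in VALID_MODEL_TYPES:
        raise ValueError(f"Invalid model type: {model_type}")
    return model_type
```

Validating locally gives an immediate error instead of a round-trip failure from the server.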

Input Schema

Name        Required  Description  Default
model_type  Yes       —            —

Implementation Reference

  • MCP tool handler for 'list_models'. Decorated with @mcp.tool() to register and execute the tool logic by calling ModelConfig.list_models.

    @mcp.tool()
    def list_models(model_type: str) -> list[str]:
        """
        List all available models for a given model type. Those models can be
        used for the attack and target models.

        Args:
            model_type (str): The type of model to list (ollama, openai, huggingface, ggml)

        Returns:
            list[str]: A list of available models.
        """
        return GarakServer().config.list_models(model_type)
  • src/server.py:118-118 (registration)
    Registration of the 'list_models' tool using FastMCP's @mcp.tool() decorator.
    @mcp.tool()
  • Helper method in ModelConfig that implements the core logic for listing models by delegating to type-specific model getters.

    def list_models(self, model_type: str) -> List[str]:
        """
        List available models for a given model type.

        Args:
            model_type (str): The type of model (ollama, openai, huggingface, ggml)

        Returns:
            List[str]: List of available model names
        """
        if model_type not in self.model_types:
            raise ValueError(f"Invalid model type: {model_type}")
        return self.model_types[model_type]["models"]()
  • Helper function to fetch Ollama models via API.

    def _get_ollama_models(self) -> List[str]:
        """Get list of installed Ollama models"""
        try:
            response = requests.get('http://localhost:11434/api/tags')
            response.raise_for_status()
            data = response.json()
            return [model['name'] for model in data.get('models', [])]
        except requests.exceptions.RequestException as e:
            print(f"Error fetching Ollama models: {e}")
            return []
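The registry pattern behind these helpers can be sketched end-to-end as follows. The stub getters and the exact shape of the `model_types` dict are assumptions inferred from the snippets above, not the project's actual code:

```python
from typing import Callable, Dict, List

class ModelConfig:
    """Minimal sketch: each model type maps to a dict whose "models" key
    is a zero-argument callable returning the available model names."""

    def __init__(self) -> None:
        # Stub getters stand in for the real per-type lookups.
        self.model_types: Dict[str, Dict[str, Callable[[], List[str]]]] = {
            "ollama": {"models": self._get_ollama_models},
            "ggml": {"models": lambda: []},  # placeholder getter
        }

    def _get_ollama_models(self) -> List[str]:
        return []  # stub; the real helper queries the Ollama API

    def list_models(self, model_type: str) -> List[str]:
        if model_type not in self.model_types:
            raise ValueError(f"Invalid model type: {model_type}")
        return self.model_types[model_type]["models"]()
```

Keeping the getters behind a dict means adding a new model type is a one-line registry entry rather than a new branch in `list_models`.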


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/EdenYavin/Garak-MCP'
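The same request can be issued from Python's standard library. This sketch targets the endpoint from the curl example above; the shape of the returned JSON is whatever the API provides and is not assumed here:

```python
import json
import urllib.request

def fetch_server_info(
    url: str = "https://glama.ai/api/mcp/v1/servers/EdenYavin/Garak-MCP",
) -> dict:
    """GET the server's directory entry and parse the JSON body."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))
```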

If you have feedback or need assistance with the MCP directory API, please join our Discord server.