list_models

Retrieve available AI models from a ComfyUI server to select appropriate resources for image generation workflows. Filter results by model type or search terms.

Instructions

List models exposed by the configured ComfyUI server using the object info endpoints. Specify a kind to get a flat list or use recursive mode to aggregate multiple kinds.

Input Schema

| Name | Required | Description | Default |
| --- | --- | --- | --- |
| kind | No | Model kind to list: one of `checkpoints`, `loras`, `vae`, `clip`, `controlnet`. | `null` |
| recursive | No | Aggregate all known kinds into a single summary object instead of returning a flat list. | `false` |
| search | No | Case-insensitive substring filter applied to model names. | `null` |
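The parameters above combine as in this sketch. The argument names come from the schema; how the payload is actually sent depends on your MCP client, so these dicts are illustrative:

```python
# Defaults mirror the handler signature: kind=None, recursive=False, search=None.
defaults = {"kind": None, "recursive": False, "search": None}

# Flat list of a single kind:
flat_request = {**defaults, "kind": "loras"}

# Aggregate every known kind, keeping only names containing "sdxl":
recursive_request = {**defaults, "recursive": True, "search": "sdxl"}

print(flat_request)
print(recursive_request)
```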

Implementation Reference

  • Registration of the 'list_models' tool using the @server.tool decorator.

        @server.tool(
            name="list_models",
            description=(
                "List models exposed by the configured ComfyUI server using the object"
                " info endpoints. Specify a kind to get a flat list or use recursive"
                " mode to aggregate multiple kinds."
            ),
        )
  • The MCP tool handler for 'list_models', which invokes the model client and logs via context.

        async def list_models(
            kind: str | None = None,
            recursive: bool = False,
            search: str | None = None,
            context: Context | None = None,
        ) -> dict[str, Any] | list[str]:
            """List models available on the remote ComfyUI instance."""
            result = await model_client.list_models(kind=kind, recursive=recursive, search=search)
            if context is not None:
                if isinstance(result, dict):
                    counts = result.get("counts", {})
                    await context.info(
                        "Model inventory retrieved",
                        data={"kinds": list(counts.keys()), "counts": counts},
                    )
                else:
                    await context.info(
                        "Model inventory retrieved",
                        data={"kind": kind, "count": len(result)},
                    )
            return result
  • Core logic in ComfyUIModelClient.list_models that queries the ComfyUI object_info API for model choices across kinds, applies filtering, and formats the response.

        async def list_models(
            self,
            *,
            kind: str | None = None,
            recursive: bool = False,
            search: str | None = None,
        ) -> dict[str, Any] | list[str]:
            """Fetch model information from the ComfyUI object info endpoints."""
            if self._client is None:
                raise RuntimeError("Model client not initialised")
            kinds = [kind] if kind else list(MODEL_KIND_MAP.keys())
            if recursive:
                kinds = list(dict.fromkeys(kinds))
            search_lower = search.lower() if search else None
            inventory: dict[str, list[str]] = {}
            errors: dict[str, str] = {}
            for model_kind in kinds:
                if model_kind not in MODEL_KIND_MAP:
                    raise ValueError(f"Unsupported model kind: {model_kind}")
                try:
                    choices = await self._choices_for_kind(model_kind)
                except Exception as exc:  # pragma: no cover - defensive logging path
                    inventory[model_kind] = []
                    errors[model_kind] = str(exc)
                    continue
                if search_lower:
                    choices = [item for item in choices if search_lower in item.lower()]
                inventory[model_kind] = sorted(dict.fromkeys(choices))
            summary: dict[str, Any] = {
                "base_url": self.api_base_url,
                "kinds": kinds,
                "counts": {kind: len(items) for kind, items in inventory.items()},
                "retrieved_at": datetime.now(tz=timezone.utc).isoformat(),
                "models": inventory,
            }
            if errors:
                summary["errors"] = errors
            if kind and not recursive:
                return inventory.get(kind, [])
            return summary
  • Helper method to fetch model choices for a specific kind by querying the ComfyUI node object_info endpoint.

        async def _choices_for_kind(self, kind: str) -> list[str]:
            client = self._client
            if client is None:
                raise RuntimeError("Model client not initialised")
            node_class, input_name = MODEL_KIND_MAP[kind]
            url = f"{self.api_base_url}/object_info/{node_class}"
            response = await client.get(url)
            response.raise_for_status()
            payload = response.json()
            input_block = payload.get("input", {}) or {}
            field = (
                input_block.get("required", {}).get(input_name)
                or input_block.get("properties", {}).get(input_name)
                or {}
            )
            choices = field.get("choices") or field.get("items") or []
            if isinstance(choices, dict) and "enum" in choices:
                choices = choices["enum"]
            return [item for item in choices if isinstance(item, str)]
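The payload-walking half of that helper is pure dict traversal, so it can be sketched against a hand-written payload. The sample below is illustrative only; real /object_info responses carry more metadata and may differ from this minimal shape:

```python
from typing import Any


def extract_choices(payload: dict[str, Any], input_name: str) -> list[str]:
    # Look under input.required first, then input.properties, as the helper does.
    input_block = payload.get("input", {}) or {}
    field = (
        input_block.get("required", {}).get(input_name)
        or input_block.get("properties", {}).get(input_name)
        or {}
    )
    # Accept 'choices', 'items', or a JSON-Schema-style {'enum': [...]} wrapper.
    choices = field.get("choices") or field.get("items") or []
    if isinstance(choices, dict) and "enum" in choices:
        choices = choices["enum"]
    # Keep only plain strings; anything else in the list is config metadata.
    return [item for item in choices if isinstance(item, str)]


sample = {"input": {"required": {"ckpt_name": {"choices": ["model_a.safetensors", "model_b.ckpt"]}}}}
print(extract_choices(sample, "ckpt_name"))
# ['model_a.safetensors', 'model_b.ckpt']
```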
  • Mapping from user-facing model kinds to ComfyUI loader node class and input field names.

        MODEL_KIND_MAP: dict[str, tuple[str, str]] = {
            "checkpoints": ("CheckpointLoaderSimple", "ckpt_name"),
            "loras": ("LoraLoader", "lora_name"),
            "vae": ("VAELoader", "vae_name"),
            "clip": ("CLIPLoader", "clip_name"),
            "controlnet": ("ControlNetLoader", "control_net_name"),
        }
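Given that mapping, resolving a kind to the endpoint `_choices_for_kind` queries is a one-liner. A sketch — the base URL here is ComfyUI's default local address, used as an example rather than anything this server mandates:

```python
MODEL_KIND_MAP: dict[str, tuple[str, str]] = {
    "checkpoints": ("CheckpointLoaderSimple", "ckpt_name"),
    "loras": ("LoraLoader", "lora_name"),
    "vae": ("VAELoader", "vae_name"),
    "clip": ("CLIPLoader", "clip_name"),
    "controlnet": ("ControlNetLoader", "control_net_name"),
}


def object_info_url(base_url: str, kind: str) -> str:
    # Unsupported kinds raise KeyError here; the client method guards with ValueError instead.
    node_class, _input_name = MODEL_KIND_MAP[kind]
    return f"{base_url}/object_info/{node_class}"


print(object_info_url("http://127.0.0.1:8188", "loras"))
# http://127.0.0.1:8188/object_info/LoraLoader
```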

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/neutrinotek/ComfyUI_MCP'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.