list_models

Retrieve available AI models from a ComfyUI server to identify compatible options for image generation workflows. Filter results by type or search terms to find specific models.

Instructions

List models exposed by the configured ComfyUI server using the object info endpoints. Specify a kind to get a flat list or use recursive mode to aggregate multiple kinds.

Input Schema

| Name      | Required | Description                                                                 | Default |
|-----------|----------|-----------------------------------------------------------------------------|---------|
| kind      | No       | Model kind to list: one of checkpoints, loras, vae, clip, or controlnet     | None    |
| recursive | No       | Aggregate all supported kinds into a single summary instead of a flat list  | false   |
| search    | No       | Case-insensitive substring filter applied to model names                    | None    |
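
For example, the tool might be invoked from a FastMCP client as sketched below. This is a minimal sketch: the server URL and transport are assumptions, and only the tool name and arguments come from the schema above.

    import asyncio
    from fastmcp import Client

    async def main() -> None:
        # Assumed URL; point this at wherever the ComfyUI MCP server runs.
        async with Client("http://127.0.0.1:8000/mcp") as client:
            # A single kind without recursion returns a flat list of names.
            checkpoints = await client.call_tool(
                "list_models", {"kind": "checkpoints", "search": "sdxl"}
            )
            # Recursive mode returns an aggregated summary across all kinds.
            summary = await client.call_tool("list_models", {"recursive": True})
            print(checkpoints)
            print(summary)

    asyncio.run(main())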

Implementation Reference

  • FastMCP tool handler for 'list_models', delegating to ComfyUIModelClient.list_models and logging via context.
    async def list_models(
        kind: str | None = None,
        recursive: bool = False,
        search: str | None = None,
        context: Context | None = None,
    ) -> dict[str, Any] | list[str]:
        """List models available on the remote ComfyUI instance."""
        result = await model_client.list_models(kind=kind, recursive=recursive, search=search)
        if context is not None:
            if isinstance(result, dict):
                # Summary shape: log the aggregated per-kind counts.
                counts = result.get("counts", {})
                await context.info(
                    "Model inventory retrieved",
                    data={"kinds": list(counts.keys()), "counts": counts},
                )
            else:
                # Flat-list shape: log the single kind and its count.
                await context.info(
                    "Model inventory retrieved",
                    data={"kind": kind, "count": len(result)},
                )
        return result
  • Registration of the 'list_models' tool via the @server.tool decorator on the FastMCP server.
    @server.tool(
        name="list_models",
        description=(
            "List models exposed by the configured ComfyUI server using the object"
            " info endpoints. Specify a kind to get a flat list or use recursive"
            " mode to aggregate multiple kinds."
        ),
    )
  • Core implementation in ComfyUIModelClient that queries the ComfyUI object_info endpoints to list models by kind, with support for search filtering and recursive aggregation. An illustrative return payload follows this list.
    async def list_models(
        self,
        *,
        kind: str | None = None,
        recursive: bool = False,
        search: str | None = None,
    ) -> dict[str, Any] | list[str]:
        """Fetch model information from the ComfyUI object info endpoints."""
        if self._client is None:
            raise RuntimeError("Model client not initialised")
        kinds = [kind] if kind else list(MODEL_KIND_MAP.keys())
        if recursive:
            # Preserve order while dropping any duplicate kinds.
            kinds = list(dict.fromkeys(kinds))
        search_lower = search.lower() if search else None
        inventory: dict[str, list[str]] = {}
        errors: dict[str, str] = {}
        for model_kind in kinds:
            if model_kind not in MODEL_KIND_MAP:
                raise ValueError(f"Unsupported model kind: {model_kind}")
            try:
                choices = await self._choices_for_kind(model_kind)
            except Exception as exc:  # pragma: no cover - defensive logging path
                inventory[model_kind] = []
                errors[model_kind] = str(exc)
                continue
            if search_lower:
                choices = [item for item in choices if search_lower in item.lower()]
            inventory[model_kind] = sorted(dict.fromkeys(choices))
        summary: dict[str, Any] = {
            "base_url": self.api_base_url,
            "kinds": kinds,
            "counts": {kind: len(items) for kind, items in inventory.items()},
            "retrieved_at": datetime.now(tz=timezone.utc).isoformat(),
            "models": inventory,
        }
        if errors:
            summary["errors"] = errors
        if kind and not recursive:
            return inventory.get(kind, [])
        return summary
  • Mapping of supported model kinds to the corresponding ComfyUI node classes and input field names used in model listing; a sketch of the helper that consumes it follows this list.
    MODEL_KIND_MAP: dict[str, tuple[str, str]] = {
        "checkpoints": ("CheckpointLoaderSimple", "ckpt_name"),
        "loras": ("LoraLoader", "lora_name"),
        "vae": ("VAELoader", "vae_name"),
        "clip": ("CLIPLoader", "clip_name"),
        "controlnet": ("ControlNetLoader", "control_net_name"),
    }
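
For reference, the summary dict returned in recursive mode (or when no kind is given) has the shape below; the keys come from the implementation above, while the values here are illustrative. When a single kind is requested without recursion, callers instead receive the flat list stored under that kind.

    example_summary = {
        "base_url": "http://127.0.0.1:8188",
        "kinds": ["checkpoints", "loras", "vae", "clip", "controlnet"],
        "counts": {"checkpoints": 2, "loras": 1, "vae": 0, "clip": 0, "controlnet": 0},
        "retrieved_at": "2024-01-01T00:00:00+00:00",
        "models": {
            "checkpoints": ["sd_xl_base_1.0.safetensors", "v1-5-pruned.safetensors"],
            "loras": ["detail_tweaker.safetensors"],
            "vae": [],
            "clip": [],
            "controlnet": [],
        },
    }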
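
The helper _choices_for_kind is referenced above but not shown. A minimal sketch of what it could look like, assuming self._client is an httpx.AsyncClient rooted at the ComfyUI base URL and relying on ComfyUI's /object_info/<node_class> response shape:

    async def _choices_for_kind(self, model_kind: str) -> list[str]:
        """Read the enum choices for a loader input from /object_info."""
        node_class, field_name = MODEL_KIND_MAP[model_kind]
        response = await self._client.get(f"/object_info/{node_class}")
        response.raise_for_status()
        node_info = response.json()[node_class]
        # ComfyUI encodes enum inputs as [choices, options?]; the first
        # element holds the selectable file names.
        field_spec = node_info["input"]["required"][field_name]
        return [name for name in field_spec[0] if isinstance(name, str)]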
