# ComfyUI_MCP
This repository provides a Model Context Protocol (MCP) server that exposes local ComfyUI workflow files and the remote ComfyUI model catalog to compatible LLM clients.
## Features
- Discovers workflow `.json` files stored under the `workflows/` directory and exposes them as MCP resources.
- Tools for listing workflow metadata and reading workflow contents.
- A tool for recursively querying the ComfyUI `/api/models` endpoints so an LLM can inspect the available checkpoints, LoRAs, and other assets.
- Configuration via CLI arguments or environment variables, so the server can be launched from JSON descriptors (e.g., Cursor MCP definitions).
## Installation
Create a virtual environment (recommended) and install the package in editable mode with your preferred installer:
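For example, with the standard library `venv` module and `pip` (standard commands; nothing project-specific is assumed beyond the repository containing a `pyproject.toml`):

```shell
# Create and activate a virtual environment at the repository root
python -m venv .venv
source .venv/bin/activate

# Install the package in editable mode
pip install -e .
```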
The project also works with uv so you can install or run it without using pip directly:
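For instance (both subcommands are part of the standard `uv` CLI):

```shell
# Install into the active environment with uv's pip interface
uv pip install -e .

# Or run the entry point directly, letting uv resolve the project environment
uv run comfyui-mcp
```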
## Running the server
The installed `comfyui-mcp` entry point launches the MCP server over stdio. Common configuration options can be supplied either as CLI flags or environment variables:
| Purpose | CLI flag | Environment variable | Default |
| --- | --- | --- | --- |
| Workflow directory | | | |
| ComfyUI API base URL | | | |
| HTTP timeout (seconds) | | | |
| Log level | | | |
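A launch might then look like the following sketch. The flag and variable names here are hypothetical placeholders, not the package's actual names; consult the CLI help for the real ones:

```shell
# Hypothetical flag names shown for illustration only
comfyui-mcp --workflows-dir ./workflows --comfyui-url http://127.0.0.1:8188

# Equivalent launch via (hypothetical) environment variables
COMFYUI_MCP_WORKFLOWS_DIR=./workflows \
COMFYUI_MCP_BASE_URL=http://127.0.0.1:8188 \
comfyui-mcp
```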
Example stdio launch configuration for a Cursor MCP JSON definition:
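A minimal sketch of such a definition follows. The `mcpServers`/`command`/`env` layout is Cursor's standard MCP schema; the environment variable name is an illustrative placeholder for whichever variable the package actually reads:

```json
{
  "mcpServers": {
    "comfyui": {
      "command": "comfyui-mcp",
      "env": {
        "COMFYUI_MCP_BASE_URL": "http://127.0.0.1:8188"
      }
    }
  }
}
```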
If you prefer `uvx`, the same configuration can be expressed as:
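For example (again a sketch: `--from .` resolves the package from the current directory, and the environment variable name is an illustrative placeholder):

```json
{
  "mcpServers": {
    "comfyui": {
      "command": "uvx",
      "args": ["--from", ".", "comfyui-mcp"],
      "env": {
        "COMFYUI_MCP_BASE_URL": "http://127.0.0.1:8188"
      }
    }
  }
}
```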
When referencing the project locally with `uvx`, ensure the working directory is set to the repository root (or adjust the `--from` path accordingly) so the package can be resolved without requiring it to be published to an external index.
Place your ComfyUI workflow files in the `workflows/` directory (or whichever directory you configure) so they are available to the LLM.