# Comfy MCP Server

A server using the FastMCP framework to generate images based on prompts via a remote Comfy server.
## Overview
This script sets up a server using the FastMCP framework to generate images based on prompts using a specified workflow. It interacts with a remote Comfy server to submit prompts and retrieve generated images.
## Prerequisites

- `uv` package and project manager for Python.
- Workflow file exported from Comfy UI. This code includes a sample `Flux-Dev-ComfyUI-Workflow.json`, which is only used here as a reference. You will need to export your own workflow and set the environment variables accordingly.

You can install the required packages for local development:
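A minimal sketch, assuming the package set implied by the Dependencies section below (adjust names to match your setup):

```bash
# Assumed package set, based on the Dependencies section; adjust to your project.
uv add "mcp[cli]" langchain langchain-ollama
```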
## Configuration

Set the following environment variables:

- `COMFY_URL` to point to your Comfy server URL.
- `COMFY_WORKFLOW_JSON_FILE` to point to the absolute path of the API export JSON file for the ComfyUI workflow.
- `PROMPT_NODE_ID` to the ID of the text prompt node.
- `OUTPUT_NODE_ID` to the ID of the output node with the final image.
- `OUTPUT_MODE` to either `url` or `file` to select the desired output.

Optionally, if you have an Ollama server running, you can connect to it for prompt generation:

- `OLLAMA_API_BASE` to the URL where Ollama is running.
- `PROMPT_LLM` to the name of the model hosted on Ollama for prompt generation.

Example:
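A sketch of a shell configuration; all values are placeholders to be replaced with your own server URL, workflow path, node IDs, and model name:

```bash
# Placeholder values; replace with your own Comfy server URL, workflow path, and node IDs.
export COMFY_URL=http://localhost:8188
export COMFY_WORKFLOW_JSON_FILE=/path/to/Flux-Dev-ComfyUI-Workflow.json
export PROMPT_NODE_ID=6
export OUTPUT_NODE_ID=9
export OUTPUT_MODE=url

# Optional, only if you want prompt generation via Ollama.
export OLLAMA_API_BASE=http://localhost:11434
export PROMPT_LLM=llama3.2
```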
## Usage

Comfy MCP Server can be launched with the following command:
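Assuming the package is published under the name `comfy-mcp-server` (an assumption based on the project name), a typical launch via `uv` would be:

```bash
# Assumes the published package name is comfy-mcp-server.
uvx comfy-mcp-server
```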
### Example Claude Desktop Config
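A minimal sketch of a Claude Desktop entry, assuming the server is launched via `uvx` as above and configured entirely through the environment variables described earlier (all values are placeholders):

```json
{
  "mcpServers": {
    "Comfy MCP Server": {
      "command": "uvx",
      "args": ["comfy-mcp-server"],
      "env": {
        "COMFY_URL": "http://localhost:8188",
        "COMFY_WORKFLOW_JSON_FILE": "/path/to/Flux-Dev-ComfyUI-Workflow.json",
        "PROMPT_NODE_ID": "6",
        "OUTPUT_NODE_ID": "9",
        "OUTPUT_MODE": "url",
        "OLLAMA_API_BASE": "http://localhost:11434",
        "PROMPT_LLM": "llama3.2"
      }
    }
  }
}
```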
## Functionality

### `generate_image(prompt: str, ctx: Context) -> Image | str`

This function generates an image using a specified prompt. It follows these steps (a rough sketch of the flow appears after this list):

1. Checks that all required environment variables are set.
2. Loads a prompt template from a JSON file.
3. Submits the prompt to the Comfy server.
4. Polls the server for the status of the prompt processing.
5. Retrieves and returns the generated image once it's ready.
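The sketch below illustrates this submit-and-poll flow against a ComfyUI server, using only the standard-library modules listed under Dependencies. It is illustrative rather than the server's actual implementation: the `/prompt`, `/history/<id>`, and `/view` routes are standard ComfyUI API endpoints, but error handling is simplified and the `submit_and_poll` helper is a hypothetical name.

```python
import json
import os
import time
import urllib.parse
import urllib.request

COMFY_URL = os.environ["COMFY_URL"]


def submit_and_poll(workflow: dict, prompt_node_id: str, output_node_id: str, prompt: str) -> bytes:
    """Illustrative sketch: submit a workflow to ComfyUI and poll until the image is ready."""
    # Inject the user's prompt into the text prompt node of the exported workflow.
    workflow[prompt_node_id]["inputs"]["text"] = prompt

    # Submit the workflow; ComfyUI responds with a prompt_id used for tracking.
    request = urllib.request.Request(
        f"{COMFY_URL}/prompt",
        data=json.dumps({"prompt": workflow}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        prompt_id = json.loads(response.read())["prompt_id"]

    # Poll the history endpoint until the prompt shows up as completed.
    while True:
        with urllib.request.urlopen(f"{COMFY_URL}/history/{prompt_id}") as response:
            history = json.loads(response.read())
        if prompt_id in history:
            break
        time.sleep(1)

    # Fetch the image produced by the output node.
    image_info = history[prompt_id]["outputs"][output_node_id]["images"][0]
    query = urllib.parse.urlencode(image_info)
    with urllib.request.urlopen(f"{COMFY_URL}/view?{query}") as response:
        return response.read()
```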
### `generate_prompt(topic: str, ctx: Context) -> str`

This function generates a comprehensive image generation prompt from a specified topic.
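A minimal sketch of how such a prompt chain could be wired up with LangChain and Ollama, using the `OLLAMA_API_BASE` and `PROMPT_LLM` variables described above; the template text and the `generate_prompt_sketch` helper are illustrative, not the server's actual code:

```python
import os

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama

# Illustrative template; the wording used by the actual server may differ.
template = ChatPromptTemplate.from_template(
    "Write a detailed, comprehensive image generation prompt about the following topic: {topic}"
)

llm = ChatOllama(
    model=os.environ["PROMPT_LLM"],          # name of the model hosted on Ollama
    base_url=os.environ["OLLAMA_API_BASE"],  # e.g. http://localhost:11434
)

# Simple LLM prompt chain: template -> model -> plain string output.
chain = template | llm | StrOutputParser()


def generate_prompt_sketch(topic: str) -> str:
    """Return a generated image prompt for the given topic."""
    return chain.invoke({"topic": topic})
```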
## Dependencies

- `mcp`: For setting up the FastMCP server.
- `json`: For handling JSON data.
- `urllib`: For making HTTP requests.
- `time`: For adding delays in polling.
- `os`: For accessing environment variables.
- `langchain`: For creating a simple LLM prompt chain to generate an image generation prompt from a topic.
- `langchain-ollama`: For Ollama-specific modules for LangChain.
## License
This project is licensed under the MIT License - see the LICENSE file for details.