Comfy MCP Server

by lalanikarim

A server using FastMCP framework to generate images based on prompts via a remote Comfy server.

Overview

This script sets up a server using the FastMCP framework to generate images based on prompts using a specified workflow. It interacts with a remote Comfy server to submit prompts and retrieve generated images.

Prerequisites

  • uv package and project manager for Python.
  • Workflow file exported from ComfyUI. This code includes a sample Flux-Dev-ComfyUI-Workflow.json, which is included only as a reference. You will need to export your own workflow and set the environment variables accordingly.

You can install the required packages for local development:

uvx mcp[cli]

Configuration

Set the following environment variables:

  • COMFY_URL to point to your Comfy server URL.
  • COMFY_WORKFLOW_JSON_FILE to point to the absolute path of the JSON file of the API export of your ComfyUI workflow.
  • PROMPT_NODE_ID to the id of the text prompt node.
  • OUTPUT_NODE_ID to the id of the output node with the final image.
  • OUTPUT_MODE to either url or file to select the desired output.

Optionally, if you have an Ollama server running, you can connect to it for prompt generation.

  • OLLAMA_API_BASE to the URL where Ollama is running.
  • PROMPT_LLM to the name of the model hosted on Ollama for prompt generation.

Example:

export COMFY_URL=http://your-comfy-server-url:port
export COMFY_WORKFLOW_JSON_FILE=/path/to/the/comfyui_workflow_export.json
export PROMPT_NODE_ID=6 # use the correct node id here
export OUTPUT_NODE_ID=9 # use the correct node id here
export OUTPUT_MODE=file
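For illustration, a launcher or wrapper script could verify that the required variables are present before starting the server. This is a hypothetical helper, not part of the actual server code (the server performs its own checks):

```python
import os

# Required by the server per the Configuration section above.
REQUIRED = [
    "COMFY_URL",
    "COMFY_WORKFLOW_JSON_FILE",
    "PROMPT_NODE_ID",
    "OUTPUT_NODE_ID",
    "OUTPUT_MODE",
]

def missing_vars(env):
    """Return the required variable names that are absent (or empty)
    in the given mapping; pass os.environ in real use."""
    return [name for name in REQUIRED if not env.get(name)]

# Usage: missing = missing_vars(os.environ)
# An empty list means the configuration is complete.
```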

Usage

Comfy MCP Server can be launched by the following command:

uvx comfy-mcp-server

Example Claude Desktop Config

{
  "mcpServers": {
    "Comfy MCP Server": {
      "command": "/path/to/uvx",
      "args": ["comfy-mcp-server"],
      "env": {
        "COMFY_URL": "http://your-comfy-server-url:port",
        "COMFY_WORKFLOW_JSON_FILE": "/path/to/the/comfyui_workflow_export.json",
        "PROMPT_NODE_ID": "6",
        "OUTPUT_NODE_ID": "9",
        "OUTPUT_MODE": "file"
      }
    }
  }
}

Functionality

generate_image(prompt: str, ctx: Context) -> Image | str

This function generates an image using a specified prompt. It follows these steps:

  1. Checks if all the environment variables are set.
  2. Loads a prompt template from a JSON file.
  3. Submits the prompt to the Comfy server.
  4. Polls the server for the status of the prompt processing.
  5. Retrieves and returns the generated image once it's ready.
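The steps above can be sketched against ComfyUI's standard HTTP API (the /prompt and /history/{id} endpoints). The node IDs, server URL, and the `inputs`/`text` key layout below are assumptions based on a typical Comfy API workflow export, not the server's actual implementation:

```python
import json
import time
import urllib.request

COMFY_URL = "http://localhost:8188"   # assumption: your Comfy server URL
PROMPT_NODE_ID = "6"                  # assumption: id of the text prompt node
OUTPUT_NODE_ID = "9"                  # assumption: id of the output node

def inject_prompt(workflow, node_id, prompt):
    """Set the text input of the prompt node in an API-export workflow
    dict (deep-copied so the loaded template is left untouched)."""
    wf = json.loads(json.dumps(workflow))
    wf[node_id]["inputs"]["text"] = prompt
    return wf

def submit_and_poll(workflow, prompt, interval=1.0):
    """Steps 3-5: submit the workflow to /prompt, then poll /history
    until the prompt id appears with outputs for the output node."""
    wf = inject_prompt(workflow, PROMPT_NODE_ID, prompt)
    req = urllib.request.Request(
        f"{COMFY_URL}/prompt",
        data=json.dumps({"prompt": wf}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        prompt_id = json.loads(resp.read())["prompt_id"]
    while True:
        with urllib.request.urlopen(f"{COMFY_URL}/history/{prompt_id}") as resp:
            history = json.loads(resp.read())
        if prompt_id in history:
            return history[prompt_id]["outputs"][OUTPUT_NODE_ID]
        time.sleep(interval)
```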

generate_prompt(topic: str, ctx: Context) -> str

This function generates a comprehensive image generation prompt from a specified topic.
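The server implements this with a LangChain chain backed by langchain-ollama. As an illustration only, an equivalent direct call to Ollama's /api/generate REST endpoint might look like the following; the model name, API base, and instruction wording are all assumptions:

```python
import json
import urllib.request

OLLAMA_API_BASE = "http://localhost:11434"  # assumption: OLLAMA_API_BASE value
PROMPT_LLM = "llama3"                       # assumption: PROMPT_LLM model name

def build_meta_prompt(topic):
    """Hypothetical instruction wrapping the topic; the server's actual
    template lives inside its LangChain chain."""
    return (
        "Write a detailed, comprehensive image generation prompt "
        f"for the following topic: {topic}"
    )

def generate_prompt(topic):
    """Send the meta-prompt to Ollama's non-streaming generate endpoint
    and return the model's text response."""
    body = json.dumps({
        "model": PROMPT_LLM,
        "prompt": build_meta_prompt(topic),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        f"{OLLAMA_API_BASE}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```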

Dependencies

  • mcp: For setting up the FastMCP server.
  • json: For handling JSON data.
  • urllib: For making HTTP requests.
  • time: For adding delays in polling.
  • os: For accessing environment variables.
  • langchain: For creating a simple LLM prompt chain that generates an image generation prompt from a topic.
  • langchain-ollama: For the Ollama-specific modules for LangChain.

License

This project is licensed under the MIT License - see the LICENSE file for details.
