
newbie image exp - ksampler "pooled_output"

Open manassm opened this issue 4 months ago • 6 comments

I've been trying to run newbie image exp 0.1 on comfy desktop but never succeeded. Below are my attempted workflow, error log, and some screenshot for reference to my setup. Any help would be appreciated. Thank you very much.

newbie wip.json error log.txt


manassm avatar Dec 19 '25 00:12 manassm

Use the UNet loader in the NewBie nodes.

SuLU-K avatar Dec 19 '25 02:12 SuLU-K

@SuLU-K Thank you, but it gives me another error:

(screenshot of the error attached)

manassm avatar Dec 19 '25 04:12 manassm


Flash-attention and Triton are required for the newbie model. You can find flash-attention wheel files at https://github.com/kingbri1/flash-attention; make sure the wheel matches your CUDA, PyTorch, and Python versions.
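A quick way to read off the values that a wheel filename must match (a sketch; the torch import only succeeds when you run it with ComfyUI's own python.exe):

```python
import platform
import sys

# Wheel filenames on the flash-attention releases page encode the CUDA,
# PyTorch, and Python versions, e.g.
# flash_attn-2.8.3+cu128torch2.8.0cxx11abiFALSE-cp312-cp312-win_amd64.whl
# Print the local versions so you can match them against a wheel name.
print("Python tag:", "cp%d%d" % sys.version_info[:2])
print("Platform:  ", platform.system(), platform.machine())

try:
    import torch  # only importable inside the ComfyUI venv
    print("PyTorch:   ", torch.__version__)
    print("CUDA:      ", torch.version.cuda)  # e.g. '12.8' -> a cu128 wheel
except ImportError:
    print("PyTorch not importable - run this with ComfyUI's python.exe")
```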

SuLU-K avatar Dec 19 '25 04:12 SuLU-K

Thank you, but installing flash attention fails for me. Attached is the error log. Please let me know if I'm doing anything wrong. Thanks.

flash attention error.txt

manassm avatar Dec 19 '25 05:12 manassm


Find the wheel that matches your ComfyUI environment directly in the releases and install it through pip. You can refer to the "Install necessary components" chapter of the newbie official guide (https://ai.feishu.cn/wiki/NZl9wm7V1iuNzmkRKCUcb1USnsh).

SuLU-K avatar Dec 19 '25 05:12 SuLU-K

Thank you for your help. I've been trying to install flash attention in my Comfy Desktop (.exe) environment but keep failing. Below is an AI summary of my current situation:

[Bug] ImportError: DLL load failed for flash_attn_2_cuda breaking core nodes (ComfyUI Desktop App)

System Information

ComfyUI Version: 0.4.0 (Desktop App)

OS: Windows 11

GPU: NVIDIA GeForce RTX 4080 (16GB VRAM)

Python Version: 3.12.10

PyTorch Version: 2.8.0+cu129

Python Executable: C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe

Describe the bug

After attempting to set up requirements for a new model, I am encountering import failures for core ComfyUI nodes (nodes_canny.py and nodes_morphology.py). The root cause appears to be an ImportError in the flash_attn dependency, which is triggered when kornia is imported by these nodes.

Error Log

Traceback (most recent call last):
  ...
  File "C:\Users\SEPHiA\Documents\ComfyUI\.venv\Lib\site-packages\kornia\feature\lightglue.py", line 48, in <module>
    from flash_attn.modules.mha import FlashCrossAttention
  File "C:\Users\SEPHiA\Documents\ComfyUI\.venv\Lib\site-packages\flash_attn\__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "C:\Users\SEPHiA\Documents\ComfyUI\.venv\Lib\site-packages\flash_attn\flash_attn_interface.py", line 15, in <module>
    import flash_attn_2_cuda as flash_attn_gpu
ImportError: DLL load failed while importing flash_attn_2_cuda: 지정된 프로시저를 찾을 수 없습니다. (The specified procedure could not be found.)

IMPORT FAILED: nodes_canny.py
IMPORT FAILED: nodes_morphology.py

Additional Context

I am using the ComfyUI Desktop App, not the Portable version.

The environment is the internal .venv created by the Desktop installer.

My system has CUDA 12.1 installed globally, but the ComfyUI Desktop log shows PyTorch 2.8.0+cu129.

It seems kornia has a hard dependency chain leading to flash_attn, and because the flash_attn binary is incompatible with the current Desktop App environment, it prevents the standard comfy_extras nodes from loading.
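For illustration, the failure mode described above can be contained with a defensive import guard; this is a hypothetical sketch, not kornia's or ComfyUI's actual code:

```python
# Hypothetical sketch: treat flash_attn as optional so a broken native
# extension degrades gracefully instead of breaking module import.
try:
    from flash_attn.modules.mha import FlashCrossAttention  # noqa: F401
    HAS_FLASH_ATTN = True
except ImportError:
    # Catches both "not installed" (ModuleNotFoundError) and
    # "installed but DLL load failed" (plain ImportError).
    HAS_FLASH_ATTN = False

def attention_backend() -> str:
    """Pick a backend name based on availability (illustrative only)."""
    return "flash" if HAS_FLASH_ATTN else "sdpa"

print("backend:", attention_backend())
```

The key detail is catching ImportError rather than only ModuleNotFoundError: a DLL load failure raises a plain ImportError even though the package is installed.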

manassm avatar Dec 19 '25 20:12 manassm

This error is usually caused by a binary incompatibility between flash-attn and your current PyTorch / CUDA / build environment.

Based on your environment, you should use the following wheel:

flash_attn-2.8.3+cu128torch2.8.0cxx11abiFALSE-cp312-cp312-win_amd64.whl

Please make sure you install it using ComfyUI’s own Python environment, not the system Python. If you have already installed an incompatible version, uninstall it first. Then install the correct wheel:

C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe -m pip install flash_attn-2.8.3+cu128torch2.8.0cxx11abiFALSE-cp312-cp312-win_amd64.whl

SuLU-K avatar Dec 20 '25 06:12 SuLU-K

Thanks for the help. I tried, but it gives me a slightly different error:

NewBieCLIPTextEncode FlashAttention is not installed. To proceed with training, please install FlashAttention. For inference, you have two options: either install FlashAttention or disable it by setting use_flash_attn=False when loading the model.

(I installed via: C:\Users\SEPHiA\Documents\ComfyUI.venv\Scripts\python.exe -m pip install flash_attn-2.8.3+cu128torch2.8.0cxx11abiFALSE-cp312-cp312-win_amd64.whl)

error log.txt

manassm avatar Dec 20 '25 06:12 manassm

Thanks for the clarification. Just to note: on my machine with the same ComfyUI setup (PyTorch 2.8.0 + cu129), the cu128 FlashAttention wheel works correctly, so this doesn’t look like a strict cu129/cu128 incompatibility.

Since the error is raised by a runtime check inside the Jina “flash” implementation, it usually means FlashAttention is not being detected as available in the environment ComfyUI is actually running in.

Could you please double-check:

  1. FlashAttention is installed in the exact Python/venv used by ComfyUI:
C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe -m pip show flash-attn
C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe -c "import flash_attn; print(flash_attn.__version__)"
  2. You downloaded both CLIP models exactly as described in the official guide (Jina CLIP + Gemma), with no missing or renamed files.
  3. ComfyUI-Newbie-Nodes is updated to the latest version, as older versions did not always handle the FlashAttention fallback correctly.

SuLU-K avatar Dec 20 '25 07:12 SuLU-K

Thank you for your prompt response:

  1. It seems flash attention is installed, but Comfy Desktop still doesn't recognize it; rebooting the PC didn't help either:

Microsoft Windows [Version 10.0.22631.6199]
(c) Microsoft Corporation. All rights reserved.

C:\Users\SEPHiA>C:\Users\SEPHiA\Documents\ComfyUI.venv\Scripts\python.exe -m pip show flash-attn
Name: flash_attn
Version: 2.8.3
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Home-page: https://github.com/Dao-AILab/flash-attention
Author: Tri Dao
Author-email: [email protected]
License:
Location: C:\Users\SEPHiA\Documents\ComfyUI.venv\Lib\site-packages
Requires: einops, torch
Required-by:

C:\Users\SEPHiA>C:\Users\SEPHiA\Documents\ComfyUI.venv\Scripts\python.exe -c "import flash_attn; print(flash_attn.__version__)"
2.8.3

C:\Users\SEPHiA>

comfy desktop log.txt

  2. My Jina and Gemma models were actually under newbie-nodes/models, but moving them to the ComfyUI base model folder didn't seem to help. (One difference from the guide is that I batch-downloaded with git clone instead of downloading manually, but I renamed the folder, and all the files in it seem to have the same names as the originals, so I don't think it should matter.)

  3. I just updated the node (Manager install was not available on Comfy Desktop, so I just did git clone https://github.com/NewBieAI-Lab/ComfyUI-Newbie-Nodes in the custom_nodes folder, and there was no requirements.txt to install), but it didn't help.


Thank you for your help. Please let me know if I misunderstood anything.

manassm avatar Dec 20 '25 08:12 manassm

It looks like FlashAttention is installed, but not in the same venv that ComfyUI Desktop is actually using. Your pip show is using ...\ComfyUI.venv\Scripts\python.exe, while the Desktop log shows ComfyUI runs with ...\ComfyUI\.venv\Scripts\python.exe.

Please run the checks with the exact python.exe from the log:

C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe -m pip show flash-attn
C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe -c "import flash_attn; print(flash_attn.__version__)"
C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe -c "import flash_attn_2_cuda; print('flash_attn_2_cuda OK')"

If pip show returns nothing there, install the wheel using that same python.exe.
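As an illustrative check, a small script run with the python.exe from the Desktop log can confirm that the interpreter and the package live in the same environment (flash_attn here is just the package being debugged; any name works):

```python
# Illustrative venv-mismatch check: print which interpreter is running
# and where (or whether) it can resolve a given package.
import importlib.util
import sys

print("interpreter:", sys.executable)
print("site prefix:", sys.prefix)

spec = importlib.util.find_spec("flash_attn")
if spec is None:
    print("flash_attn NOT visible to this interpreter - wrong venv?")
else:
    print("flash_attn found at:", spec.origin)
```

If the interpreter path printed here differs from the one in the Desktop log, the package was installed into the wrong environment.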

SuLU-K avatar Dec 20 '25 08:12 SuLU-K

The first line passes, but the second and third lines don't seem to pass for some reason. I tried installing again, but it didn't help. I guess that ComfyUI.venv was a typo - my bad.

cmd log.txt

manassm avatar Dec 20 '25 08:12 manassm

Your pip show flash-attn is fine, but the real problem is here: ImportError: DLL load failed while importing flash_attn_2_cuda

This means FlashAttention is installed, but the native CUDA extension can’t be loaded on Windows (most commonly missing MSVC runtime DLLs).

Try installing the Microsoft Visual C++ Redistributable 2015–2022 (x64) and rebooting first, then re-test.

SuLU-K avatar Dec 20 '25 08:12 SuLU-K

I checked it, and you need to install some Windows components.

I recommend installing the MSVC toolchain and Windows SDK via Visual Studio Installer:

  1. Open Visual Studio Installer → Modify → Individual components
  2. Search msvc and install the latest MSVC v143 (VS 2022) C++ x64/x86 build tools
  3. Search windows and install the latest Windows 10/11 SDK
  4. Apply changes and reboot
  5. After reboot, re-test the failing import

This often resolves DLL load failed issues for FlashAttention on Windows.
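One way to tell these cases apart programmatically (an illustrative sketch, not part of any official tooling): a missing package raises ModuleNotFoundError, while a broken native extension raises a plain ImportError, which points at runtime DLLs rather than pip.

```python
# Distinguish "package missing" from "native extension failed to load".
def diagnose(mod: str) -> str:
    try:
        __import__(mod)
        return "ok"
    except ModuleNotFoundError:
        return "not installed"
    except ImportError as e:
        # e.g. "DLL load failed while importing flash_attn_2_cuda"
        return f"installed but failed to load: {e}"

for name in ("flash_attn", "flash_attn_2_cuda"):
    print(name, "->", diagnose(name))
```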

SuLU-K avatar Dec 20 '25 08:12 SuLU-K

All of them seem to be installed, but it still doesn't work.

I initially had VS Community 2022 and VS Build Tools 2019, with both of them showing the latest MSVC and Windows 11 (and Windows 10) SDKs installed. That didn't work, so I tried installing VS Community 2026, which didn't work either.

I'm not sure if it's relevant but on current version of comfy desktop there are multiple nodes that don't work; here is my issue FYI: https://github.com/Comfy-Org/desktop/issues/1485#issuecomment-3654796435

Thank you for your help regardless. I hope I'm not putting too much pressure on you.


manassm avatar Dec 20 '25 10:12 manassm

With the current setup, FlashAttention itself should be able to work on Windows, and the installation you have now looks correct.

One important clarification: the error you’re seeing in ComfyUI is raised explicitly in the model code (via a raise RuntimeError(...)), and does not necessarily mean FlashAttention itself is broken. In other words, this can also happen if the runtime checks inside the node / model wrapper fail, even when FlashAttention is present.

To help narrow this down, could you confirm whether the following dependencies mentioned in the official guidance are installed?

- ninja
- triton-windows
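A small sketch for checking those two packages from ComfyUI's interpreter; note that the pip distribution name (triton-windows) differs from its import name (triton), so importlib.metadata is queried by distribution name rather than trying an import:

```python
# Illustrative check: query installed distributions by their pip names.
from importlib.metadata import PackageNotFoundError, version

for dist in ("ninja", "triton-windows"):
    try:
        print(f"{dist}: {version(dist)}")
    except PackageNotFoundError:
        print(f"{dist}: NOT installed in this environment")
```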

SuLU-K avatar Dec 20 '25 11:12 SuLU-K

I guess missing Triton was causing the problem. Running the following commands solved it. Thank you for all your help.

Microsoft Windows [Version 10.0.22631.6199]
(c) Microsoft Corporation. All rights reserved.

C:\Users\SEPHiA>cd "C:\Users\SEPHiA\Documents\ComfyUI"

C:\Users\SEPHiA\Documents\ComfyUI>.\.venv\Scripts\python.exe -m pip install ninja
Requirement already satisfied: ninja in c:\users\sephia\documents\comfyui\.venv\lib\site-packages (1.13.0)

[notice] A new release of pip is available: 25.0.1 -> 25.3
[notice] To update, run: C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe -m pip install --upgrade pip

C:\Users\SEPHiA\Documents\ComfyUI>.\.venv\Scripts\python.exe -m pip install triton-windows
Collecting triton-windows
  Downloading triton_windows-3.5.1.post22-cp312-cp312-win_amd64.whl.metadata (1.8 kB)
Downloading triton_windows-3.5.1.post22-cp312-cp312-win_amd64.whl (46.5 MB)
   ---------------------------------------- 46.5/46.5 MB 92.4 MB/s eta 0:00:00
Installing collected packages: triton-windows
Successfully installed triton-windows-3.5.1.post22

[notice] A new release of pip is available: 25.0.1 -> 25.3
[notice] To update, run: C:\Users\SEPHiA\Documents\ComfyUI\.venv\Scripts\python.exe -m pip install --upgrade pip

C:\Users\SEPHiA\Documents\ComfyUI>.\.venv\Scripts\python.exe -m pip list | findstr "ninja triton"
ninja          1.13.0
triton-windows 3.5.1.post22

C:\Users\SEPHiA\Documents\ComfyUI>

manassm avatar Dec 20 '25 12:12 manassm