
LoRAAdapter has no len()

Open MagicL911 opened this issue 8 months ago • 2 comments

Expected Behavior

What's wrong? (error screenshot attached) The same LoRA can still be used in v0.3.29.

Actual Behavior

What's wrong? (error screenshot attached) The same LoRA can still be used in v0.3.29.

Steps to Reproduce

Run any LoRA on a ComfyUI version newer than v0.3.29.

Debug Logs

# ComfyUI Error Report
## Error Details
- **Node ID:** 260
- **Node Type:** KSampler //Inspire
- **Exception Type:** TypeError
- **Exception Message:** object of type 'LoRAAdapter' has no len()
## Stack Trace

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\execution.py", line 349, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\execution.py", line 224, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\execution.py", line 196, in _map_node_over_list
    process_inputs(input_dict, i)

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\execution.py", line 185, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\comfyui-Inspire-Pack\inspire\a1111_compat.py", line 157, in doit
    return (inspire_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise, noise_mode,
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\comfyui-Inspire-Pack\inspire\a1111_compat.py", line 116, in inspire_ksampler
    raise e

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\comfyui-Inspire-Pack\inspire\a1111_compat.py", line 104, in inspire_ksampler
    samples = common.impact_sampling(
              ^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\comfyui-Inspire-Pack\inspire\libs\common.py", line 17, in impact_sampling
    return nodes.NODE_CLASS_MAPPINGS['RegionalSampler'].separated_sample(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\special_samplers.py", line 312, in separated_sample
    return separated_sample(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 180, in separated_sample
    res = sample_with_custom_noise(model, add_noise, seed, cfg, positive, negative, impact_sampler, sigmas, latent_image, noise=noise, callback=callback)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\impact_sampling.py", line 124, in sample_with_custom_noise
    samples = comfy.sample.sample_custom(model, noise, cfg, sampler, sigmas, positive, negative, latent_image,
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\comfyui-diffusion-cg\recenter.py", line 45, in sample_center
    return SAMPLE(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\sample.py", line 50, in sample_custom
    samples = comfy.samplers.sample(model, noise, positive, negative, cfg, model.load_device, sampler, sigmas, model_options=model.model_options, latent_image=latent_image, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 149, in sample
    return orig_fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\samplers.py", line 1023, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\samplers.py", line 1008, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\patcher_extension.py", line 111, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\samplers.py", line 963, in outer_sample
    self.inner_model, self.conds, self.loaded_models = comfy.sampler_helpers.prepare_sampling(self.model_patcher, noise.shape, self.conds, self.model_options)
                                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\sampler_helpers.py", line 113, in prepare_sampling
    return executor.execute(model, noise_shape, conds, model_options=model_options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\patcher_extension.py", line 111, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\sampler_helpers.py", line 122, in _prepare_sampling
    comfy.model_management.load_models_gpu([model] + models, memory_required=memory_required, minimum_memory_required=minimum_memory_required)

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\model_management.py", line 625, in load_models_gpu
    loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\model_management.py", line 444, in model_load
    self.model_use_more_vram(use_more_vram, force_patch_weights=force_patch_weights)

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\model_management.py", line 473, in model_use_more_vram
    return self.model.partially_load(self.device, extra_memory, force_patch_weights=force_patch_weights)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\model_patcher.py", line 832, in partially_load
    raise e

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\model_patcher.py", line 829, in partially_load
    self.load(device_to, lowvram_model_memory=current_used + extra_memory, force_patch_weights=force_patch_weights, full_load=full_load)

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\model_patcher.py", line 664, in load
    self.patch_weight_to_device("{}.{}".format(n, param), device_to=device_to)

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\comfy\model_patcher.py", line 561, in patch_weight_to_device
    out_weight = comfy.lora.calculate_weight(self.patches[key], temp_weight, key)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\custom_nodes\Fooocus_Nodes\py\modules\patch.py", line 126, in calculate_weight_patched
    if len(v) == 1:
       ^^^^^^


## System Information
- **ComfyUI Version:** 0.3.35
- **Arguments:** C:\Aibak\ComfyUI-aki-v1.6\ComfyUI\main.py --auto-launch --preview-method auto --disable-cuda-malloc
- **OS:** nt
- **Python Version:** 3.11.9 (tags/v3.11.9:de54cf5, Apr  2 2024, 10:12:12) [MSC v.1938 64 bit (AMD64)]
- **Embedded Python:** false
- **PyTorch Version:** 2.5.1+cu124
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 3080 Ti Laptop GPU : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 17179344896
  - **VRAM Free:** 14051589674
  - **Torch VRAM Total:** 1845493760
  - **Torch VRAM Free:** 8011306

Other

No response

MagicL911 avatar May 23 '25 01:05 MagicL911

It might be that the code for Fooocus_Nodes is a bit outdated.

HelloClyde avatar May 23 '25 02:05 HelloClyde

> It might be that the code for Fooocus_Nodes is a bit outdated.

IC-Light and Fooocus nodes: if a LoRA is used together with them, this problem occurs. It didn't happen before. Has something been changed in ComfyUI itself?

cardenluo avatar May 23 '25 15:05 cardenluo

In my case the problem comes from the IC-Light nodes by Kijai, and that's a big problem since my workflow relies heavily on his nodes. Also, the problem comes not from CLIP but from the KSampler. Any solution?

Gavr728 avatar May 30 '25 17:05 Gavr728

Mine too, this is horrible! I wish someone would just fix the old Lora Stacker.

DarqueLilly avatar May 30 '25 22:05 DarqueLilly

> In my case the problem comes from the IC-Light nodes by Kijai, and that's a big problem since my workflow relies heavily on his nodes. Also, the problem comes not from CLIP but from the KSampler. Any solution?

This is the solution, which may require modifying some code.

https://github.com/kijai/ComfyUI-IC-Light/issues/79#issuecomment-2846137173
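
The linked fix is not reproduced here; below is only a minimal sketch of the kind of compatibility guard it points to. The stack trace above fails at `len(v)` because newer ComfyUI versions can store LoRA patches as `LoRAAdapter` objects (which have no `__len__`) instead of plain tuples/lists, so a custom `calculate_weight` override can defer those entries to the built-in `comfy.lora.calculate_weight` and only run its legacy logic on tuple/list patches. The helper name `calculate_weight_safe`, the assumption that the patch payload sits at `p[1]`, and the call back into the node pack's original `calculate_weight_patched` are illustrative, not the exact upstream code.

```python
# Minimal compatibility sketch (assumed names and patch layout, not the exact
# fix from the linked issue): route adapter-style patches to ComfyUI's built-in
# weight calculation and keep the legacy path for plain tuple/list patches.
import comfy.lora


def calculate_weight_safe(patches, weight, key):
    legacy = [p for p in patches if isinstance(p[1], (list, tuple))]
    adapters = [p for p in patches if not isinstance(p[1], (list, tuple))]

    if adapters:
        # The stack trace shows comfy.lora.calculate_weight(patches, weight, key)
        # is the built-in entry point; LoRAAdapter objects are handled there.
        weight = comfy.lora.calculate_weight(adapters, weight, key)

    if legacy:
        # Hypothetical call back into the node pack's original override,
        # which expects the old tuple/list patch format.
        weight = calculate_weight_patched(legacy, weight, key)

    return weight
```

Whether the patch lands in Fooocus_Nodes or IC-Light, the idea is the same: never call `len()` on a patch value before checking that it is actually a tuple or list.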

HelloClyde avatar May 31 '25 02:05 HelloClyde

Thank you guys, it works after uninstalling those nodes.

MagicL911 avatar Jun 06 '25 06:06 MagicL911