[Feature request] LoHa support
When I use the LoraLoader to load a LoHa, I get some errors.
LoHa models can be found here: https://github.com/KohakuBlueleaf/LyCORIS
Greetings!
First, a bit off topic: comfyanonymous, you are doing outstanding work here. I have been using auto1111's web UI since the beginning, but even if your UI has a few fewer features (it isn't much), the additional possibilities it offers are, in my opinion, far above auto1111's web UI! Keep going, this is just fantastic!
About this feature request: I fully agree with sdbds and was about to open the same feature request.
An implementation of LoRA with Hadamard product representation (LoHa) would be just awesome. After my own tests and trainings of LoRAs, LoCons, and LoHas, my personal impression is that LoHas return the best results of these three methods.
LoHas can normally be trained with kohya-ss's sd-scripts (https://github.com/kohya-ss/sd-scripts) when adding https://github.com/KohakuBlueleaf/LyCORIS (see its README.md, "usage").
A short explanation of the algorithms is here: https://github.com/KohakuBlueleaf/LyCORIS/blob/main/Algo.md
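As far as I understand Algo.md, the core idea is that the weight delta is the elementwise (Hadamard) product of two low-rank matrix products. A rough sketch of the reconstruction (untested; the key names in the comments follow the LyCORIS format, and the alpha/rank scaling is just my reading of it):

```python
import torch

# Rough sketch of the LoHa delta-weight reconstruction as described in
# Algo.md: delta_W = (w1a @ w1b) * (w2a @ w2b), scaled by alpha / rank.
# Shapes are illustrative; names in comments are the LyCORIS key names.
out_dim, in_dim, rank = 320, 320, 8
w1a = torch.randn(out_dim, rank)   # hada_w1_a
w1b = torch.randn(rank, in_dim)    # hada_w1_b
w2a = torch.randn(out_dim, rank)   # hada_w2_a
w2b = torch.randn(rank, in_dim)    # hada_w2_b
alpha = 8.0

delta_w = (alpha / rank) * (w1a @ w1b) * (w2a @ w2b)  # Hadamard product
print(delta_w.shape)  # torch.Size([320, 320])
```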
I am quite interested in "the whole thing". I tried to take a look at this myself and compare some things, but sadly, I am completely lost.
I only know (and understand) the very basics of the algorithms and techniques behind Stable Diffusion. Sadly, it's about the same for programming in Python. I'm not a programmer, so my skills are quite limited...
I would love to help with this feature, or with the whole project in general, but due to my lack of skills, I don't even know where to start or how to improve them for this specific (SD) topic.
So, if you or someone else here has an idea of how to add LoHa support, it would be a truly great addition to your already outstanding UI!
Best regards
Darkflo264
Update: In case someone is looking at this, I thought some more info might be useful. I have added a text file with the exact error lines about keys that I get when trying to load a LoHa with the LoraLoader node. For test cases I added a second file, a zip containing a self-trained LoHa based on SD 1.5, trained on images of my dog (medium-sized, black and white, hair length above average), which gives quite good results in auto1111. The trigger word should be "Timmy" or "Timmy dog". (If the attachment doesn't work, it's also available on my Google Drive: https://drive.google.com/file/d/1E2vxUrrXYbMUk2dPtz3Yif09XJb3351y/view?usp=sharing )
I think this feature should be prioritized a bit higher; it's really useful.
That's why I tried to add more information and example data on this topic than the opener did.
Tell me if I can provide any additional info.
I added loha support; test it and tell me if it works: https://github.com/comfyanonymous/ComfyUI/commit/94a7c895f41944d60fc3f99355064fac8347b006
lora key not loaded lora_unet_down_blocks_0_downsamplers_0_conv.hada_t1
lora key not loaded lora_unet_down_blocks_0_downsamplers_0_conv.hada_t2
lora key not loaded lora_unet_down_blocks_0_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_down_blocks_0_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_down_blocks_0_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_down_blocks_0_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_down_blocks_0_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_down_blocks_0_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_down_blocks_0_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_down_blocks_0_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_down_blocks_1_downsamplers_0_conv.hada_t1
lora key not loaded lora_unet_down_blocks_1_downsamplers_0_conv.hada_t2
lora key not loaded lora_unet_down_blocks_1_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_down_blocks_1_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_down_blocks_1_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_down_blocks_1_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_down_blocks_1_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_down_blocks_1_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_down_blocks_1_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_down_blocks_1_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_down_blocks_2_downsamplers_0_conv.hada_t1
lora key not loaded lora_unet_down_blocks_2_downsamplers_0_conv.hada_t2
lora key not loaded lora_unet_down_blocks_2_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_down_blocks_2_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_down_blocks_2_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_down_blocks_2_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_down_blocks_2_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_down_blocks_2_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_down_blocks_2_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_down_blocks_2_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_down_blocks_3_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_down_blocks_3_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_down_blocks_3_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_down_blocks_3_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_down_blocks_3_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_down_blocks_3_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_down_blocks_3_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_down_blocks_3_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_mid_block_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_mid_block_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_mid_block_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_mid_block_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_mid_block_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_mid_block_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_mid_block_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_mid_block_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_0_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_0_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_0_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_0_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_0_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_0_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_0_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_0_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_0_resnets_2_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_0_resnets_2_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_0_resnets_2_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_0_resnets_2_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_0_upsamplers_0_conv.hada_t1
lora key not loaded lora_unet_up_blocks_0_upsamplers_0_conv.hada_t2
lora key not loaded lora_unet_up_blocks_1_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_1_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_1_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_1_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_1_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_1_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_1_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_1_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_1_resnets_2_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_1_resnets_2_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_1_resnets_2_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_1_resnets_2_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_1_upsamplers_0_conv.hada_t1
lora key not loaded lora_unet_up_blocks_1_upsamplers_0_conv.hada_t2
lora key not loaded lora_unet_up_blocks_2_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_2_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_2_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_2_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_2_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_2_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_2_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_2_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_2_resnets_2_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_2_resnets_2_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_2_resnets_2_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_2_resnets_2_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_2_upsamplers_0_conv.hada_t1
lora key not loaded lora_unet_up_blocks_2_upsamplers_0_conv.hada_t2
lora key not loaded lora_unet_up_blocks_3_resnets_0_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_3_resnets_0_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_3_resnets_0_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_3_resnets_0_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_3_resnets_1_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_3_resnets_1_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_3_resnets_1_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_3_resnets_1_conv2.hada_t2
lora key not loaded lora_unet_up_blocks_3_resnets_2_conv1.hada_t1
lora key not loaded lora_unet_up_blocks_3_resnets_2_conv1.hada_t2
lora key not loaded lora_unet_up_blocks_3_resnets_2_conv2.hada_t1
lora key not loaded lora_unet_up_blocks_3_resnets_2_conv2.hada_t2
Traceback (most recent call last):
File "E:\ComfyUI\execution.py", line 174, in execute
executed += recursive_execute(self.server, prompt, self.outputs, x, extra_data)
File "E:\ComfyUI\execution.py", line 54, in recursive_execute
executed += recursive_execute(server, prompt, outputs, input_unique_id, extra_data)
File "E:\ComfyUI\execution.py", line 54, in recursive_execute
executed += recursive_execute(server, prompt, outputs, input_unique_id, extra_data)
File "E:\ComfyUI\execution.py", line 63, in recursive_execute
outputs[unique_id] = getattr(obj, obj.FUNCTION)(**input_data_all)
File "E:\ComfyUI\nodes.py", line 685, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "E:\ComfyUI\nodes.py", line 616, in common_ksampler
model_management.load_model_gpu(model)
File "E:\ComfyUI\comfy\model_management.py", line 127, in load_model_gpu
raise e
File "E:\ComfyUI\comfy\model_management.py", line 124, in load_model_gpu
real_model = model.patch_model()
File "E:\ComfyUI\comfy\sd.py", line 315, in patch_model
weight += (alpha * torch.mm(w1a.float(), w1b.float()) * torch.mm(w2a.float(), w2b.float())).reshape(weight.shape).type(weight.dtype).to(weight.device)
RuntimeError: mat1 and mat2 shapes cannot be multiplied (1x320 and 1x320)
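For context: the hada_t1 / hada_t2 keys listed above are the optional Tucker-decomposition factors that LoHa stores for conv layers. They are 4D tensors, so the plain torch.mm path in sd.py can't handle them. A rough sketch of how the full conv delta might be rebuilt from them (the einsum pattern follows LyCORIS's make_weight_cp; the shapes and the alpha/rank scaling are illustrative assumptions, not taken from ComfyUI's code):

```python
import torch

# Sketch of rebuilding a LoHa conv delta when the file carries the optional
# Tucker factors hada_t1 / hada_t2 (4D) next to the hada_w*_a / hada_w*_b
# matrices. The einsum pattern mirrors LyCORIS's make_weight_cp; shapes
# and the alpha / rank scaling are illustrative.
rank, out_dim, in_dim, k = 8, 320, 320, 3
t1  = torch.randn(rank, rank, k, k)   # hada_t1
w1a = torch.randn(rank, out_dim)      # hada_w1_a (transposed vs. the non-Tucker case)
w1b = torch.randn(rank, in_dim)       # hada_w1_b
t2  = torch.randn(rank, rank, k, k)   # hada_t2
w2a = torch.randn(rank, out_dim)      # hada_w2_a
w2b = torch.randn(rank, in_dim)       # hada_w2_b
alpha = 8.0

# Rebuild each factor as a full conv kernel, then take the elementwise
# (Hadamard) product, as in the non-Tucker case.
m1 = torch.einsum('i j k l, j r, i p -> p r k l', t1, w1b, w1a)
m2 = torch.einsum('i j k l, j r, i p -> p r k l', t2, w2b, w2a)
delta_w = (alpha / rank) * m1 * m2
print(delta_w.shape)  # torch.Size([320, 320, 3, 3])
```

The plain hada_w1_a/hada_w1_b matrices without a hada_t1 would presumably keep going through the existing matrix-product path.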
Can confirm, same here.
These lohas should work now: https://github.com/comfyanonymous/ComfyUI/commit/dd095efc2c172000a840c7ca54c8a01458f6b2a3
I, again, can confirm. This looks really good. I have tried two LoHas so far and did not encounter a problem. I might try it with SD 2.1 later today.
Thank you for investing your time. That's just awesome. Right now, I don't see many features your UI lacks compared to auto's :)
I see I really need to dig deeper into these matters and learn Python. I hope you are fine with me taking a look at your implementation code and comparing it with my (failed) experiments on this.
As confirmation, I dare to add three images I just created with a LoHa (maybe I have overtrained it a bit in the meantime, or selected a bad model for the last one):

BTW, it looks like it doesn't support sparse bias or diff yet? (diff is the plain W'; I use it for extracting conv_in and conv_out, and will use it to train conv_in/conv_out in the future.)
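For clarity: a diff entry stores the full weight delta directly, so applying it should just be a strength-scaled addition. A minimal sketch, with illustrative names (not the actual loader code):

```python
import torch

# Minimal sketch: a "diff" entry is the full weight delta W' itself,
# so applying it is a plain, strength-scaled add. Names are illustrative.
conv_in_weight = torch.randn(320, 4, 3, 3)   # base model conv_in weight
diff = torch.randn_like(conv_in_weight)      # "diff" tensor from the file
strength = 1.0
conv_in_weight += strength * diff
```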
Can you link me one that uses "sparse bias or diff"?