FSet89
I tried to use `load_lora_weights`, but the weights seem to be ignored. How can I set the LoRA scale?
This seems to work; I'm not sure if `set_adapters` is necessary. EDIT: the output image does not seem to be affected by the LoRA.
```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    torch_dtype=torch.bfloat16,
    device_map="balanced",
)
pipe.load_lora_weights(lora_folder_path, ...
```
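For reference, here is a minimal sketch of the two ways I understand the scale can be applied with diffusers' PEFT-backed LoRA support: setting an adapter weight via `set_adapters`, or passing `joint_attention_kwargs={"scale": ...}` at call time (FluxPipeline reads the LoRA scale from that key). The LoRA path, adapter name, prompt, and scale values below are placeholders, not from this thread:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    torch_dtype=torch.bfloat16,
    device_map="balanced",
)

# Load the LoRA under an explicit adapter name (placeholder path/name).
pipe.load_lora_weights("path/to/lora", adapter_name="my_lora")

# Option 1: set the adapter weight (scale) globally for the pipeline.
pipe.set_adapters(["my_lora"], adapter_weights=[0.8])

# Option 2: pass the scale per call; it is picked up from
# joint_attention_kwargs["scale"] when PEFT LoRA layers are active.
image = pipe(
    "a photo of a cat",
    num_inference_steps=4,
    guidance_scale=0.0,
    joint_attention_kwargs={"scale": 0.8},
).images[0]
```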
No, I switched to [LLaVA](https://github.com/haotian-liu/LLaVA), where I didn't encounter the issue. I still hope they fix it, though.
Yes, I'm using that repo until this problem is identified and fixed.
Any update on this?