Fix "No LoRA weight found for module" error with some LoRAs
What does this PR do?
When using some LoRAs, for example `sd_xl_offset_example-lora_1.0.safetensors`, with blockwise scales, we get this error:

```
RuntimeError: No LoRA weight found for module down_blocks.0.resnets.0.conv1
```

This PR makes the lookup fall back to the block-level weight (scale) when the exact layer is not found, enabling these LoRAs to be used with blockwise scales.
e.g.,

```python
offset_scales = {
    "unet": {
        "down": {"block_1": [0.0, 0.5], "block_2": [1.0, 1.0]},
        "mid": 1.0,
        "up": {"block_0": [1.0, 1.0, 1.0], "block_1": [1.0, 1.0, 1.0]},
    },
    "text_encoder": 1.0,
    "text_encoder_2": 1.0,
}
```
- If `down_blocks.0.resnets.0.conv1` is not found, returns the scale of `1.0` as a default.
- If `down_blocks.1.resnets.0.conv1` is not found, returns the scale of `0.0`.
- If `down_blocks.1.resnets.1.conv1` is not found, returns the scale of `0.5`.
- If `down_blocks.2.resnets.0.conv1` is not found, returns the scale of `1.0`.
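The fallback behavior above can be sketched roughly as follows. This is an illustrative standalone sketch, not the actual diffusers implementation; the function name `get_block_scale` and its parsing of module names are assumptions for demonstration only.

```python
def get_block_scale(scales, module_name, default=1.0):
    """Return the blockwise scale for a UNet module name, falling back to
    the block-level scale (or the default) when no exact entry exists.

    Hypothetical sketch: names and structure are illustrative only.
    """
    parts = module_name.split(".")  # e.g. ["down_blocks", "1", "resnets", "0", "conv1"]
    if parts[0] == "mid_block":
        return scales["unet"]["mid"]
    if parts[0] == "down_blocks":
        blocks = scales["unet"]["down"]
    elif parts[0] == "up_blocks":
        blocks = scales["unet"]["up"]
    else:
        return default
    block = blocks.get(f"block_{parts[1]}")
    if block is None:
        # Block not listed in the scale dict: fall back to the default scale
        # instead of raising "No LoRA weight found for module ...".
        return default
    if isinstance(block, list):
        # Per-layer scales within the block; fall back to default if the
        # layer index is missing or not parseable.
        try:
            return block[int(parts[3])]
        except (IndexError, ValueError):
            return default
    return block


offset_scales = {
    "unet": {
        "down": {"block_1": [0.0, 0.5], "block_2": [1.0, 1.0]},
        "mid": 1.0,
        "up": {"block_0": [1.0, 1.0, 1.0], "block_1": [1.0, 1.0, 1.0]},
    },
    "text_encoder": 1.0,
    "text_encoder_2": 1.0,
}
```

With this sketch, `get_block_scale(offset_scales, "down_blocks.0.resnets.0.conv1")` returns the default `1.0` because `block_0` is absent from the `down` dict, matching the fallback described above.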
Fixes #7871
Who can review?
@sayakpaul @yiyixuxu @DN6
Sure, I just noticed that I can make it better, so I'll add the test and fix it a bit more.
@sayakpaul is this test enough?
Thanks much @asomoza!