[patch_pipe] UnboundLocalError: local variable '_tmp' referenced before assignment
I'm trying to run this in a Kaggle notebook.
File /opt/conda/lib/python3.10/site-packages/lora_diffusion/lora.py:1012, in patch_pipe(pipe, maybe_unet_path, token, r, patch_unet, patch_text, patch_ti, idempotent_token, unet_target_replace_module, text_target_replace_module)
   1010 elif maybe_unet_path.endswith(".safetensors"):
   1011     safeloras = safe_open(maybe_unet_path, framework="pt", device="cpu")
-> 1012     monkeypatch_or_replace_safeloras(pipe, safeloras)
   1013     tok_dict = parse_safeloras_embeds(safeloras)
   1014     if patch_ti:

File /opt/conda/lib/python3.10/site-packages/lora_diffusion/lora.py:809, in monkeypatch_or_replace_safeloras(models, safeloras)
    806     print(f"No model provided for {name}, contained in Lora")
    807     continue
--> 809 monkeypatch_or_replace_lora_extended(model, lora, target, ranks)

File /opt/conda/lib/python3.10/site-packages/lora_diffusion/lora.py:784, in monkeypatch_or_replace_lora_extended(model, loras, target_replace_module, r)
    781     _tmp.conv.bias = bias
    783 # switch the module
--> 784 _module._modules[name] = _tmp
    786 up_weight = loras.pop(0)
    787 down_weight = loras.pop(0)

UnboundLocalError: local variable '_tmp' referenced before assignment
This is the call that triggers it:
patch_pipe(
    pipe,
    os.getcwd() + "/lora/example_loras/lora_illust.safetensors",
    patch_text=True,
    patch_ti=True,
    patch_unet=True,
)
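Here pipe is an ordinary diffusers pipeline; the exact setup isn't important, but for reference it was created roughly like this (the base model id and dtype below are just examples, not the actual values used):

import os
import torch
from diffusers import StableDiffusionPipeline
from lora_diffusion import patch_pipe

# example base checkpoint; substitute whatever model the notebook actually loads
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")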
If you can modify the source code, that may be enough to fix it.
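The error happens because newer diffusers versions wrap the attention projections in LoRACompatibleLinear (a subclass of nn.Linear), and the branch of monkeypatch_or_replace_lora_extended that creates _tmp only matches the classes it expects, so nothing is assigned before the module swap at line 784. A rough sketch of the kind of edit that might work, assuming your installed lora.py compares classes exactly (check your own copy first; line numbers will differ):

# lora_diffusion/lora.py, inside monkeypatch_or_replace_lora_extended (sketch only)

# before: exact-class checks that a LoRACompatibleLinear instance never passes,
# so neither branch assigns _tmp
if (_child_module.__class__ == nn.Linear) or (
    _child_module.__class__ == LoraInjectedLinear
):
    ...

# after: accept subclasses such as diffusers' LoRACompatibleLinear as well;
# the nn.Conv2d / LoraInjectedConv2d branch can be relaxed the same way if needed
if isinstance(_child_module, (nn.Linear, LoraInjectedLinear)):
    ...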
If not, downgrade diffusers to v0.16.0, where the Attention module still uses plain nn.Linear rather than the LoRACompatibleLinear wrapper that causes the issue above.
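In a Kaggle notebook the pin can go in a cell that runs before diffusers is imported (restart the kernel if it was already imported):

!pip install diffusers==0.16.0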