justin-prnd

4 comments of justin-prnd

If you can modify the source code, [this](https://github.com/cloneofsimo/lora/issues/259#issuecomment-1733075835) might be helpful. If not, downgrade diffusers to v0.16.0, [where the `Attention` module uses `nn.Linear`](https://github.com/huggingface/diffusers/blob/6ba0efb9a188b08f5b46565a87c0b3da7ff46af4/src/diffusers/models/attention_processor.py#L34-L118) rather than the...
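A quick way to confirm which class your installed diffusers actually uses (a minimal sketch; `query_dim=320` is an arbitrary value chosen just for the check):

```python3
import diffusers
from diffusers.models.attention_processor import Attention

# Build a bare Attention block and inspect its query projection layer.
attn = Attention(query_dim=320)
print(diffusers.__version__)
print(type(attn.to_q).__name__)  # "Linear" on v0.16.0, "LoRACompatibleLinear" on newer releases
```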

> query = attn.to_q(hidden_states, scale=scale)

[The latest diffusers adopts the `LoRACompatibleLinear` module rather than `nn.Linear`.](https://github.com/huggingface/diffusers/blob/48664d62b8e9f70d03b1be4059c1464a3b167f85/src/diffusers/models/attention_processor.py#L140) The forward method of `LoRACompatibleLinear`, however, [has an additional kwarg `scale`](https://github.com/huggingface/diffusers/blob/48664d62b8e9f70d03b1be4059c1464a3b167f85/src/diffusers/models/lora.py#L229-L235) which is not included in...
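If you must stay on a newer diffusers, one way around the signature mismatch is a thin wrapper that tolerates the extra kwarg (a sketch; `ScaleTolerantLinear` is a hypothetical name, not a diffusers class):

```python3
import torch.nn as nn

class ScaleTolerantLinear(nn.Linear):
    """Hypothetical shim: an nn.Linear whose forward also accepts the
    `scale` kwarg that newer attention processors pass through."""

    def forward(self, hidden_states, scale: float = 1.0):
        # In LoRACompatibleLinear, `scale` only weights the LoRA branch;
        # a plain linear layer has no such branch, so it is safely ignored.
        return super().forward(hidden_states)
```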

**Modification for inference**

After training, I faced a similar issue at inference time. The `patch_pipe` method would fail with the following error:

```
... line 784, in monkeypatch_or_replace_lora_extended
    _module._modules[name] = _tmp...
```
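One workaround that sidesteps the module-type mismatch (a hypothetical sketch, not the exact fix from the linked issue): swap every `LoRACompatibleLinear` back to a plain `nn.Linear` with the same parameters before patching, so the older LoRA tooling sees the module type it expects.

```python3
import torch.nn as nn
from diffusers.models.lora import LoRACompatibleLinear

def downgrade_lora_linears(root: nn.Module) -> None:
    """Hypothetical helper: replace each LoRACompatibleLinear child with a
    plain nn.Linear sharing the same weight/bias Parameters, mimicking the
    module layout of diffusers v0.16.0."""
    for module in root.modules():
        for name, child in module.named_children():
            if type(child) is LoRACompatibleLinear:
                linear = nn.Linear(child.in_features, child.out_features,
                                   bias=child.bias is not None)
                linear.weight = child.weight  # reuse Parameters, keeps device/dtype
                if child.bias is not None:
                    linear.bias = child.bias
                module._modules[name] = linear
```

Running this on the UNet (and text encoder, if patched) before calling `patch_pipe` avoided the error for me.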

> @justin-prnd where is the LoRACompatibleLinear defined? I get module not defined error?

Defined at https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/lora.py#L181 and can be imported as

```python3
from diffusers.models.lora import LoRACompatibleLinear
```

if `diffusers >=...
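If the same code has to run across diffusers versions, a guarded import keeps older installs working (a sketch; the fallback alias is my assumption, reasonable because `LoRACompatibleLinear` subclasses `nn.Linear`):

```python3
import torch.nn as nn

try:
    from diffusers.models.lora import LoRACompatibleLinear
except ImportError:
    # Assumed fallback for older diffusers releases that predate the class:
    # aliasing to nn.Linear keeps isinstance() checks working either way.
    LoRACompatibleLinear = nn.Linear
```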