
Padding LoRA ranks to enable merging

AI-Casanova opened this issue 3 years ago · 1 comment

I'm quite shaky on the math, but would it be possible to pad a lower-rank LoRA (with zeros, or something else) so it can be merged with a higher-rank one?

I've been playing with SVD distillation at 256 rank, and might want to merge in a trained LoRA, but don't have the time or resources to fully train a high rank LoRA.
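Zero-padding should work in principle: rows of the down matrix and columns of the up matrix that are all zeros contribute nothing to the product, so the delta weight is unchanged as long as the effective scale `alpha / rank` is kept constant. A rough numpy sketch of that idea (the `pad_lora` helper and its signature are illustrative, not part of sd-scripts):

```python
import numpy as np

def pad_lora(down, up, alpha, target_rank):
    """Zero-pad LoRA factors from rank r to target_rank.

    down: (r, in_dim), up: (out_dim, r). Hypothetical helper, not sd-scripts API.
    """
    r = down.shape[0]
    assert up.shape[1] == r and target_rank >= r
    down_p = np.zeros((target_rank, down.shape[1]), dtype=down.dtype)
    up_p = np.zeros((up.shape[0], target_rank), dtype=up.dtype)
    down_p[:r] = down
    up_p[:, :r] = up
    # rank changes, so rescale alpha to keep alpha / rank unchanged
    new_alpha = alpha * target_rank / r
    return down_p, up_p, new_alpha

# check: the delta weight is preserved after padding
rng = np.random.default_rng(0)
down = rng.standard_normal((8, 16)).astype(np.float32)
up = rng.standard_normal((32, 8)).astype(np.float32)
d_p, u_p, a_p = pad_lora(down, up, alpha=8.0, target_rank=256)
orig = (8.0 / 8) * (up @ down)
padded = (a_p / 256) * (u_p @ d_p)
assert np.allclose(orig, padded, atol=1e-5)
```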

AI-Casanova avatar Jan 27 '23 16:01 AI-Casanova

I am not familiar with the mathematical theory, but I think the same formula used for merging LoRA weights into the model weights could be applied. I will investigate.
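For reference, the merge-into-model formula collapses the low-rank factors into a dense weight delta, so once each LoRA is merged into the base weights their ranks no longer need to match. A minimal sketch of that formula (function name is illustrative, not sd-scripts code):

```python
import numpy as np

def merge_lora_into_weight(weight, down, up, alpha, multiplier=1.0):
    """Collapse a LoRA into the base weight:
    W' = W + multiplier * (alpha / rank) * up @ down
    Works for any rank, since the result is a full dense matrix.
    """
    rank = down.shape[0]
    return weight + multiplier * (alpha / rank) * (up @ down)

rng = np.random.default_rng(1)
W = rng.standard_normal((32, 16)).astype(np.float32)
down = rng.standard_normal((4, 16)).astype(np.float32)
up = rng.standard_normal((32, 4)).astype(np.float32)

W2 = merge_lora_into_weight(W, down, up, alpha=4.0)
assert W2.shape == W.shape
```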

kohya-ss avatar Jan 30 '23 03:01 kohya-ss