[feat] add `load_lora_adapter()` for compatible models
What does this PR do?
Similar to load_attn_procs(), this PR adds a method for loading LoRAs directly into compatible models, since the LoRA loading logic is generic across them.
This way, we reduce the LoC and get better maintainability. I am not too fixated on the load_lora_adapter() name; load_adapter() would also work.
@DN6 as discussed via Slack, could you give this a check? We could also add a save_lora_adapter() method to complement it.
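To illustrate what the "generic" LoRA loading logic boils down to, here is a minimal, self-contained sketch of merging a low-rank adapter into a base weight, i.e. W_eff = W + (alpha / r) * B @ A. All names here (apply_lora, matmul) are hypothetical illustrations, not the actual diffusers API proposed in this PR.

```python
# Hypothetical sketch of the low-rank update a LoRA loader applies:
# W_eff = W + (alpha / rank) * (B @ A). Pure Python, toy-sized matrices.

def matmul(a, b):
    """Multiply two matrices given as nested lists."""
    return [
        [sum(a[i][k] * b[k][j] for k in range(len(b))) for j in range(len(b[0]))]
        for i in range(len(a))
    ]

def apply_lora(weight, lora_A, lora_B, alpha, rank):
    """Return weight + (alpha / rank) * (B @ A), the merged LoRA weight."""
    scale = alpha / rank
    delta = matmul(lora_B, lora_A)  # out_features x in_features
    return [
        [w + scale * d for w, d in zip(w_row, d_row)]
        for w_row, d_row in zip(weight, delta)
    ]

# Toy example: 2x2 base weight, rank-1 adapter.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 1.0]]           # rank x in_features
B = [[1.0], [2.0]]         # out_features x rank
merged = apply_lora(W, A, B, alpha=2.0, rank=1)
print(merged)  # → [[3.0, 2.0], [4.0, 5.0]]
```

The actual method would additionally handle state-dict key remapping and device/dtype placement, but the core update is just this scaled low-rank addition.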
@DN6 LMK what you think of the latest changes.
Additionally, what do you think about the save_lora_adapter() method? We can do it in another PR, LMK.
@BenjaminBossan could you give this a look too?
Ran the Flux integration tests and they pass. Failing tests are unrelated.