Fix custom_diffusion cross-attention weight loading when PEFT is installed (issue #7261)
What does this PR do?
This PR fixes the loading of cross-attention weights in custom diffusion models when PEFT is installed. The bug is discussed in issue #7261.
Fixes #7261
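For reviewers, here is a minimal sketch of the call path affected by the bug, loading custom diffusion attention processors into the UNet. The checkpoint path is a hypothetical placeholder; the weight file name follows what the custom diffusion training example saves.

```python
import torch
from diffusers import DiffusionPipeline

# Load a base Stable Diffusion pipeline.
pipeline = DiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16
).to("cuda")

# Load the trained custom diffusion cross-attention weights into the UNet.
# With PEFT installed, this call did not load the weights correctly;
# this PR routes it back through the custom diffusion code path.
pipeline.unet.load_attn_procs(
    "path/to/custom-diffusion-checkpoint",  # hypothetical placeholder path
    weight_name="pytorch_custom_diffusion_weights.bin",
)
```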
Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline?
- [ ] Did you read our philosophy doc (important for complex PRs)?
- [x] Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
- [ ] Did you write any new necessary tests?
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @sayakpaul
LGTM! @yiyixuxu WDYT?
Hey Saya, I need a review of the new commit; I reformatted this file. @sayakpaul
It's Sayak not "Saya" 😅 Will trigger the CI now.
I'm sorry about that. 😂
What is happening with this commit? Is there anything I should do to help it pass the CI?
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Can you update the quality dependencies and then run `make style` again?