Rbrq03

Results: 5 issues by Rbrq03

### Describe the bug

If you have PEFT installed in your environment, custom_diffusion will not successfully load the cross-attention parameters, leading to poor generation results. Given the time...

bug

# What does this PR do?

This PR fixes the loading of cross-attention weights in custom diffusion models when PEFT is installed. This bug has been discussed in issue #7261...
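The core of the issue can be sketched in isolation: when PEFT is importable, a loader may route attention weights through a PEFT-specific branch, so custom-diffusion processors need an explicit guard so their cross-attention weights are not silently dropped. A minimal, self-contained sketch, using hypothetical helper names (this is not the actual diffusers code):

```python
import importlib.util


def is_peft_available() -> bool:
    # True when the `peft` package can be imported in this environment.
    return importlib.util.find_spec("peft") is not None


def load_attn_weights(state_dict: dict, is_custom_diffusion: bool) -> str:
    # Hypothetical loader illustrating the guard: the PEFT branch must be
    # skipped for custom-diffusion processors, otherwise their
    # cross-attention weights never reach the model.
    if is_peft_available() and not is_custom_diffusion:
        return "peft-path"
    return "custom-diffusion-path"
```

With the guard in place, a custom-diffusion checkpoint takes the custom-diffusion path whether or not PEFT happens to be installed, which is the behavior the fix restores.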

Thank you for your exceptional work. LaMa now includes a refinement option, as detailed in the paper 'Feature Refinement to Improve High-Resolution Image Inpainting.' This update introduces enhanced refinement features...

Hey there, congrats on the great work! My question: in the paper, you mention the fusion in Section 2.2, where

> we use 1x1 convolution and standard upsampling operation (e.g., bilinear/bicubic...
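The quoted fusion step can be sketched numerically: bilinearly upsample the low-resolution feature map, mix its channels with a 1x1 convolution (a per-pixel channel matmul), and add the result to the high-resolution features. A NumPy sketch under those assumptions (function names are mine, not the paper's):

```python
import numpy as np


def conv1x1(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    # 1x1 convolution = per-pixel channel mixing.
    # x: (C_in, H, W), w: (C_out, C_in) -> (C_out, H, W)
    return np.einsum("oc,chw->ohw", w, x)


def upsample_bilinear(x: np.ndarray, scale: int = 2) -> np.ndarray:
    # Bilinear upsampling of (C, H, W) by an integer factor,
    # with half-pixel (align_corners=False style) sampling.
    c, h, w = x.shape
    ys = np.clip((np.arange(h * scale) + 0.5) / scale - 0.5, 0, h - 1)
    xs = np.clip((np.arange(w * scale) + 0.5) / scale - 0.5, 0, w - 1)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[None, :, None]
    wx = xs - x0
    top = x[:, y0][:, :, x0] * (1 - wx) + x[:, y0][:, :, x1] * wx
    bot = x[:, y1][:, :, x0] * (1 - wx) + x[:, y1][:, :, x1] * wx
    return top * (1 - wy) + bot * wy


def fuse(low: np.ndarray, high: np.ndarray, w: np.ndarray) -> np.ndarray:
    # Upsample low-res features to high-res size, mix channels with a
    # 1x1 conv, then add the high-res (skip) features.
    up = upsample_bilinear(low, scale=high.shape[1] // low.shape[1])
    return conv1x1(up, w) + high
```

For example, fusing a (4, 8, 8) map into a (2, 16, 16) map needs a (2, 4) weight matrix; bilinear interpolation preserves constant inputs, so the sketch is easy to sanity-check by hand.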

Hey there, congrats on the excellent work! Can anyone tell me what the batch size should be if I want to reproduce the training results in the paper? It seems that...