Rbrq03

Results: 16 comments of Rbrq03

Combining them seems like a good idea. Could you share some images generated by this piece of code? I am not sure what it looks like.

Hey Saya, I need a review for the new commit; I reformatted this file. @sayakpaul

I am sorry for that. 😂 What happened to this commit? Should I do something to help it pass the CI?

I opened a PR to fix this problem. In my local test, the cross-attention weights can be loaded successfully.

@daeunni you can try one of these methods: 1. simply uninstall PEFT if you don't use LoRA. 2. modify the code in diffusers/loaders/unets.py as my PR does. I hope it will...
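
If it helps, here is a minimal, hypothetical check (not part of my PR) to confirm whether PEFT is actually installed in your environment before choosing between the two options above:

```python
import importlib.util

# If PEFT is present but you don't use LoRA, option 1 (uninstalling it)
# should sidestep the loading conflict.
peft_installed = importlib.util.find_spec("peft") is not None
print("PEFT installed:", peft_installed)
```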

I think the answer is yes; you can use lite.yaml as your config, which may use the model online. You do not need to deploy or download any model. You can copy all...

Everyone, presently, this issue can be effectively addressed through the following steps: `pip uninstall diffusers` and then `pip install diffusers==0.18.1`. It works for me; hope it helps.
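
As a quick sanity check (just a suggestion, not part of the fix itself), you can confirm the pinned version is the one actually being imported:

```python
import diffusers

# After the reinstall, this should print 0.18.1; if it doesn't, an older
# or newer install may still be shadowing the pinned one.
print(diffusers.__version__)
```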

It appears that the pretrained weights may not have been loaded correctly. You can try checking the `pretrained_model_path` key in the config. Additionally, there might be an issue with the...
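
For example, here is a minimal sketch of that check, assuming your config is a YAML file (the name `config.yaml` is just a placeholder) and that `pretrained_model_path` should point to a local directory:

```python
import os
import yaml  # pip install pyyaml

# Hypothetical config path; adjust to the config file you actually use.
with open("config.yaml") as f:
    cfg = yaml.safe_load(f)

path = cfg.get("pretrained_model_path")
print("pretrained_model_path:", path)
print("exists on disk:", bool(path) and os.path.isdir(path))
```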