Vishnu V Jaddipal
> @harishprabhala Awesome!
>
> I have tested the small version; the generated result looks a little blurry and cannot match the clarity of the original,...
We're drawing up a technical paper for this, and we plan on releasing the distillation code afterwards. SSD-1B is our rendition of a distilled SDXL model.
The problem here is that the Hugging Face save method saves the originally loaded UNet config, which does not represent the actual state of the UNet model once the model has been...
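A stdlib-only sketch of the pitfall described above: a save method that writes the originally loaded config rather than the pruned model's real shape. `Model` here is a hypothetical stand-in, not actual diffusers code.

```python
# Sketch of a save method that serializes a stale config (assumption:
# "Model" is a stand-in for a shortened U-Net, not real library code).
import json

class Model:
    def __init__(self, config, blocks):
        self.config = dict(config)
        self.blocks = list(blocks)

    def save_config(self):
        # Serializes whatever config the model currently holds.
        return json.dumps(self.config)

model = Model({"num_blocks": 4}, ["b0", "b1", "b2", "b3"])

# Distillation shortens the U-Net, but the loaded config still claims 4 blocks.
model.blocks = model.blocks[:2]
stale = json.loads(model.save_config())["num_blocks"]  # still 4

# Fix: sync the config to the pruned architecture before saving.
model.config["num_blocks"] = len(model.blocks)
synced = json.loads(model.save_config())["num_blocks"]  # now 2
```

The same idea in diffusers terms would be updating the model's registered config to match the shortened architecture before calling the save method.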
Update: I've added extra config params to the distill_training script. You can turn off the U-Net shortening step by not passing `prepare_unet`. I am working on solving the config replacement...
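A hypothetical sketch of how an opt-in flag like the one described above could behave; the flag name comes from the comment, but the real distill_training script's interface may differ.

```python
# Illustrative argparse setup (assumption: flag semantics inferred from
# the comment; not the actual distill_training argument parser).
import argparse

parser = argparse.ArgumentParser()
# U-Net shortening runs only when --prepare_unet is passed explicitly;
# omitting the flag leaves the full U-Net intact.
parser.add_argument("--prepare_unet", action="store_true", default=False)

off = parser.parse_args([])                    # flag omitted: shortening off
on = parser.parse_args(["--prepare_unet"])     # flag passed: shortening on
```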
I can second the above comment.
Will do!

On Sun, Jul 7, 2024 at 09:15 Sayak Paul ***@***.***> wrote:

> @Gothos can you follow
> https://github.com/huggingface/diffusers/actions/runs/9824367283/job/27123254814?pr=8769#step:6:1
> to resolve the code quality failures?
>
> —...
Should be done now, ran `make style && make quality` and `make fix-copies`.