
Loading a DreamBooth fine-tuned SD checkpoint from local disk

javismiles opened this issue 3 years ago · 3 comments

Hi team, I'm running many experiments fine-tuning SD with DreamBooth, and I want to load the models directly from my local disk. I'm using the code from https://github.com/nateraw/stable-diffusion-videos.

So I download the entire SD 1.5 repo by doing:

```
git lfs install
git clone https://huggingface.co/runwayml/stable-diffusion-v1-5
```

and then I replace the checkpoint there with my fine-tuned one and try to run:

```python
pipeline = StableDiffusionWalkPipeline.from_pretrained(
    "/home/projects/ai/pr71/sd/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
    revision="fp16",
).to("cuda")
```

I get no errors, but the model being applied is still the standard 1.5, not my DreamBooth fine-tuned one.

What's the easiest way to keep using StableDiffusionWalkPipeline while loading the SD 1.5 checkpoint from my local disk?

thank you very much

javismiles avatar Jan 16 '23 00:01 javismiles

I just found that your code is not loading a .ckpt; instead it loads .bin files, like `diffusion_pytorch_model.bin`.

So how can I convert my .ckpt DreamBooth file to your .bin format?

That way I could replace your .bin files with mine.

thank you

javismiles avatar Jan 16 '23 00:01 javismiles

I found the solution; for others who may be looking for the same, here it is:

a) download this script: https://raw.githubusercontent.com/huggingface/diffusers/main/scripts/convert_original_stable_diffusion_to_diffusers.py

b) put your dreambooth fine tuned model in a new folder

c) then run the script like this:

```
python conv.py --checkpoint_path ./dreamboothmodel.ckpt --dump_path .
```

The script will create a folder structure with everything necessary to run the model in diffusers

d) then you can load that model with:

```python
pipeline = StableDiffusionWalkPipeline.from_pretrained(
    "/local-path-to-the-folder-you-just-created",
    torch_dtype=torch.float16,
    revision="fp16",
).to("cuda")
```
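Before loading, it can help to sanity-check that the conversion script actually produced a diffusers-style model directory. This is a minimal sketch, not part of the original recipe; the subfolder names below assume an SD 1.5-style pipeline and may differ for other pipelines:

```python
from pathlib import Path

# Subfolders the conversion script typically emits for an SD 1.5 checkpoint
# (assumption: may vary for other pipelines or diffusers versions).
EXPECTED = ("unet", "vae", "text_encoder", "tokenizer", "scheduler")

def looks_like_diffusers_dir(root):
    """Return True if `root` resembles a converted diffusers model folder."""
    root = Path(root)
    # model_index.json is what from_pretrained reads to assemble the pipeline.
    if not (root / "model_index.json").is_file():
        return False
    return all((root / sub).is_dir() for sub in EXPECTED)
```

If this returns False, the conversion likely failed or `--dump_path` pointed somewhere else.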

and voila, it works :)

javismiles avatar Jan 16 '23 01:01 javismiles

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

github-actions[bot] avatar Feb 15 '23 15:02 github-actions[bot]