
Load safetensors directly to CUDA

Open · Daniel-Kelvich opened this issue 2 years ago · 0 comments

As far as I know, there is currently no way to load a model from a safetensors file directly to CUDA; you always have to load it to CPU first. The safetensors library supports loading directly to CUDA, so it shouldn't be hard to add this functionality to diffusers pipelines.
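For reference, safetensors already exposes a `device` argument on its loading function, so the underlying capability exists. A minimal sketch (the file path is only illustrative):

```python
import torch
from safetensors.torch import load_file

# safetensors can materialize tensors directly on a CUDA device,
# skipping the intermediate CPU copy.
state_dict = load_file(
    "unet/diffusion_pytorch_model.safetensors", device="cuda:0"
)

# Every tensor in the returned dict now lives on the GPU.
assert all(t.device.type == "cuda" for t in state_dict.values())
```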

The interface could look like this (just specify the device in the init call): `pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16, device='cuda:0')`
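For comparison, a rough sketch of today's workaround versus the proposed call; the `device` argument is the suggestion, not an existing parameter, and the model id is only an example:

```python
import torch
from diffusers import StableDiffusionPipeline

model_id = "runwayml/stable-diffusion-v1-5"

# Today: weights are loaded on CPU, then the whole pipeline is moved to CUDA.
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda:0")

# Proposed: pass the target device up front so safetensors can load straight to it.
# (hypothetical argument, not currently supported)
# pipe = StableDiffusionPipeline.from_pretrained(
#     model_id, torch_dtype=torch.float16, device="cuda:0"
# )
```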

Daniel-Kelvich · Feb 21 '23