diffusers
Load safetensors directly to cuda
As far as I know, there is currently no way to load a model from a safetensors file directly to CUDA; you always have to load it to CPU first. The safetensors library itself supports loading directly to CUDA, so it shouldn't be hard to add this functionality to diffusers pipelines.
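For reference, a minimal sketch of what the safetensors library already allows, assuming a local checkpoint file named model.safetensors (the filename here is just an illustration):

from safetensors.torch import load_file

# load_file takes a device argument, so the tensors go straight to the GPU
# instead of being materialized in CPU RAM first
state_dict = load_file("model.safetensors", device="cuda:0")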
The interface could look like this (just specify the device in the init function):
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16, device='cuda:0')