
4090 run flux load_lora_weights error?

Open lonngxiang opened this issue 1 year ago • 3 comments

Running `python run.py config/whatever_you_want.yml` works, but when I load the LoRA weights with the following code, the process gets killed (see attached image):


from diffusers import AutoPipelineForText2Image
import torch

# Load the base FLUX.1-dev model from a local checkout in bf16.
pipeline = AutoPipelineForText2Image.from_pretrained("/ai/FLUX.1-dev", torch_dtype=torch.bfloat16)
pipeline.enable_model_cpu_offload()

# Load the LoRA trained with ai-toolkit, then run inference.
pipeline.load_lora_weights('/ai/ai-toolkit/output/my_first_flux_lora_v1', weight_name='my_first_flux_lora_v1_000001000.safetensors')
image = pipeline('a Yarn art style tarot card').images[0]

lonngxiang · Sep 05 '24, 01:09

@lonngxiang I think even with CPU offloading, 24GB of VRAM wouldn't be enough for inference here without hitting CUDA OOM.

Roughly: 16.5B parameters in total, minus ~4.5B for the text encoder, leaves ~12B parameters; at 2 bytes each (bf16) that's ~24GB. And OOM usually hits at around 95% utilization (~22.8GB), I think.
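As a quick back-of-envelope check of that estimate (a sketch only, using the parameter counts quoted above and 2 bytes per bf16 weight):

total_params_b = 16.5          # rough total parameter count, in billions
text_encoder_params_b = 4.5    # rough text-encoder share, in billions
bytes_per_param = 2            # bf16 stores 2 bytes per parameter

transformer_gb = (total_params_b - text_encoder_params_b) * bytes_per_param
print(f"~{transformer_gb:.0f} GB for the remaining weights")   # ~24 GB
print(f"95% of a 24 GB card is {24 * 0.95:.1f} GB")            # ~22.8 GB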

Just a guess here.
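If VRAM really is the limit, one thing worth trying (just a sketch reusing the paths from the issue, not something I've verified on a 4090) is enable_sequential_cpu_offload(), which offloads at the submodule level and keeps peak VRAM much lower than enable_model_cpu_offload(), at the cost of slower inference:

from diffusers import AutoPipelineForText2Image
import torch

pipeline = AutoPipelineForText2Image.from_pretrained(
    "/ai/FLUX.1-dev", torch_dtype=torch.bfloat16
)
# Submodule-level offload: slower than enable_model_cpu_offload(),
# but peak VRAM stays far lower.
pipeline.enable_sequential_cpu_offload()

pipeline.load_lora_weights(
    "/ai/ai-toolkit/output/my_first_flux_lora_v1",
    weight_name="my_first_flux_lora_v1_000001000.safetensors",
)
image = pipeline("a Yarn art style tarot card").images[0]
image.save("out.png")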

jlonge4 · Sep 06 '24, 22:09

Maybe you can try smaller width and height params (such as 512) in the pipeline.
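For example (a sketch reusing the pipeline from the issue body):

image = pipeline(
    'a Yarn art style tarot card',
    height=512,  # FLUX.1-dev defaults to 1024x1024; a smaller size cuts activation memory
    width=512,
).images[0]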

wotulong · Sep 12 '24, 14:09