DiffSynth-Studio
OOM when LoRA training Wan2.2-I2V-A14B on 2× H20
Hi all, I’m hitting out-of-memory (OOM) errors when LoRA-finetuning Wan2.2-I2V-A14B with the script lora/Wan2.2-I2V-A14B.sh on 2× 80 GB H20 GPUs.
Is this expected? The documentation states that a single 80 GB GPU should be sufficient for LoRA training, so I’m wondering whether I’m missing something.
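For context, here is the rough back-of-the-envelope arithmetic behind the "one 80 GB card is enough for LoRA" expectation. All the numbers below (LoRA fraction, dtypes, Adam state) are illustrative assumptions, not measured values from this repo, and the estimate deliberately ignores activations, which are often what actually triggers the OOM:

```python
def lora_memory_estimate_gib(
    n_params_base: float = 14e9,   # Wan2.2-I2V-A14B: ~14B parameters (assumption)
    base_bytes: int = 2,           # frozen base weights in bf16
    lora_fraction: float = 0.01,   # trainable LoRA params as a fraction of base (assumption)
    trainable_bytes: int = 2,      # LoRA weights in bf16
    grad_bytes: int = 2,           # gradients kept only for trainable params
    optim_bytes: int = 8,          # Adam: two fp32 moments per trainable param
) -> float:
    """Rough GPU memory (GiB) for LoRA finetuning, ignoring activations."""
    frozen = n_params_base * base_bytes
    n_lora = n_params_base * lora_fraction
    trainable = n_lora * (trainable_bytes + grad_bytes + optim_bytes)
    return (frozen + trainable) / 2**30

print(f"{lora_memory_estimate_gib():.1f} GiB")  # well under 80 GiB before activations
```

Under these assumptions the weights-plus-optimizer footprint is under 30 GiB, which is why activation memory (video resolution, number of frames, gradient checkpointing on/off) is usually the variable that decides whether 80 GB is enough.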