
Loss Not Decreasing When Fine-Tuning CogVideoX Models with Default Settings

Open 123lcy123 opened this issue 9 months ago • 1 comment

Hello, I fine-tuned CogVideoX-2B with LoRA on my own dataset of around 10k samples. The learning rate was set to 1e-4 and the LoRA rank to 256, but the loss kept fluctuating around 0.1 and did not decrease. I also tried SFT fine-tuning of CogVideoX-5B and ran into the same issue. All other parameters were kept the same as in the original script. Has anyone else experienced this? Are there any settings I could try adjusting?
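
For reference, the setup described above corresponds roughly to a LoRA configuration like the following. This is only a minimal sketch using the peft library; the target modules, alpha, and dropout values are assumptions, and the actual CogVideoX fine-tuning script may differ.

```python
# Minimal sketch of the LoRA setup described above (peft library).
# target_modules, lora_alpha, and lora_dropout are assumptions; the
# CogVideoX fine-tuning script may attach LoRA to different layers.
from peft import LoraConfig

lora_config = LoraConfig(
    r=256,                  # LoRA rank used in this report
    lora_alpha=256,         # assumption: alpha set equal to the rank
    lora_dropout=0.0,       # assumption
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],  # assumption: attention projections
)

# transformer.add_adapter(lora_config)  # attach to the CogVideoX transformer (diffusers PeftAdapterMixin)
```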

123lcy123 · Apr 25 '25

The batch_size is 32.

123lcy123 · Apr 25 '25