HappyYang
During training, this message appears: 'Gradient overflow. Skipping step, loss scaler 0 reducing loss scale to'
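This message typically comes from dynamic loss scaling in mixed-precision training (e.g. NVIDIA Apex or `torch.cuda.amp`): when a scaled gradient overflows to inf/NaN, the optimizer step is skipped and the loss scale is reduced. Occasional messages are harmless; if the scale keeps collapsing toward zero, the learning rate is likely too high. A minimal sketch of the skip-and-backoff logic (the function and names below are illustrative, not the library's API):

```python
import math

def step_with_dynamic_scaling(grads, loss_scale, backoff=0.5):
    """Return (stepped, new_loss_scale).

    If any gradient is inf/NaN, skip the optimizer step and
    back off the loss scale; otherwise keep the scale as-is.
    """
    if any(math.isinf(g) or math.isnan(g) for g in grads):
        return False, loss_scale * backoff  # skipped step, reduced scale
    return True, loss_scale

# An overflowing gradient at scale 65536.0 skips the step
# and halves the scale to 32768.0:
stepped, scale = step_with_dynamic_scaling([float("inf")], 65536.0)
```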
How can I train SDXL with my own dataset? Can you give me some help?
If I set accumulation-steps > 1 and amp_status = 'O1', an error occurs: _norm 0.2449 (0.2449) loss_scale 65536.0000 (65536.0000) mem 3750MB Traceback (most recent call last): File "main_simmim_pt.py", line...
Can I use multiple GPUs for training? I can only find the single-GPU training Python file.
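If the training script is standard PyTorch, a single-GPU script can often be run on multiple GPUs by wrapping the model in `DistributedDataParallel` and launching one process per GPU with `torchrun`. A launch sketch only; the script name and its flags below are placeholders, not this repo's actual interface:

```shell
# Launch 4 worker processes, one per GPU, on a single machine.
# "train.py" and "--config" are hypothetical; substitute the repo's
# real entry point and arguments.
torchrun --nproc_per_node=4 train.py --config configs/example.yaml
```

Inside the script, each process would read its device from the `LOCAL_RANK` environment variable that `torchrun` sets, call `torch.distributed.init_process_group`, and shard the data with `DistributedSampler`.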
Can you give me a *.py file for extracting frames from a video?
Will the model weights be released?
I want to know why the target is normalized to [-1, 1] while the source is normalized to [0, 1]. Why the difference?
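One common reason, which may or may not be this repo's: the target is what the network must reproduce, and generative models often use a tanh-bounded output (or a diffusion formulation) whose natural range is [-1, 1], while the source is only a conditioning input and can stay in the loader's native [0, 1] range. A sketch of the two conventions, assuming images are loaded as floats in [0, 1]:

```python
import numpy as np

# A loaded image with pixel values in [0, 1] (shape and values illustrative).
img = np.random.rand(64, 64, 3).astype(np.float32)

source = img              # conditioning input: kept in [0, 1]
target = img * 2.0 - 1.0  # supervision target: rescaled to [-1, 1]

def to_image(t):
    """Map a [-1, 1] array back to [0, 1] for saving or visualization."""
    return (t + 1.0) / 2.0
```

The two ranges are interchangeable via this affine map; what matters is that training and inference apply the same convention consistently on each branch.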
After I click, it doesn't work. Do you know about this problem?