Can we do full training with 14b i2v model ?
Thank you very much for your framework; it is very user-friendly. Regarding training for wanx: is it possible to do full-parameter training of the 14B I2V model? My understanding is that the original 14B model can definitely be trained with full parameters at an input size of 81×720×1280, right?
@lith0613 To be honest, the current solution for fine-tuning the full 14B T2V model is already at its limit. The 14B I2V model requires slightly more GPU memory, which we cannot currently accommodate. Our tensor parallelism framework is still under development, and we will continue to optimize this feature.
Will the i2v LoRA training code be provided?
@zhuochen02 i2v LoRA training is provided in the same script. You only need to change the model path.
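To illustrate, a minimal sketch of the path swap. The script name `train_lora.py` and the `--model_path` flag are hypothetical placeholders (check the repository's actual training script for the real interface); the model ids are the Hugging Face names for the Wan2.1 14B checkpoints:

```shell
# Hypothetical commands: train_lora.py and --model_path are placeholders,
# not the repository's real interface. The point is that only the model
# path differs between the T2V and I2V LoRA runs.
T2V_CMD='python train_lora.py --model_path Wan-AI/Wan2.1-T2V-14B'
I2V_CMD='python train_lora.py --model_path Wan-AI/Wan2.1-I2V-14B-720P'
echo "T2V LoRA: ${T2V_CMD}"
echo "I2V LoRA: ${I2V_CMD}"
```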