ColossalAI
[FEATURE]: LoRA with sharded model
Describe the feature
Hi, when training a big model like llama2-70b with LoRA, it runs into OOM because the model is not sharded.
It would help a lot if LoRA were supported with GeminiPlugin or HybridParallelPlugin. Is there any plan to support that?
Hi, the HybridParallelPlugin already supports the LoRA strategy. We would appreciate your feedback.
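For reference, here is a minimal sketch of what LoRA fine-tuning with HybridParallelPlugin might look like. It assumes the `Booster.enable_lora` API available in recent ColossalAI releases and a `peft`-style `LoraConfig`; the exact signatures, parallelism settings, and model name are illustrative assumptions and may differ between versions.

```python
import torch
from peft import LoraConfig  # assumption: peft supplies the LoRA config
from transformers import LlamaForCausalLM

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# Launch the distributed environment (argument style varies by release).
colossalai.launch_from_torch(config={})

# Shard the model across tensor parallelism plus ZeRO stage 1.
plugin = HybridParallelPlugin(tp_size=2, pp_size=1, zero_stage=1, precision="bf16")
booster = Booster(plugin=plugin)

model = LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-70b-hf")
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
)

# Attach LoRA adapters before boosting so the plugin shards the wrapped model.
model = booster.enable_lora(model, lora_config=lora_config)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model, optimizer, *_ = booster.boost(model, optimizer)
```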