MiniCPM-V
fp16 or bf16 in LoRA fine-tuning ?
Thanks for your great work! When LoRA fine-tuning MiniCPM-V-2_6, should I use fp16 or bf16? The default setting in `finetune_lora.sh` is fp16, but the trained models you provide are in bf16, so which is better for LoRA fine-tuning?
Hi, if your machine supports bf16, you should choose bf16. This is just my personal opinion.
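For what it's worth, bf16 generally needs an Ampere-or-newer NVIDIA GPU. A minimal sketch of how you might check support and switch the precision flags; the `--fp16`/`--bf16` flag names are an assumption, so verify them against your copy of `finetune_lora.sh`:

```shell
# Check whether the current GPU supports bf16 (requires PyTorch with CUDA).
python -c "import torch; print(torch.cuda.is_bf16_supported())"

# Hypothetical excerpt from finetune_lora.sh: if the check prints True,
# flip the precision flags passed to the training entry point.
# (Flag names are assumptions; check the actual script.)
#   --fp16 false \
#   --bf16 true \
```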