
ValueError: Attempting to unscale FP16 gradients when fully fine-tuning an LLM without LoRA?

Open dragen1860 opened this issue 1 year ago • 2 comments

Dear all: when I disable LoRA and set all Mistral LLM weights to requires_grad=True, it gives me the error:

ValueError: Attempting to unscale FP16 gradients

Can anyone give some tips?

dragen1860 avatar Jul 03 '24 03:07 dragen1860
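For reference (an editorial note, not part of the original thread): this error is raised by PyTorch's AMP GradScaler when trainable parameters are themselves FP16. It commonly appears when the model is loaded with torch_dtype=torch.float16 and then fully fine-tuned with fp16 mixed precision enabled, since the scaler expects FP32 master weights to unscale into. LoRA tends to hide the problem because the FP16 base weights stay frozen and only the (FP32) adapter weights receive gradients. A minimal sketch of the working pattern, using a toy linear model in place of the actual LLM:

```python
import torch


def full_finetune_step(model, optimizer, scaler, inputs, targets):
    """One mixed-precision training step with FP32 master weights."""
    device_type = "cuda" if torch.cuda.is_available() else "cpu"
    # CUDA autocast uses float16; CPU autocast supports bfloat16.
    amp_dtype = torch.float16 if device_type == "cuda" else torch.bfloat16
    with torch.autocast(device_type=device_type, dtype=amp_dtype):
        preds = model(inputs)  # activations in half precision
    # Compute the loss in FP32 for numerical stability.
    loss = torch.nn.functional.mse_loss(preds.float(), targets)
    scaler.scale(loss).backward()
    scaler.step(optimizer)  # unscaling happens here; FP16 params would raise
    scaler.update()
    optimizer.zero_grad()
    return loss.item()


# Keep the trainable weights in FP32. Loading the model with
# torch_dtype=torch.float16 and then training all parameters is what
# triggers "ValueError: Attempting to unscale FP16 gradients".
model = torch.nn.Linear(8, 1)  # parameters default to torch.float32
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())

loss = full_finetune_step(model, optimizer, scaler,
                          torch.randn(4, 8), torch.randn(4, 1))
```

In short: drop torch_dtype=torch.float16 (or use bfloat16, which needs no GradScaler) when every parameter is trainable, and let mixed precision handle the half-precision compute.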

I'm hitting the same problem.

Qnancy avatar Aug 20 '24 14:08 Qnancy

Have you solved it?

Qnancy avatar Aug 20 '24 14:08 Qnancy