
[Bug] Weight key issue when using LoRA fine-tuning (already fixed)

Open · Lillianwei-h opened this issue 1 year ago

Checklist

  • [X] 1. I have searched related issues but cannot get the expected help.
  • [X] 2. The bug has not been fixed in the latest version.
  • [X] 3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.

Describe the bug

Issue

Due to the use of PEFT, the key names of the weights saved after LoRA training are inconsistent with the original checkpoint: `language_model` becomes `language_model.base_model.model`.

Fix

Before saving the weights at the end of training, I call `model.language_model = model.language_model.merge_and_unload()` and everything looks fine. I hope you can add this in a future update~
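To make the mismatch concrete, here is a minimal, self-contained sketch of what the fix resolves. PEFT wraps the LoRA-tuned submodule, so saved state-dict keys gain a `base_model.model.` prefix relative to the original checkpoint; `merge_and_unload()` folds the LoRA deltas into the base weights and removes the wrapper, restoring the original keys. The function and key names below are illustrative, not part of the InternVL codebase:

```python
def strip_peft_prefix(state_dict, prefix="language_model."):
    """Map PEFT-wrapped key names back to the original naming scheme.

    Illustrative only: merge_and_unload() is the proper fix, since it
    also merges the LoRA deltas into the base weights; this sketch only
    shows the key renaming that the wrapper introduces.
    """
    wrapped = prefix + "base_model.model."
    fixed = {}
    for key, value in state_dict.items():
        if key.startswith(wrapped):
            # "language_model.base_model.model.X" -> "language_model.X"
            fixed[prefix + key[len(wrapped):]] = value
        else:
            fixed[key] = value
    return fixed

# Hypothetical keys as saved after LoRA training:
peft_keys = {
    "language_model.base_model.model.embed_tokens.weight": 0,
    "vision_model.encoder.layers.0.attn.qkv.weight": 1,
}
print(list(strip_peft_prefix(peft_keys)))
# → ['language_model.embed_tokens.weight',
#    'vision_model.encoder.layers.0.attn.qkv.weight']
```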

Reproduction

Already fixed

Environment

Already fixed

Error traceback

No response

Lillianwei-h avatar Sep 20 '24 01:09 Lillianwei-h

We appreciate you bringing this issue to our attention. We will conduct a thorough investigation and provide an update as soon as possible. Should we identify a bug, we will implement the necessary code changes. Thank you for your continued support.

qishisuren123 avatar Sep 21 '24 08:09 qishisuren123

In addition, peft==0.4.0 does not have this problem.

yinglang avatar Dec 06 '24 04:12 yinglang

Hi, since there hasn't been any recent activity on this issue, I'll be closing it for now. If it's still an active concern, don't hesitate to reopen it. Thanks for your understanding!

czczup avatar Dec 09 '24 11:12 czczup