
Llama Adapter finetuning parameters

LCTPalmer opened this issue 10 months ago · 2 comments

Hi

Thank you so much for opening your data and code; it is very useful. I am trying to implement your method as a baseline, but I want to be sure of the training hyperparameters you used. The exps/finetune.sh script includes some hard-coded hyperparameters (the same as in the original llama-adapter repo). Are you using these for the results in your paper? And if so, are you training on the full training set in one go, or using some other strategy? Any help appreciated.
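
For concreteness, here is a minimal sketch of the kind of hard-coded values I mean. The parameter names and numbers below are illustrative assumptions on my part, not the actual contents of exps/finetune.sh:

```python
# Illustrative sketch only: these names and values are assumptions,
# not the actual contents of exps/finetune.sh in the DriveLM repo.
finetune_hparams = {
    "batch_size": 4,       # per-GPU batch size
    "epochs": 4,           # total finetuning epochs
    "warmup_epochs": 1,    # linear learning-rate warmup
    "blr": 9e-3,           # base learning rate before batch-size scaling
    "weight_decay": 0.02,  # AdamW weight decay
}

# MAE-style codebases (which llama-adapter builds on) typically scale
# the base LR by the effective batch size: lr = blr * total_batch / 256.
num_gpus = 8  # hypothetical GPU count
effective_lr = (finetune_hparams["blr"]
                * finetune_hparams["batch_size"] * num_gpus / 256)
print(f"effective lr: {effective_lr:.2e}")
```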

Best regards,
Luke

LCTPalmer · Mar 17 '25

In the paper we did not provide experiments with llama-adapter on DriveLM. The llama-adapter code is provided as starter code for the DriveLM challenge. We use the full training set to train DriveLM-Agent; more details (training strategies) are in the paper.

ChonghaoSima · Mar 24 '25

Thanks for your response. In appendix section F.5 and Table 13 of the paper, you mention training Llama-Adapter V2 and report results. I was hoping to benchmark my method against yours locally on the planning metric, so it would be great to have those hyperparameters (I am adding some logic to llama-adapter in the hope of a general improvement). Alternatively, is there an entry on the leaderboard for vanilla Llama-Adapter V2 that I could reference?
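
To be concrete about what I would compute locally, here is a minimal sketch of an L2-style planning metric (average displacement between predicted and ground-truth waypoints). This assumes (T, 2) arrays of BEV waypoints and is my assumption about the metric's form, not the official DriveLM evaluation code:

```python
import numpy as np

def planning_l2(pred_traj: np.ndarray, gt_traj: np.ndarray) -> float:
    """Average L2 distance between predicted and ground-truth waypoints.

    Both inputs are assumed to be (T, 2) arrays of BEV (x, y) waypoints
    over the same horizon; this mirrors a UniAD-style L2 metric but may
    not match the official DriveLM evaluation exactly.
    """
    assert pred_traj.shape == gt_traj.shape, "trajectories must align"
    return float(np.linalg.norm(pred_traj - gt_traj, axis=1).mean())

# Example: 6 future waypoints at 0.5 s intervals (hypothetical values).
pred = np.array([[0.0, 1.0], [0.1, 2.1], [0.2, 3.0],
                 [0.3, 4.2], [0.4, 5.1], [0.5, 6.3]])
gt   = np.array([[0.0, 1.0], [0.0, 2.0], [0.0, 3.0],
                 [0.0, 4.0], [0.0, 5.0], [0.0, 6.0]])
print(f"avg L2: {planning_l2(pred, gt):.3f} m")
```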

Best regards,
Luke

LCTPalmer · Mar 25 '25