The hyper-parameters of transfer learning
Thanks for the outstanding work; transfer learning is a very useful setup in this paper. But the performance in the table seems to require appropriate hyperparameters. Could you release the hyperparameters used in the transfer learning setting? Thanks again for sharing.
Hi @jiaruHithub, thanks for your interest. The hyperparameters used in transfer learning are exactly the default parameters in pretraining.py and finetune.py in this directory. Alternatively, you can just use the checkpoint we provided.
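For reference, loading a provided checkpoint typically looks like the minimal sketch below. The filename and the placeholder architecture are hypothetical; in practice, instantiate the model exactly as finetune.py builds it so the state-dict keys match.

```python
import torch
import torch.nn as nn

# Placeholder architecture for illustration only; replace with the model
# constructed in finetune.py so the checkpoint keys line up.
model = nn.Sequential(nn.Linear(16, 16))

# "pretrained.pth" is a hypothetical filename; use the checkpoint file
# released with the repo.
state = torch.load("pretrained.pth", map_location="cpu")
model.load_state_dict(state, strict=False)  # strict=False tolerates a mismatched head
model.eval()
```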
Thanks, I am now using the pretrained model you provided, but there are many transfer datasets in the paper. Are the hyperparameters the same across all of them? For example, are all the hyperparameters in finetune.py identical when transferring to bbbp and bace?
These datasets share most hyperparameters. Specifically, we use learning-rate decay for bace and early stopping for muv/hiv; both operations are already implemented in our code. All other hyperparameters are the same across all datasets. We will update the documentation soon.
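In case it helps, here is a minimal sketch of what those two per-dataset tweaks usually look like in PyTorch. This is not the repo's exact code; the schedule (step_size, gamma) and the patience value are illustrative assumptions, and the model, data, and validation loss are toy stand-ins.

```python
import torch
import torch.nn as nn

# Toy stand-in for the finetuning model; the real one comes from finetune.py.
model = nn.Linear(16, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Learning-rate decay (used for bace): halve the LR every 10 epochs.
# step_size and gamma here are guesses, not the paper's values.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

best_val, patience, bad_epochs = float("inf"), 5, 0  # patience=5 is an assumption
for epoch in range(100):
    # --- training step on dummy data; replace with the real dataloader ---
    x, y = torch.randn(32, 16), torch.randn(32, 1)
    loss = nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()

    # --- early stopping (used for muv/hiv): stop when val loss stalls ---
    val_loss = loss.item()  # stand-in; compute this on a held-out validation set
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"early stopping at epoch {epoch}")
            break
```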
Thanks a lot!