
[FastPitch1.1/pytorch] learning rate / warmup based on the data size?

Open dsplog opened this issue 3 years ago • 1 comment

Describe the bug: This is not a bug, but I am trying to understand the training setup.

To Reproduce Steps to reproduce the behavior:

If the training data size is larger, e.g. including LibriTTS clean, should we adjust parameters like `: ${WARMUP_STEPS:=1000}`, `: ${KL_LOSS_WARMUP:=100}`, `: ${LEARNING_RATE:=0.1}`?
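A back-of-the-envelope way to think about this question (the function name and the dataset sizes below are mine, purely illustrative, not from the repository): warmup is counted in optimizer steps, so with a larger dataset the same `WARMUP_STEPS` covers a smaller fraction of the first epoch unless it is raised.

```python
# Hypothetical sketch: how much of an epoch the warmup phase spans,
# given dataset size, batch size, and WARMUP_STEPS.
def warmup_fraction_of_epoch(num_samples: int, batch_size: int, warmup_steps: int) -> float:
    steps_per_epoch = -(-num_samples // batch_size)  # ceiling division
    return warmup_steps / steps_per_epoch

# ~13k utterances (LJSpeech-sized) vs. a much larger LibriTTS-clean-sized
# set (the 150k figure is illustrative only):
print(warmup_fraction_of_epoch(13_100, 16, 1000))   # warmup spans more than 1 epoch
print(warmup_fraction_of_epoch(150_000, 16, 1000))  # warmup ends early in epoch 1
```

Whether that fraction actually needs to stay constant when scaling up the data is exactly the open question here.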

dsplog avatar May 31 '22 02:05 dsplog

I am not sure, but I think the warmup steps are just meant to find a better "draft" of the model weights; there is no need to see every sample in the dataset during this process.
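For context, a minimal sketch of a Noam-style warmup schedule (a common choice for transformer-style TTS models; this is not necessarily the repository's exact code): the learning-rate scale ramps up linearly for `warmup_steps` optimizer steps and then decays as `step**-0.5`, which is why warmup length is counted in steps rather than in epochs or samples seen.

```python
# Noam-style LR scale: linear ramp to 1.0 at step == warmup_steps,
# then inverse-square-root decay. Assumed schedule, for illustration.
def lr_scale(step: int, warmup_steps: int = 1000) -> float:
    step = max(step, 1)  # avoid division by zero at step 0
    return warmup_steps ** 0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)

print(round(lr_scale(500), 3))   # mid-warmup: 0.5
print(round(lr_scale(1000), 3))  # peak at warmup_steps: 1.0
print(round(lr_scale(4000), 3))  # decayed after warmup
```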

But what confuses me is that in inference.py there is a warmup setting, "warmup iterations before measuring performance". Is that required? What would happen if we did not use it?
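That inference-side warmup is a different thing from training warmup: it is a benchmarking convention. A rough illustration (not inference.py itself; the function below is hypothetical) of why timed runs discard the first few iterations: early calls often pay one-time costs (CUDA context creation, cuDNN autotuning, memory-pool growth) that would skew the measured latency.

```python
import time

# Hypothetical benchmark helper: run `warmup_iters` untimed iterations,
# then report mean latency over `measure_iters` steady-state iterations.
def benchmark(fn, warmup_iters: int = 8, measure_iters: int = 32) -> float:
    for _ in range(warmup_iters):   # run but discard: absorbs one-time costs
        fn()
    start = time.perf_counter()
    for _ in range(measure_iters):
        fn()
    return (time.perf_counter() - start) / measure_iters
```

Skipping the warmup iterations does not affect correctness of the generated audio; it only inflates the reported latency/RTF numbers, since the slow first iterations get averaged in.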

JohnHerry avatar May 31 '22 03:05 JohnHerry