DeepLearningExamples
[FastPitch1.1/pytorch] Should the learning rate / warm-up be scaled with the data size?
Describe the bug
This is not a bug report; I am just trying to understand the training setup.

To Reproduce
Steps to reproduce the behavior:

If the training dataset is larger, e.g. when including LibriTTS clean, should we adjust parameters such as `: ${WARMUP_STEPS:=1000}`, `: ${KL_LOSS_WARMUP:=100}`, and `: ${LEARNING_RATE:=0.1}`?
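For reference, a common way these two parameters interact is a Noam-style schedule: the learning rate ramps up linearly for `WARMUP_STEPS` and then decays as the inverse square root of the step. This is a sketch of that idea using the defaults quoted above; the exact formula in the FastPitch training scripts may differ.

```python
def warmup_lr(step, base_lr=0.1, warmup_steps=1000):
    """Noam-style schedule: linear warmup, then inverse-sqrt decay.

    base_lr and warmup_steps mirror the LEARNING_RATE and WARMUP_STEPS
    defaults quoted above; hypothetical helper, not code from the repo.
    """
    step = max(step, 1)
    # Scale peaks at 1.0 exactly when step == warmup_steps.
    scale = min(step ** -0.5, step * warmup_steps ** -1.5) * warmup_steps ** 0.5
    return base_lr * scale
```

Under this kind of schedule the warmup length is counted in optimizer steps, not epochs, which is one reason it does not obviously need to grow with the dataset: a larger dataset mostly means more steps per epoch, not a different loss landscape at initialization.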
I am not sure, but I think the warmup steps are just meant to find a better "draft" of the model weights, so there should be no need to see every sample in the dataset during warmup.
What confuses me is that inference.py also has a warmup setting ("warmup iterations before measuring performance"). Is that required? What would happen if we skipped it?
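To clarify the second question: inference warmup is unrelated to learning-rate warmup. It only runs the model a few times untimed so that one-time costs (CUDA context creation, cuDNN autotuner kernel selection, memory-pool growth) do not inflate the reported latency; skipping it makes the first measured iterations look much slower than steady state. A minimal sketch of the pattern (a hypothetical helper, not the code in inference.py):

```python
import time

def benchmark(fn, warmup_iters=5, measure_iters=20):
    """Time fn after a few untimed warmup calls.

    The warmup calls absorb one-time costs (lazy initialization,
    autotuning, allocator growth) so the average reflects
    steady-state performance.
    """
    for _ in range(warmup_iters):
        fn()
    start = time.perf_counter()
    for _ in range(measure_iters):
        fn()
    return (time.perf_counter() - start) / measure_iters
```

Without the warmup loop the measurement still works, but the average latency is biased upward by startup overhead, so published numbers would look worse than the model's real throughput.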