MetricGAN
Batch Size = 1
Hi, I am curious about the batch size. It is always set to 1 across the different implementations. How long does it take to train for 600 epochs? Why not use a much larger batch size? Training takes a lot of time.
Hi,
Because different utterances have different lengths, using a batch size > 1 requires padding the inputs. We also tried a batch size > 1 in a PyTorch version, but the performance degraded (more experiments are needed to verify this). Typically, training can be completed within 48 hours.
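To illustrate the padding the reply refers to, here is a minimal sketch (hypothetical, not the repo's actual code): each utterance in a batch is zero-padded to the length of the longest one, and a mask records which samples are real so that padded positions can be excluded from the loss.

```python
# Hypothetical illustration of batching variable-length utterances:
# pad every sequence to the batch's maximum length and build a mask
# marking real samples (1.0) vs. padding (0.0).

def pad_batch(utterances, pad_value=0.0):
    """Pad a list of variable-length sequences to a common length.

    Returns (padded, mask), where mask[i][t] is 1.0 for a real sample
    and 0.0 for a padded position.
    """
    max_len = max(len(u) for u in utterances)
    padded = [list(u) + [pad_value] * (max_len - len(u)) for u in utterances]
    mask = [[1.0] * len(u) + [0.0] * (max_len - len(u)) for u in utterances]
    return padded, mask

batch = [[0.1, 0.2, 0.3], [0.5, 0.6]]
padded, mask = pad_batch(batch)
# padded -> [[0.1, 0.2, 0.3], [0.5, 0.6, 0.0]]
# mask   -> [[1.0, 1.0, 1.0], [1.0, 1.0, 0.0]]
```

In PyTorch, `torch.nn.utils.rnn.pad_sequence` does the same padding step for tensors; the subtle part is making sure the padded positions do not contribute to metric estimation or the loss, which may be one source of the degradation mentioned above.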