descript-audio-codec

Are commit_loss and codebook_loss always equal?

Open wl3b10s opened this issue 2 years ago • 2 comments

When I try to retrain the DAC model, I find that commit_loss and codebook_loss are always equal at every iteration.

Is it correct? Code location: `quantize.py`, class `VectorQuantize`:

```python
commitment_loss = F.mse_loss(z_e, z_q.detach(), reduction="none").mean([1, 2])
codebook_loss = F.mse_loss(z_q, z_e.detach(), reduction="none").mean([1, 2])
print('commitment_loss/codebook_loss:', commitment_loss, codebook_loss)
```

e.g.:

```
commitment_loss/codebook_loss: tensor([1.6723, 2.0803, 1.9611, 1.8907], device='cuda:0', grad_fn=<MeanBackward1>) tensor([1.6723, 2.0803, 1.9611, 1.8907], device='cuda:0', grad_fn=<MeanBackward1>)
```

wl3b10s avatar Oct 12 '23 00:10 wl3b10s
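[Editor's note] A minimal standalone sketch, in plain PyTorch with random tensors standing in for real encoder outputs and codebook lookups (not the DAC training loop), of why the two printed values match while the gradients do not:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-ins for the encoder output and its quantized version (batch, dim, time).
z_e = torch.randn(4, 8, 16, requires_grad=True)
z_q = torch.randn(4, 8, 16, requires_grad=True)

# Same two lines as in quantize.py.
commitment_loss = F.mse_loss(z_e, z_q.detach(), reduction="none").mean([1, 2])
codebook_loss = F.mse_loss(z_q, z_e.detach(), reduction="none").mean([1, 2])

# Equal values: MSE is symmetric in its two arguments, and detach() changes
# gradient flow, not the computed number.
print(torch.allclose(commitment_loss, codebook_loss))  # True

# Different gradients: commitment_loss reaches only z_e, codebook_loss only z_q.
commitment_loss.sum().backward()
print(z_e.grad is not None, z_q.grad is None)  # True True

codebook_loss.sum().backward()
print(z_q.grad is not None)  # True
```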

Can anyone explain the difference between these two loss terms and the function of each?

barneymaydance avatar Oct 19 '23 00:10 barneymaydance

Yes, they always have the same value, since MSE is symmetric in its arguments and detach() only affects gradients, not the computed number. The difference is where the gradients flow: the commitment loss updates the encoder (pulling z_e toward the codebook vectors), while the codebook loss updates the codebook (pulling z_q toward the encoder outputs). You can also give them different loss weights to emphasize which is more important.

chenjiasheng avatar Dec 07 '23 16:12 chenjiasheng
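[Editor's note] On the weighting the reply mentions, here is a hedged sketch of combining the two terms with different weights. The `beta = 0.25` value is the classic VQ-VAE choice, used purely for illustration; check DAC's own config for its actual loss lambdas.

```python
import torch
import torch.nn.functional as F

z_e = torch.randn(4, 8, 16, requires_grad=True)  # encoder output (stand-in)
z_q = torch.randn(4, 8, 16, requires_grad=True)  # quantized vectors (stand-in)

commitment_loss = F.mse_loss(z_e, z_q.detach(), reduction="none").mean([1, 2])
codebook_loss = F.mse_loss(z_q, z_e.detach(), reduction="none").mean([1, 2])

# Classic VQ-VAE weighting: down-weight the commitment term (beta = 0.25) so
# the codebook moves toward the encoder faster than the encoder is pulled
# toward the codebook. Illustrative values, not verified DAC defaults.
beta = 0.25
total_vq_loss = (codebook_loss + beta * commitment_loss).mean()
total_vq_loss.backward()  # z_q receives the full-weight gradient, z_e the scaled one
```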