
NLL & Perplexity Loss

Open lethienhoa opened this issue 7 years ago • 2 comments

Hi, it seems that Perplexity is normalized twice, and the norm_term of NLLLoss should also be masked (i.e., padding tokens should be excluded from the count).
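To illustrate the concern, here is a minimal sketch (not code from the repo) of computing NLL and perplexity with normalization applied exactly once, where norm_term counts only non-padding tokens. The shapes, pad_id, and variable names are assumptions for the example:

```python
import torch
import torch.nn.functional as F

pad_id = 0
vocab = 10
logits = torch.randn(2, 5, vocab)          # (batch, seq_len, vocab)
targets = torch.randint(1, vocab, (2, 5))  # (batch, seq_len)
targets[0, 3:] = pad_id                    # simulate padded positions

log_probs = F.log_softmax(logits, dim=-1)
# Sum NLL over real tokens only; ignore_index masks the loss values
nll_sum = F.nll_loss(log_probs.view(-1, vocab), targets.view(-1),
                     ignore_index=pad_id, reduction='sum')
# norm_term must likewise count only non-pad tokens
norm_term = (targets != pad_id).sum().float()
mean_nll = nll_sum / norm_term
# Perplexity = exp(mean NLL); dividing by norm_term again here
# would normalize twice and understate the perplexity
perplexity = torch.exp(mean_nll)
```

If the perplexity were instead computed as `exp(mean_nll / norm_term)`, the normalization would be applied twice, which matches the double-normalization issue described above.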

lethienhoa avatar May 13 '18 15:05 lethienhoa

Is this issue still open? I checked the code and didn't see the problems mentioned. Is it fixed?

manujosephv avatar Dec 25 '19 05:12 manujosephv

@lethienhoa Yes, the NLLLoss norm_term needs to be updated. But I am also confused: why isn't the loss divided by norm_term before calling loss.backward()? https://github.com/IBM/pytorch-seq2seq/blob/f146087a9a271e9b50f46561e090324764b081fb/seq2seq/trainer/supervised_trainer.py#L63
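For what it's worth, the practical effect of skipping that division is a gradient scaled by norm_term, which changes the effective learning rate with sequence length. A toy sketch (assumed names, not repo code) showing the scale difference:

```python
import torch

# One scalar parameter standing in for model weights
w = torch.tensor(1.0, requires_grad=True)
norm_term = 4  # pretend we summed loss over 4 tokens

# Summed per-token losses, as if accumulated without normalization
per_token = (w * torch.ones(norm_term) - 2.0) ** 2
summed = per_token.sum()

# Backward on the raw sum: gradient is norm_term times larger
summed.backward(retain_graph=True)
grad_sum = w.grad.item()

w.grad = None
# Backward on the normalized loss: gradient is the per-token mean
(summed / norm_term).backward()
grad_mean = w.grad.item()

# grad_sum == norm_term * grad_mean, so the update magnitude
# silently grows with the number of (unmasked) tokens
```

So whether this is a bug or an intentional choice depends on whether the learning rate was tuned for the summed loss, but normalizing before backward() is the more common convention.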

woaksths avatar Sep 07 '20 17:09 woaksths