pytorch-seq2seq

Teacher forcing per timestep?

Open ghost opened this issue 5 years ago • 1 comment

Hi,

I don't understand why teacher forcing is being decided for the whole sequence. The definition of teacher forcing says that at each timestep, either the model's own predicted token or the ground-truth token from the previous timestep should be fed as the next input. The implementation here, on the other hand, first decides whether to generate the entire sequence with teacher forcing, and then decodes the whole sequence with teacher forcing fixed to True or False, which I believe is not correct.
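For reference, here is a minimal sketch of what a per-timestep decision would look like. The names (`decoder`, `trg`, `teacher_forcing_ratio`) and the decoder interface are illustrative assumptions, not the repo's actual code; the point is only that `use_teacher_forcing` is sampled inside the loop rather than once before it:

```python
import random
import torch

def decode_with_teacher_forcing(decoder, hidden, trg, teacher_forcing_ratio=0.5):
    # Illustrative decoder interface: (input_token, hidden) -> (logits, hidden).
    # trg: gold target sequence of shape [trg_len, batch_size], trg[0] is <sos>.
    trg_len, batch_size = trg.shape
    outputs = []
    input_token = trg[0]
    for t in range(1, trg_len):
        logits, hidden = decoder(input_token, hidden)
        outputs.append(logits)
        # Decide at every timestep: feed the ground-truth token or the model's prediction.
        use_teacher_forcing = random.random() < teacher_forcing_ratio
        input_token = trg[t] if use_teacher_forcing else logits.argmax(dim=-1)
    return torch.stack(outputs)
```

The implementation discussed here instead samples this flag once, before the decoding loop, so a single coin flip determines the input source for every timestep of the sequence.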

I really appreciate the feedback on this issue, Thanks!

ghost avatar May 13 '20 20:05 ghost


Yeah, I am working with RNNs and ran into the same problem, and it also appears in the PyTorch tutorial example (https://github.com/pytorch/tutorials/blob/master/intermediate_source/seq2seq_translation_tutorial.py#L558). I think it is a mistake.

Sushentsev avatar Jun 20 '21 19:06 Sushentsev