DWBC
Why are the +1.0 and -1.0 added to the corr_losses?
Hi, first of all, thank you very much for your fruitful research and code. I have a question while following your great work!
In the code, the DWBC loss function is implemented as follows:
```python
corr_loss_e = -torch.sum(log_pi_e, 1) * (self.eta / (d_e_clip * (1.0 - d_e_clip)) + 1.0)
corr_loss_o = -torch.sum(log_pi_o, 1) * (1.0 / (1.0 - d_o_clip) - 1.0)
```
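For reference, here is a minimal self-contained sketch of the two terms from the snippet above, using dummy values. The value of `eta`, the batch shapes, and the clipped discriminator outputs `d_e_clip` / `d_o_clip` are placeholders I made up for illustration; in the repository they come from the policy and the trained discriminator.

```python
import torch

# Placeholder hyperparameter and inputs (not the values used in the paper/code):
eta = 0.5
log_pi_e = torch.full((2, 3), -1.0)   # dummy per-dim log-probs on expert batch
log_pi_o = torch.full((2, 3), -1.0)   # dummy per-dim log-probs on offline batch
d_e_clip = torch.tensor([0.8, 0.6])   # dummy discriminator outputs on expert data
d_o_clip = torch.tensor([0.4, 0.2])   # dummy discriminator outputs on offline data

# Weighted BC losses exactly as written in the question,
# including the +1.0 and -1.0 offsets being asked about:
corr_loss_e = -torch.sum(log_pi_e, 1) * (eta / (d_e_clip * (1.0 - d_e_clip)) + 1.0)
corr_loss_o = -torch.sum(log_pi_o, 1) * (1.0 / (1.0 - d_o_clip) - 1.0)
```

Note that with these dummy values the offsets matter: without the `+1.0`, `corr_loss_e` would lose a plain behavioral-cloning term `-torch.sum(log_pi_e, 1)`, and without the `-1.0`, `corr_loss_o` would gain one.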
I cannot understand why the +1.0 and -1.0 terms are added. Could you please explain them in more detail?