InfoGraph

About negative loss

Open zhikaili opened this issue 4 years ago • 2 comments

Hi,

I find that as the training goes (beyond 20 epochs), the loss will gradually become negative. May I ask if this is harmful to downstream tasks?

Thank you!

zhikaili avatar Mar 23 '21 11:03 zhikaili
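For context, InfoGraph's unsupervised objective follows Deep InfoMax and uses a Jensen-Shannon mutual-information estimator (a discriminator-style loss), not a cross-entropy, so it has no zero lower bound: as the discriminator separates positive from negative pairs, the value drifts below zero. A minimal sketch of that kind of loss (illustrative only, not the repository's exact code):

```python
import torch
import torch.nn.functional as F

def jsd_loss(pos_scores: torch.Tensor, neg_scores: torch.Tensor) -> torch.Tensor:
    """Jensen-Shannon MI estimator written as a loss (lower is better)."""
    log2 = torch.log(torch.tensor(2.0))
    e_pos = (log2 - F.softplus(-pos_scores)).mean()               # expectation over positive pairs
    e_neg = (F.softplus(-neg_scores) + neg_scores - log2).mean()  # expectation over negative pairs
    return e_neg - e_pos

pos = torch.full((8,), 5.0)   # confident scores for positive (same-graph) pairs
neg = torch.full((8,), -5.0)  # confident scores for negative (cross-graph) pairs
print(jsd_loss(pos, neg))     # ≈ -1.37, near the lower bound of -2*log(2)
```

With well-separated scores the per-pair value approaches -2*log(2) ≈ -1.386, so a negative training loss on its own is expected behavior rather than a sign that something is wrong for downstream tasks.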

Me too! I want to know the reason and whether it is harmful to downstream tasks.

carrotYQ avatar Oct 17 '21 07:10 carrotYQ


I suspect this may be caused by structurally identical graphs appearing in the same batch, but I don't have time to verify it at the moment. If anyone does, please share the results. Thanks.

damengdameng avatar Aug 29 '22 10:08 damengdameng
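One way to test that hypothesis is to hash the structure of every graph in a batch and look for collisions. A minimal sketch using networkx's Weisfeiler-Lehman graph hash (the toy batch below stands in for a real dataloader; adapt it to however your graphs are stored):

```python
from collections import Counter
import networkx as nx

def duplicate_structures(graphs):
    """Return WL hashes that occur more than once among the given nx graphs."""
    hashes = [nx.weisfeiler_lehman_graph_hash(g) for g in graphs]
    return {h: count for h, count in Counter(hashes).items() if count > 1}

# Toy batch: two of the three graphs are isomorphic, so their hashes collide.
batch = [nx.cycle_graph(6), nx.path_graph(6), nx.cycle_graph(6)]
print(duplicate_structures(batch))
```

If duplicates do show up, some "negative" pairs in that batch are effectively positives, which is what the comment above suspects; note, though, that the JSD-style objective can go negative even without such collisions.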