
Loss becomes NaN with Adam and AdamW as base optimizers

Open FunnyJingl opened this issue 5 years ago • 0 comments

Loss becomes NaN after training for ~20 steps: the loss value steadily decreases and then turns into NaN with the Adam or AdamW optimizers. With plain SGD it works well.

FunnyJingl · Jul 31 '20 21:07