multi-task-learning-example

Log var can become negative and explode

Open snie2012 opened this issue 4 years ago • 1 comment

The loss function can be optimized in a way that keeps decreasing the log_var values, which is what I observe in my experiments. One simple solution is to apply torch.abs(log_var). Any thoughts on how this might affect the original derivation of the formulation?
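For reference, a minimal PyTorch sketch of how that workaround could look. It assumes the uncertainty-weighted loss has the usual form sum_i exp(-log_var_i) * loss_i + log_var_i and reads the suggestion as replacing the + log_var_i regulariser with + |log_var_i|; the class name and arguments below are illustrative, not code from this repository:

```python
import torch
import torch.nn as nn

class UncertaintyWeightedLoss(nn.Module):
    """Multi-task loss with one learnable log-variance per task:
    total = sum_i exp(-log_var_i) * loss_i + penalty(log_var_i)."""

    def __init__(self, num_tasks: int, abs_penalty: bool = True):
        super().__init__()
        # Learnable log-variances, initialised to zero (precision = 1).
        self.log_vars = nn.Parameter(torch.zeros(num_tasks))
        self.abs_penalty = abs_penalty

    def forward(self, task_losses):
        # task_losses: iterable of scalar per-task losses.
        losses = torch.stack(list(task_losses))
        precision = torch.exp(-self.log_vars)
        if self.abs_penalty:
            # Proposed workaround: |log_var| is bounded below by 0,
            # so the penalty cannot be pushed towards minus infinity.
            penalty = torch.abs(self.log_vars)
        else:
            # Original regulariser, which can become arbitrarily negative.
            penalty = self.log_vars
        return (precision * losses + penalty).sum()


# Hypothetical usage with two task losses (e.g. from two model heads):
# criterion = UncertaintyWeightedLoss(num_tasks=2)
# total = criterion([loss_task1, loss_task2])
# total.backward()
```

With the absolute value, the penalty term is minimised at log_var = 0 rather than at minus infinity, while the exp(-log_var) weighting of each task loss is unchanged; this does change the objective relative to the one the original derivation is based on.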

snie2012 avatar Feb 05 '21 18:02 snie2012

I have encountered the same situation during my training. Do you have any better solutions?

YangLeiSX avatar Mar 03 '22 08:03 YangLeiSX