Celine
Does it make sense to clip log_var at 0? Or to clip the loss at 0?
> @begeekmyfriend @joinssmith @celinew1221 @yaringal Do you have any progress? I ran into the same problem, and I don't think we can remove log_var, because it is a way of measuring uncertainty; if it is...
> @celinew1221 Thank you for your help. Would you mind telling me how to implement "clip log_var at 0"?

I'd just use `torch.clamp(log_var, min=0)`.
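For what it's worth, here is a minimal sketch of where that clamp could sit in a Kendall-and-Gal-style uncertainty-weighted multi-task loss. The class and variable names (`UncertaintyWeightedLoss`, `log_vars`, `task_losses`) are illustrative, not from the original code; the assumption is the usual weighting where each task loss is scaled by `exp(-log_var)` and `log_var` is added as a penalty.

```python
import torch
import torch.nn as nn

class UncertaintyWeightedLoss(nn.Module):
    """Sketch of an uncertainty-weighted multi-task loss with log_var clipped at 0.

    Assumes the Kendall & Gal style weighting: each task loss is scaled by
    exp(-log_var) and log_var is added as a penalty. Clamping log_var at 0
    keeps the penalty term non-negative, so the regularizer alone cannot
    drive the total loss below zero.
    """

    def __init__(self, num_tasks: int):
        super().__init__()
        # One learnable log-variance per task, initialized to 0.
        self.log_vars = nn.Parameter(torch.zeros(num_tasks))

    def forward(self, task_losses):
        # task_losses: iterable of scalar losses, one per task.
        log_vars = torch.clamp(self.log_vars, min=0)  # clip log_var at 0
        total = torch.zeros((), device=log_vars.device)
        for loss, log_var in zip(task_losses, log_vars):
            total = total + torch.exp(-log_var) * loss + log_var
        return total
```

Note that with `torch.clamp` the gradient w.r.t. `log_vars` is zero whenever the raw parameter is below 0, so the parameter simply stops decreasing once it hits the boundary.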
Well, that depends on the losses. It makes no sense to have a negative loss with a cross-entropy loss function, though. This is really a question for the original author...