Guillaume Habault

2 issues

- There is an error in the `posterior_variance` definition: it should be divided by `(1 - alpha_)`, otherwise the `(1 - alpha)` factors cancel each other out. `torch.min`, `torch.max`...
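The issue appears to refer to the standard DDPM posterior variance, which is beta_t * (1 - alphabar_{t-1}) / (1 - alphabar_t); without the division, the (1 - alphabar) factors would indeed cancel against later terms. A minimal pure-Python sketch of the corrected definition (the function name and plain-list schedule are illustrative, not taken from the issue):

```python
def posterior_variance(betas):
    """Closed-form DDPM posterior variance for each timestep t:

        beta_tilde_t = beta_t * (1 - alphabar_{t-1}) / (1 - alphabar_t)

    where alphabar_t is the cumulative product of alpha_t = 1 - beta_t.
    """
    alphas = [1.0 - b for b in betas]

    # Cumulative product of alphas: alphabar_t = prod_{s<=t} alpha_s
    alphas_cumprod = []
    acc = 1.0
    for a in alphas:
        acc *= a
        alphas_cumprod.append(acc)

    # alphabar_{t-1}, with the convention alphabar_{-1} = 1
    alphas_cumprod_prev = [1.0] + alphas_cumprod[:-1]

    # The division by (1 - alphabar_t) is the part the issue says is missing
    return [
        b * (1.0 - ap) / (1.0 - ac)
        for b, ap, ac in zip(betas, alphas_cumprod_prev, alphas_cumprod)
    ]


variances = posterior_variance([0.1, 0.2])
```

At t = 0 the variance is exactly zero because alphabar_{-1} = 1, which matches the usual DDPM convention.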

According to https://github.com/pytorch/pytorch/pull/97863, `torch._six` has been removed. I propose the following modification to avoid the error "module 'torch' has no attribute '_six'". This solution is also suggested in other projects:...
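The proposed modification itself is truncated above, but the fix commonly adopted in other projects is to guard the `torch._six` import and fall back to the modern location. A sketch of that pattern for the frequently used `inf` constant (the final `math.inf` fallback is only so this snippet runs where PyTorch is not installed; a real patch would use just the first two branches):

```python
try:
    # PyTorch < 2.0 exposed constants via the private torch._six module
    from torch._six import inf
except ImportError:
    try:
        # PyTorch >= 2.0 removed torch._six; inf lives on torch itself
        from torch import inf
    except ImportError:
        # Illustrative fallback so the sketch runs without PyTorch installed
        from math import inf
```

The same pattern applies to other `torch._six` names, e.g. `string_classes`, which after the removal is simply the built-in `str`.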