
3 issues in multi-task-learning-example-PyTorch

This isn't a bug report, but a doubt I would like to clarify: when I use the homoscedastic loss in my area of research, the loss values are negative...
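Negative values are expected with this kind of loss: in the uncertainty-weighted formulation of Kendall et al., each task loss is scaled by a learned precision and an additive log-variance term is appended, and that additive term can be negative. A minimal sketch (plain Python, using the common log-variance parameterization `s = log(sigma^2)`, which is an assumption about the exact form used here) shows the total can drop below zero once the raw task loss is small:

```python
import math

# Uncertainty-weighted per-task loss in log-variance form:
#   f(s) = exp(-s) * task_loss + s,  with s = log(sigma^2)
# The additive term s can be negative, so the weighted loss
# can legitimately be negative even though task_loss >= 0.
def weighted_task_loss(task_loss, log_var):
    return math.exp(-log_var) * task_loss + log_var

# Minimizing over log_var gives log_var = log(task_loss),
# where the weighted loss equals 1 + log(task_loss) —
# negative whenever task_loss < 1/e ≈ 0.368.
task_loss = 0.1
best_log_var = math.log(task_loss)
print(weighted_task_loss(task_loss, best_log_var))  # 1 + log(0.1) ≈ -1.30
```

So a negative total loss is not a sign of a broken implementation; it simply means the per-task losses have become small relative to the learned variances.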

Your implementation is slightly different from the formula in the paper: for example, the paper's denominator has sigma squared, while yours doesn't. ![image](https://user-images.githubusercontent.com/65807337/143679807-ffe6c7f8-e621-422b-9692-1fe6052eff74.png)
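Part of the apparent discrepancy may just be a change of variables: the paper's term `L/(2*sigma^2) + log(sigma)` and the log-variance form `exp(-s)*L + s` with `s = log(sigma^2)` differ only by an overall factor of 2, which doesn't change the minimizer. A quick numerical check (assuming the implementation uses that common log-variance parameterization) makes this concrete:

```python
import math

def paper_term(task_loss, sigma):
    # Term as written in Kendall et al.: 1/(2*sigma^2) * L + log(sigma)
    return task_loss / (2 * sigma ** 2) + math.log(sigma)

def logvar_term(task_loss, log_var):
    # Common implementation: exp(-log_var) * L + log_var,
    # where log_var = log(sigma^2)
    return math.exp(-log_var) * task_loss + log_var

sigma, L = 0.7, 2.5
s = math.log(sigma ** 2)
# exp(-s)*L + s = L/sigma^2 + log(sigma^2) = 2 * (L/(2*sigma^2) + log(sigma))
print(logvar_term(L, s), 2 * paper_term(L, sigma))  # identical values
```

Since a constant factor on the total loss only rescales gradients, the two forms lead to the same optimum; any remaining difference from the paper would have to come from dropped constants like the 1/2, not from this substitution.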

How can training be parallelized across multiple GPUs?
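One straightforward option is to wrap the model in `torch.nn.DataParallel`, which splits each batch across all visible GPUs (and transparently falls back to single-device execution when none are available). A minimal sketch with a hypothetical two-head multi-task model (the `MultiTaskNet` class below is illustrative, not the repo's actual model):

```python
import torch
import torch.nn as nn

# Hypothetical two-task model, just for illustration
class MultiTaskNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.shared = nn.Linear(8, 16)  # shared trunk
        self.head1 = nn.Linear(16, 1)   # task-1 head
        self.head2 = nn.Linear(16, 1)   # task-2 head

    def forward(self, x):
        h = torch.relu(self.shared(x))
        return self.head1(h), self.head2(h)

model = MultiTaskNet()
# DataParallel scatters each input batch across all visible GPUs and
# gathers the outputs; on a CPU-only machine it simply calls the module.
model = nn.DataParallel(model)

x = torch.randn(4, 8)
y1, y2 = model(x)
print(y1.shape, y2.shape)
```

For serious multi-GPU training, `torch.nn.parallel.DistributedDataParallel` (one process per GPU) is the approach PyTorch recommends over `DataParallel`, but it requires launching with `torchrun` and a distributed sampler, so it is more setup than can be sketched here.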