Support for Loss Functions (Symmetric Loss function)
Summary
Introduce Symmetric Loss functions for the ML model.
We are ultimately looking to support loss functions beyond those available in PyTorch and to give users the flexibility to plug in their own custom loss function. This issue is not only about adding more loss functions; it also asks for a better interface that lets users bring in their own custom loss function and have it integrate with the models we have today.
Use Cases
Message from the maintainers:
If you wish to see this enhancement implemented, please add a 👍 reaction to this issue! We often sort issues this way to know what to prioritize.
Can you please explain why we would need this? @s0nicboOm
A symmetric loss function weights positive and negative errors (outliers) equally. Currently we only use absolute error (L1) and MSE.
There is an opportunity here to introduce more such loss functions.
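To make the distinction concrete, here is a minimal sketch (not from the codebase) showing that L1 penalizes an error of +e and -e identically, while an asymmetric loss such as a quantile/pinball loss does not:

```python
import torch
import torch.nn.functional as F

target = torch.zeros(3)
over = torch.tensor([1.0, 2.0, 3.0])  # predictions above the target (+e)
under = -over                         # predictions below the target (-e)

# Symmetric: L1 (and MSE) penalize over- and under-prediction equally.
print(F.l1_loss(over, target))   # tensor(2.)
print(F.l1_loss(under, target))  # tensor(2.)

# Asymmetric counter-example: a quantile (pinball) loss with q = 0.9
# weights errors in one direction much more heavily than the other.
def pinball_loss(pred: torch.Tensor, target: torch.Tensor, q: float = 0.9) -> torch.Tensor:
    err = target - pred
    return torch.mean(torch.maximum(q * err, (q - 1) * err))

print(pinball_loss(over, target))   # tensor(0.2000)
print(pinball_loss(under, target))  # tensor(1.8000)
```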
Hey, I would like to try resolving this issue.
Hi, if I have understood it right, you want me to add support for symmetric loss functions in the _init_criterion method here: https://github.com/numaproj/numalogic/blob/main/numalogic/models/vae/base.py
```python
from typing import Callable

import torch.nn.functional as F


def _init_criterion(loss_fn: str) -> Callable:
    # Map the configured loss name to the corresponding torch.nn.functional loss.
    if loss_fn == "huber":
        return F.huber_loss
    if loss_fn == "l1":
        return F.l1_loss
    if loss_fn == "mse":
        return F.mse_loss
    raise ValueError(f"Unsupported loss function provided: {loss_fn}")
```
For example, something like the sigmoid loss from torch? @vigith @s0nicboOm
torch.nn.functional has 21 loss functions, among which 6 are symmetric loss functions; 3 of these are already supported:
- L1 Loss (l1_loss)
- Mean Squared Error Loss (mse_loss)
- Huber Loss (huber_loss)
The following need to be added (a possible sketch is shown after this list):
- Poisson Negative Log Likelihood Loss (poisson_nll_loss)
- Gaussian Negative Log Likelihood Loss (gaussian_nll_loss)
- Smooth L1 Loss (smooth_l1_loss)
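A minimal sketch of how _init_criterion could be extended with these losses, assuming a simple name-to-function registry instead of the current if-else chain (this is only one possible shape, and the key names are illustrative):

```python
from typing import Callable

import torch.nn.functional as F

_LOSS_REGISTRY: dict[str, Callable] = {
    # already supported
    "huber": F.huber_loss,
    "l1": F.l1_loss,
    "mse": F.mse_loss,
    # symmetric losses proposed in this issue
    "poisson_nll": F.poisson_nll_loss,
    "gaussian_nll": F.gaussian_nll_loss,  # note: also expects a `var` argument, so it may need a wrapper
    "smooth_l1": F.smooth_l1_loss,
}


def _init_criterion(loss_fn: str) -> Callable:
    try:
        return _LOSS_REGISTRY[loss_fn]
    except KeyError:
        raise ValueError(f"Unsupported loss function provided: {loss_fn}") from None
```

The registry is just one option; the existing if-else chain would work equally well, the dict simply makes later additions a one-line change.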
Hi @rum1887!
We are ultimately looking to support loss functions beyond those available in PyTorch and to give users the flexibility to plug in their own custom loss function. This issue is not only about adding more loss functions; it also asks for a better interface that lets users bring in their own custom loss function and have it integrate with the models we have today.
Take the example of a custom loss function that a user wants to use. In that case we are limited by the if-else structure and cannot offer the user that flexibility. So we do need support for custom loss functions, which involves designing a new interface. Hope that makes sense. Also, it should be noted that loss_fn is used in two places today: 1) training and 2) calculating the reconstruction/prediction error.
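As a rough sketch of what such an interface could look like (hypothetical, not the final design): _init_criterion could accept either a string key or a user-supplied callable, and the resulting criterion would then be reused in both the training step and the reconstruction-error computation.

```python
from typing import Callable, Union

import torch
import torch.nn.functional as F


def _init_criterion(loss_fn: Union[str, Callable]) -> Callable:
    # Accept a user-supplied callable directly...
    if callable(loss_fn):
        return loss_fn
    # ...or fall back to the built-in named losses.
    if loss_fn == "huber":
        return F.huber_loss
    if loss_fn == "l1":
        return F.l1_loss
    if loss_fn == "mse":
        return F.mse_loss
    raise ValueError(f"Unsupported loss function provided: {loss_fn}")


# Hypothetical usage with a custom (symmetric) log-cosh loss:
def log_cosh_loss(input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    return torch.mean(torch.log(torch.cosh(input - target)))

criterion = _init_criterion(log_cosh_loss)
```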
Thanks for the question. Let me add it to the issue summary for better clarity.