
Support for Loss Functions (Symmetric Loss function)

Open s0nicboOm opened this issue 3 years ago • 6 comments

Summary

Introduce Symmetric Loss functions for the ML model.

We are ultimately looking to support loss functions beyond those built into PyTorch, and to give users the flexibility to plug in their own custom loss functions. This issue is not only about adding more loss functions; it also asks for a better interface that lets users bring in a custom loss function that integrates with the models we have today.

Use Cases


Message from the maintainers:

If you wish to see this enhancement implemented please add a 👍 reaction to this issue! We often sort issues this way to know what to prioritize.

s0nicboOm avatar Nov 01 '22 20:11 s0nicboOm

Can you please explain why we would need this? @s0nicboOm

ab93 avatar Nov 02 '22 15:11 ab93

Can you please explain why we would need this? @s0nicboOm

A symmetric loss function is one that weights positive and negative outliers equally. Currently we use absolute error and MSE.

There is an opportunity here to introduce more such loss functions.

s0nicboOm avatar Nov 04 '22 07:11 s0nicboOm
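The definition above can be made concrete with a small sketch (plain Python; the helper names are illustrative, not numalogic APIs). MAE and MSE penalize a residual of +e and -e identically, whereas something like the quantile (pinball) loss does not:

```python
# A loss is symmetric when it penalizes over- and under-prediction of the
# same magnitude equally: loss(e) == loss(-e) for any residual e.

def mae(residual):
    # Absolute error: symmetric.
    return abs(residual)

def mse(residual):
    # Squared error: symmetric.
    return residual ** 2

def pinball(residual, q=0.9):
    # Quantile (pinball) loss: asymmetric whenever q != 0.5.
    return q * residual if residual >= 0 else (q - 1) * residual

e = 2.0
assert mae(e) == mae(-e)          # symmetric
assert mse(e) == mse(-e)          # symmetric
assert pinball(e) != pinball(-e)  # asymmetric: 1.8 vs 0.2
```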

Hey, I would like to try resolving this issue.

haripriyajk avatar Sep 21 '23 17:09 haripriyajk

Hi, if I have understood it right, you want me to add support for symmetric functions here in the _init_criterion method: https://github.com/numaproj/numalogic/blob/main/numalogic/models/vae/base.py

from typing import Callable

import torch.nn.functional as F


def _init_criterion(loss_fn: str) -> Callable:
    # Map the configured name to the corresponding torch.nn.functional loss.
    if loss_fn == "huber":
        return F.huber_loss
    if loss_fn == "l1":
        return F.l1_loss
    if loss_fn == "mse":
        return F.mse_loss
    raise ValueError(f"Unsupported loss function provided: {loss_fn}")

like the sigmoid loss from torch @vigith @s0nicboOm

rum1887 avatar Oct 08 '23 10:10 rum1887

torch.nn.functional has 21 loss functions, of which 6 are symmetric; 3 are already supported:

  1. L1 Loss (l1_loss)
  2. Mean Squared Error Loss (mse_loss)
  3. Huber Loss (huber_loss)

We need to add the following:

  1. Poisson Negative Log Likelihood Loss (poisson_nll_loss)
  2. Gaussian Negative Log Likelihood Loss (gaussian_nll_loss)
  3. Smooth L1 Loss (smooth_l1_loss)

rum1887 avatar Oct 08 '23 11:10 rum1887


Hi @rum1887 !

We are ultimately looking to support loss functions beyond those built into PyTorch, and to give users the flexibility to plug in their own custom loss functions. This issue is not only about adding more loss functions; it also asks for a better interface that lets users bring in a custom loss function that integrates with the models we have today.

Take the example of a custom loss function that a user wants to use. In this case we are limited by the if-else structure and cannot provide that flexibility, so we do need support for custom loss functions, which involves designing a new interface. Hope that makes sense. Also note that loss_fn is used in two places today: 1) training and 2) calculating the reconstruction/prediction error.

Thanks for the question. Let me add it to the issue summary for better clarification.

s0nicboOm avatar Oct 12 '23 04:10 s0nicboOm