Add SELU Activation Function Support to the Torch/Ivy Library
I would like to request the addition of support for the SELU (Scaled Exponential Linear Unit) activation function in the Torch frontend of the Ivy library. SELU is known for its self-normalizing property: it keeps activation means and variances close to fixed values during training, which can improve the performance of deep neural networks.
The requested function, selu(x, alpha=1.6732632423543772848170429916717, scale=1.0507009873554804934193349852946), applies the SELU activation element-wise to the input tensor x. It can be implemented with ivy.where, selecting between the linear and exponential branches based on the sign of x and scaling the result with the given alpha and scale values.
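A minimal sketch of what such a frontend function could look like, assuming the standard SELU formula scale * (x if x > 0 else alpha * (exp(x) - 1)); the function name and defaults mirror the request above and are not an exact reproduction of any existing Ivy code:

```python
import ivy


def selu(
    x,
    alpha=1.6732632423543772848170429916717,
    scale=1.0507009873554804934193349852946,
):
    # Linear branch for positive inputs, exponential branch otherwise:
    # SELU(x) = scale * x                     if x > 0
    #         = scale * alpha * (exp(x) - 1)  if x <= 0
    ret = ivy.where(x > 0, x, alpha * (ivy.exp(x) - 1))
    return scale * ret


# Example usage on a small array
print(selu(ivy.array([-1.0, 0.0, 1.0])))
```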
Adding SELU support to the Torch frontend of Ivy would give users a convenient and efficient way to incorporate this activation function into their deep learning models, and would benefit researchers and practitioners who rely on it for improved model performance.
If you are working on an open task, please edit the PR description to link to the issue you've created.
For more information, please check the ToDo List Issues Guide.
Thank you :hugs:
Hey @program-20! Thanks for opening this PR! Is there an open ToDo list issue for the torch frontend that contains the selu function? If yes, could you please link it in the PR.
I just checked, and selu already exists in the torch frontend here --> https://github.com/unifyai/ivy/blob/master/ivy/functional/frontends/torch/nn/functional/non_linear_activation_functions.py#L179