Addition of all Activation Functions.
Hi everyone, I have compiled a list of all activation functions that are currently not implemented in mlpack but can be found in either TensorFlow or PyTorch:
- ~~SELU~~
- CELU
- GELU (Currently taken up by @himanshupathak21061998 )
- Hard shrink
- Lisht ( I have currently taken up this issue)
- Soft shrink (Currently taken up by @ojhalakshya)
- ISRU (Inverse Square Root Unit)
- Inverse Square Root Linear Unit (ISRLU)
- Square Non-Linearity
I might have missed some functions; feel free to add them to the list. If anyone would like to take up any of the above functions, please feel free to do so. I hope this is okay with the members of the organisation; this was done to reduce the effort of finding unimplemented functions as well as to bring state-of-the-art activation functions to mlpack. In case I missed something or added an activation that has already been implemented, please forgive me. Thanks.
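For anyone picking one of these up, below is a minimal self-contained sketch of the forward value and derivative for one of the listed functions, LiSHT (f(x) = x · tanh(x)). The function names and standalone layout are only illustrative, not mlpack's actual activation-function interface (see the existing functions under src/mlpack/methods/ann/ for that), so treat this as a sketch of the math rather than a drop-in file.

```cpp
#include <cmath>
#include <iostream>

// LiSHT forward pass: f(x) = x * tanh(x).
double LishtFn(const double x)
{
  return x * std::tanh(x);
}

// LiSHT derivative: f'(x) = tanh(x) + x * (1 - tanh(x)^2).
double LishtDeriv(const double x)
{
  const double t = std::tanh(x);
  return t + x * (1.0 - t * t);
}

int main()
{
  const double xs[] = { -2.0, -0.5, 0.0, 0.5, 2.0 };
  for (const double x : xs)
  {
    std::cout << "f(" << x << ") = " << LishtFn(x)
              << ", f'(" << x << ") = " << LishtDeriv(x) << std::endl;
  }
  return 0;
}
```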
Hi @kartikdutt18, that's a very good initiative to take. As I am currently working on the soft shrink function and have almost completed the work there, is it okay if I take the hard shrink function as well?
Hi @kartikdutt18, I'll work on the implementation of CELU. Is it okay to move forward with it?
> Hi @kartikdutt18, that's a very good initiative to take. As I am currently working on the soft shrink function and have almost completed the work there, is it okay if I take the hard shrink function as well?
Feel free to do so. Thanks.
> Hi @kartikdutt18, I'll work on the implementation of CELU. Is it okay to move forward with it?
Feel free to do so. Thanks.
I'll be working on the ISRU function.
> I'll be working on the ISRU function.
Great. Thanks.
Is ELU implemented? If not, I'll work on it.
Hi @gaurav-singh1998, the ELU function is already implemented in the src/mlpack/methods/ann/layer folder of mlpack.
@kartikdutt18 thanks for opening the issue, just added some tags, so perhaps we can remove the Good First Issue from the title.
Hi @zoq, I have removed them. Thanks.
I'll be picking up ISRLU.
@kartikdutt18 Looking at the activation functions, SELU is already implemented:
https://github.com/mlpack/mlpack/blob/928aee24c75f4227b5c1702aeb0acdc9aaf486e0/src/mlpack/methods/ann/layer/elu.hpp#L263
@zoq, Sorry I missed that, I will remove it from the list.
I will work on the Inverse Square Root Linear Unit if no one else is doing that.
@kartikdutt18 since ISRLU is defined piecewise over different ranges, should I use if statements to handle the different ranges?
Hi @PranavReddyP16, yes, that should be fine. You can look at the softplus, softsign, rectifier, or identity functions for examples; I think they all use if conditions. Thanks.
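For reference, one way the piecewise definition could be written with if statements is sketched below. This is a self-contained illustration of the math only; the default α = 1.0 and the function names are assumptions, and the actual mlpack class layout should follow the existing softplus/softsign files.

```cpp
#include <cmath>
#include <iostream>

// ISRLU forward pass:
//   f(x) = x                          for x >= 0
//   f(x) = x / sqrt(1 + alpha * x^2)  for x <  0
double IsrluFn(const double x, const double alpha = 1.0)
{
  if (x >= 0.0)
    return x;
  return x / std::sqrt(1.0 + alpha * x * x);
}

// ISRLU derivative:
//   f'(x) = 1                              for x >= 0
//   f'(x) = (1 / sqrt(1 + alpha * x^2))^3  for x <  0
double IsrluDeriv(const double x, const double alpha = 1.0)
{
  if (x >= 0.0)
    return 1.0;
  const double inv = 1.0 / std::sqrt(1.0 + alpha * x * x);
  return inv * inv * inv;
}

int main()
{
  const double xs[] = { -2.0, -0.5, 0.0, 0.5, 2.0 };
  for (const double x : xs)
  {
    std::cout << "ISRLU(" << x << ") = " << IsrluFn(x)
              << ", ISRLU'(" << x << ") = " << IsrluDeriv(x) << std::endl;
  }
  return 0;
}
```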
Hi @kartikdutt18, I would like to work on the implementation of the Inverse Square Root Linear Unit. Can I work on it?
Hi @codeboy5, I think @PranavReddyP16 is working on that. (I will add some other good first issues that will require help; if you want to contribute, you can work on one of those or find some other interesting issue.) Thanks.
@kartikdutt18 Thanks, please do that. Is any activation function still unoccupied? I recently read about the swish activation function; has someone implemented that previously?
Hi @codeboy5, I don't think so. And yes swish has already been implemented. I think the above list contains all activation functions that haven't been implemented.
@kartikdutt18 Thanks a lot. Please add some more issues to work on.
Will do. @ojhalakshya and I are currently compiling another list of functions that need to be added; hopefully it will be posted today or tomorrow.
Hi @kartikdutt18, is it okay if I work on the ISRLU activation function?
Hi @prince776, I think @PranavReddyP16 has implemented it, but there was some mix-up in branches, so there isn't a PR open. I think he could give you a better reply.
Hi @PranavReddyP16, have you implemented, or are you planning to implement, ISRLU?
Hey, yeah, I have implemented it, but I need to fix something with my PR that I couldn't get to because I've been busy with college work. I'll most likely be able to do it today.
Ok, then I'll leave this to you. Good luck with it.
Thanks :)
Hi @birm, can I work on adding the bipolar sigmoid activation function? I've seen it in a paper and in some class slides on neural networks.
The benefit of an open-source project such as this is that you don't have to ask permission! :) If you want to, just put in a PR and let us know when you'd like it to be reviewed.
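For completeness, here is a minimal sketch of the bipolar sigmoid mentioned above, f(x) = (1 - e^(-x)) / (1 + e^(-x)), which is equivalent to tanh(x / 2). The function names and standalone layout are illustrative assumptions rather than mlpack's actual interface.

```cpp
#include <cmath>
#include <iostream>

// Bipolar sigmoid forward pass: maps the real line to (-1, 1).
double BipolarSigmoidFn(const double x)
{
  return (1.0 - std::exp(-x)) / (1.0 + std::exp(-x));
}

// Derivative expressed through the output: f'(x) = (1 - f(x)^2) / 2.
double BipolarSigmoidDeriv(const double x)
{
  const double y = BipolarSigmoidFn(x);
  return (1.0 - y * y) / 2.0;
}

int main()
{
  const double xs[] = { -3.0, -1.0, 0.0, 1.0, 3.0 };
  for (const double x : xs)
  {
    std::cout << "f(" << x << ") = " << BipolarSigmoidFn(x)
              << ", f'(" << x << ") = " << BipolarSigmoidDeriv(x) << std::endl;
  }
  return 0;
}
```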