Addition of all Activation Functions.

Open kartikdutt18 opened this issue 5 years ago • 88 comments

Hi everyone, I have compiled a list of all the activation functions that are not currently implemented in mlpack but can be found in either TensorFlow or PyTorch (their usual definitions are sketched just after the list):

  1. ~~SELU~~
  2. CELU
  3. GELU (currently taken up by @himanshupathak21061998)
  4. Hard shrink
  5. LiSHT (I have currently taken this one up)
  6. Soft shrink (currently taken up by @ojhalakshya)
  7. ISRU (Inverse Square Root Unit)
  8. Inverse Square Root Linear Unit (ISRLU)
  9. Square Non Linearity (SQNL)
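
For reference, the usual definitions of these functions from the papers that introduced them are below (α and λ are hyperparameters; this is only a summary of the literature, not of any particular implementation):

```latex
\begin{aligned}
\text{CELU}(x) &= \max(0, x) + \min\bigl(0,\ \alpha(e^{x/\alpha} - 1)\bigr) \\
\text{GELU}(x) &= x\,\Phi(x) \approx 0.5\,x\bigl(1 + \tanh(\sqrt{2/\pi}\,(x + 0.044715\,x^3))\bigr) \\
\text{HardShrink}(x) &= \begin{cases} x & |x| > \lambda \\ 0 & \text{otherwise} \end{cases} \\
\text{LiSHT}(x) &= x\tanh(x) \\
\text{SoftShrink}(x) &= \begin{cases} x - \lambda & x > \lambda \\ x + \lambda & x < -\lambda \\ 0 & \text{otherwise} \end{cases} \\
\text{ISRU}(x) &= x / \sqrt{1 + \alpha x^2} \\
\text{ISRLU}(x) &= \begin{cases} x & x \ge 0 \\ x / \sqrt{1 + \alpha x^2} & x < 0 \end{cases} \\
\text{SQNL}(x) &= \begin{cases} 1 & x > 2 \\ x - x^2/4 & 0 \le x \le 2 \\ x + x^2/4 & -2 \le x < 0 \\ -1 & x < -2 \end{cases}
\end{aligned}
```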

I might have missed some functions, so feel free to add them to the list. If anyone would like to take up any of the above functions, please feel free to do so. I hope this is okay with the members of the organisation; the idea is to reduce the effort of finding unimplemented functions and to bring state-of-the-art activation functions to mlpack. If I missed something or added an activation that has already been implemented, please forgive me. Thanks.

kartikdutt18 avatar Feb 06 '20 13:02 kartikdutt18

Hi @kartikdutt18, that's a very good initiative. As I am currently working on the soft shrink function and have almost completed it, is it okay if I take the Hard Shrink function as well?

ojhalakshya avatar Feb 06 '20 13:02 ojhalakshya

Hi @kartikdutt18, I'll work on the implementation of CELU. Is it okay to move forward with it?

gaurav-singh1998 avatar Feb 06 '20 13:02 gaurav-singh1998

Hi @kartikdutt18, that's a very good initiative. As I am currently working on the soft shrink function and have almost completed it, is it okay if I take the Hard Shrink function as well?

Feel free to do so. Thanks.

kartikdutt18 avatar Feb 06 '20 14:02 kartikdutt18

Hi @kartikdutt18, I'll work on the implementation of CELU. Is it okay to move forward with it?

Feel free to do so. Thanks.

kartikdutt18 avatar Feb 06 '20 14:02 kartikdutt18

I'll be working on 7), the ISRU function.

prince776 avatar Feb 06 '20 17:02 prince776

I'll be working on 7), the ISRU function.

Great. Thanks.

kartikdutt18 avatar Feb 06 '20 17:02 kartikdutt18

Is ELU implemented? If not, I'll work on it.

gaurav-singh1998 avatar Feb 06 '20 17:02 gaurav-singh1998

Hi @gaurav-singh1998, the ELU function is implemented in the src/mlpack/methods/ann/layer folder of mlpack.

kartikdutt18 avatar Feb 06 '20 17:02 kartikdutt18

@kartikdutt18 thanks for opening the issue; I just added some tags, so perhaps we can remove "Good First Issue" from the title.

zoq avatar Feb 06 '20 18:02 zoq

Hi @zoq, I have removed them. Thanks.

kartikdutt18 avatar Feb 06 '20 18:02 kartikdutt18

I'll be picking up 8) ISRLU.

vss96 avatar Feb 06 '20 18:02 vss96

@kartikdutt18 Looking at the activation functions, SELU is already implemented:

https://github.com/mlpack/mlpack/blob/928aee24c75f4227b5c1702aeb0acdc9aaf486e0/src/mlpack/methods/ann/layer/elu.hpp#L263

zoq avatar Feb 06 '20 18:02 zoq

@zoq, sorry, I missed that; I will remove it from the list.

kartikdutt18 avatar Feb 07 '20 02:02 kartikdutt18

I will work on Inverse Square Root Linear if no one is doing that.

PranavReddyP16 avatar Feb 08 '20 07:02 PranavReddyP16

@kartikdutt18 since ISRLU has different values for different ranges, should I put if statements to handle the different ranges?

PranavReddyP16 avatar Feb 10 '20 06:02 PranavReddyP16

Hi @PranavReddyP16, yes, that should be fine. You can look at the softplus, softsign, rectifier, or identity functions for examples; I think they all use if conditions. Thanks.

kartikdutt18 avatar Feb 10 '20 06:02 kartikdutt18
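
To make the piecewise idea concrete, here is a minimal standalone sketch of ISRLU and its derivative in plain C++ (the function names and the default α below are illustrative, not mlpack's actual layer API):

```cpp
#include <cmath>

// ISRLU: f(x) = x for x >= 0, and x / sqrt(1 + alpha * x^2) for x < 0.
double ISRLU(const double x, const double alpha = 1.0)
{
  if (x >= 0.0)
    return x;
  return x / std::sqrt(1.0 + alpha * x * x);
}

// Derivative, needed for backpropagation: 1 for x >= 0,
// and (1 + alpha * x^2)^(-3/2) for x < 0.
double ISRLUDeriv(const double x, const double alpha = 1.0)
{
  if (x >= 0.0)
    return 1.0;
  const double d = 1.0 / std::sqrt(1.0 + alpha * x * x);
  return d * d * d;
}
```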

Hi @kartikdutt18, I would like to work on the implementation of Inverse Square Root Linear. Can I work on it?

codeboy5 avatar Feb 11 '20 03:02 codeboy5

Hi @codeboy5, I think @PranavReddyP16 is working on that. (I will add some other good first issues that require help; if you want to contribute, you can work on one of those or find some other interesting issue.) Thanks.

kartikdutt18 avatar Feb 11 '20 03:02 kartikdutt18

@kartikdutt18 Thanks, please do that. Is any activation function still unclaimed? I recently read about the swish activation function; has someone implemented it previously?

codeboy5 avatar Feb 11 '20 04:02 codeboy5

Hi @codeboy5, I don't think so. And yes, swish has already been implemented. I think the above list contains all the activation functions that haven't been implemented yet.

kartikdutt18 avatar Feb 11 '20 04:02 kartikdutt18
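
For readers who haven't seen it, swish is commonly defined in the literature as follows (this is the standard definition, not a pointer to mlpack's implementation):

```latex
\text{swish}(x) = x \cdot \sigma(x) = \frac{x}{1 + e^{-x}}
```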

@kartikdutt18 Thanks a lot. Please add some more issues to work on.

codeboy5 avatar Feb 11 '20 04:02 codeboy5

Will do. @ojhalakshya and I are currently compiling another list of functions that need to be added; hopefully it will be posted today or tomorrow.

kartikdutt18 avatar Feb 11 '20 06:02 kartikdutt18

Hi @kartikdutt18, is it okay if I work on the ISRLU activation function?

prince776 avatar Feb 17 '20 08:02 prince776

Hi @prince776, I think @PranavReddyP16 has implemented it, but there was some mix-up with branches, so there isn't a PR open. He could give you a better answer.

kartikdutt18 avatar Feb 17 '20 08:02 kartikdutt18

Hi @PranavReddyP16, have you implemented, or are you planning to implement, ISRLU?

prince776 avatar Feb 17 '20 11:02 prince776

Hey, yeah, I have implemented it but need to fix something with my PR that I couldn't get to because I've been busy with college work. I'll most likely be able to do it today.

PranavReddyP16 avatar Feb 17 '20 11:02 PranavReddyP16

Hey, yeah, I have implemented it but need to fix something with my PR that I couldn't get to because I've been busy with college work. I'll most likely be able to do it today.

Ok, then I'll leave this to you. Good luck with it.

prince776 avatar Feb 17 '20 11:02 prince776

Thanks :)

PranavReddyP16 avatar Feb 17 '20 11:02 PranavReddyP16

Hi @birm, can I work on adding the bipolar sigmoid activation function? I've seen it in a paper and in some class slides on neural networks.

codeboy5 avatar Feb 18 '20 10:02 codeboy5
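
For context, the bipolar sigmoid is typically defined in the literature as follows (this is the textbook definition, equivalent to a rescaled tanh, not a claim about any specific implementation):

```latex
f(x) = \frac{1 - e^{-x}}{1 + e^{-x}} = \tanh\!\left(\frac{x}{2}\right)
```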

The benefit of an open source project such as this is that you don't have to ask permission! :) If you want to, just put in a PR and let us know when you'd like it to be reviewed.

birm avatar Feb 18 '20 15:02 birm