RMNet
About the activation function
As described in the paper, the RESERVING operation should convert the activation function of the input feature to a PReLU with its parameter set to 1. However, I cannot find the corresponding implementation in your code. Is it equivalent to use ReLU?
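For context, a PReLU with slope parameter a = 1 reduces to the identity function, whereas ReLU zeroes out negative inputs, so the two only agree when the input is non-negative. A minimal sketch in plain Python (the function names here are just for illustration):

```python
def prelu(x, a=1.0):
    # PReLU: f(x) = x for x >= 0, a * x otherwise.
    # With a = 1.0 this is exactly the identity function.
    return x if x >= 0 else a * x

def relu(x):
    # ReLU: f(x) = max(0, x).
    return x if x >= 0 else 0.0

# The two agree on non-negative inputs...
print(prelu(2.0), relu(2.0))    # 2.0 2.0
# ...but differ on negative inputs: PReLU(a=1) preserves them, ReLU clips them.
print(prelu(-3.0), relu(-3.0))  # -3.0 0.0
```

This is presumably why the paper specifies PReLU with parameter 1 for the reserved channels: it lets them pass through unchanged, which a plain ReLU would not guarantee for negative activations.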