DropoutUncertaintyCaffeModels
Activations in convolutional layers
Hi!
First of all, thanks for open-sourcing the experiments.
I'm currently trying to reproduce the results using a different framework. I noticed that in the definition of the convolutional networks (the so-called "LeNet" in this repo), the convolutional layers have a linear activation function: there is no ReLU after them. Is this intended?
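To make the question concrete, here is a minimal sketch in PyTorch of the structure I mean, assuming a standard LeNet-style stack. The class name, layer sizes, and ReLU placement here are illustrative guesses for the sake of the question, not copied from this repo's prototxt files:

```python
import torch
import torch.nn as nn

# Hypothetical LeNet-style network illustrating the question: the conv
# layers feed straight into pooling with no ReLU in between (linear conv
# activations), while the fully connected layer does use a ReLU.
# Layer sizes are illustrative, not taken from the repo's model definitions.
class LeNetLinearConv(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 20, kernel_size=5),   # conv1: no ReLU afterwards
            nn.MaxPool2d(2),
            nn.Conv2d(20, 50, kernel_size=5),  # conv2: no ReLU afterwards
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(50 * 4 * 4, 500),  # assumes 28x28 input (e.g. MNIST)
            nn.ReLU(),                   # ReLU only on the hidden FC layer
            nn.Linear(500, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))
```

Is this linear-conv structure the intended one, or should there be a ReLU after each convolution as in the usual LeNet variants?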
Thanks, Mateusz