yaringal

15 comments by yaringal

Many thanks for the quick reply! To make sure I understand the logic in this function - is the following interpretation correct? Whenever the IMU error is not negligible, the...

It should be - I helped with the implementation!

Keras has a `training` flag (I think) that you can pass when calling the layer - this should keep dropout enabled at test time as well. Otherwise,...
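For illustration, a minimal sketch of that pattern in current TF/Keras (model sizes and the toy data are placeholders, not from the repo):

```python
import numpy as np
import tensorflow as tf

# Passing training=True when the Dropout layer is *called* (functional
# API) bakes the flag in, so the layer stays stochastic even during
# ordinary inference calls.
inputs = tf.keras.Input(shape=(10,))
h = tf.keras.layers.Dense(64, activation="relu")(inputs)
h = tf.keras.layers.Dropout(0.5)(h, training=True)  # dropout always on
outputs = tf.keras.layers.Dense(1)(h)
model = tf.keras.Model(inputs, outputs)

x = np.random.randn(4, 10).astype("float32")
preds = np.stack([model(x).numpy() for _ in range(10)])
print(preds.std(axis=0))  # non-zero: each pass used a fresh dropout mask
```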

Hi @jasonbunk, thanks for the pull request. I can merge it if you want, but I do not intend to maintain this repo in the long term. It's mostly for...

Thanks for opening an issue. TF / Theano / Keras keep changing, and I don't have the resources to keep the repo up to date - it's mostly for demonstration...

> making K number of forward passes in Keras, with the Dropout layer being active in a different way each time, and getting a large set of alternative predictions of...
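A hedged sketch of the K-forward-pass scheme described in that quote, assuming `stochastic_model` is any model that applies a fresh dropout mask on every call (e.g. built with `training=True` as above):

```python
import numpy as np

def mc_dropout_predict(stochastic_model, x, K=100):
    """Aggregate K stochastic forward passes into a predictive mean
    and a per-output uncertainty estimate (sample std)."""
    samples = np.stack([np.asarray(stochastic_model(x)) for _ in range(K)])
    return samples.mean(axis=0), samples.std(axis=0)
```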

Re-opening for people to see the answers above

That's because we reparametrise `Wz` (with `z ~ Bern(1-p)^K`) as `Wz/(1-p)` so that it has mean `W`. Then `K.square(weight)` picks up a factor of `1/(1-p)^2`, which cancels the `1-p`, giving `1/(1-p)`.
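Written out (a sketch in my notation, with p the drop probability as in Keras, and M the effective "paper" weight):

```latex
% z ~ Bern(1-p)^K, so the rescaled product has mean W:
\mathbb{E}\left[\tfrac{Wz}{1-p}\right] = \tfrac{(1-p)\,W}{1-p} = W
% Substituting the effective weight M = W/(1-p) into the
% KL-derived penalty (1-p)\|M\|^2 turns the coefficient into 1/(1-p):
(1-p)\,\|M\|^2 = (1-p)\,\tfrac{\|W\|^2}{(1-p)^2} = \tfrac{\|W\|^2}{1-p}
```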

Normal dropout upscales the feature vector by 1/(1-p) after dropping out units. We do the same, substituting W' = W/(1-p) into the model and the KL calculations. Y. Edit (2019): see lines ...

You can update N as the amount of data increases (have a look at [this](http://www.cs.ox.ac.uk/people/yarin.gal/website/publications.html#Gal2016Improving)). In effect, this will push the dropout p towards 0 in the limit of data.
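For concreteness, a hedged sketch of the N-dependent weight-decay coefficient in this line of work (lambda = l^2 (1-p) / (2 N tau), with p the drop probability; all names here are placeholders, not the repo's API):

```python
def dropout_weight_decay(l, p, N, tau=1.0):
    """lambda = l**2 * (1 - p) / (2 * N * tau).

    l: prior length-scale, p: drop probability, N: current dataset
    size, tau: model precision. Re-evaluating this as N grows shrinks
    the KL penalty relative to the data fit, pushing the optimal
    dropout rate towards 0 in the limit of data.
    """
    return l**2 * (1.0 - p) / (2.0 * N * tau)

# e.g. refresh the coefficient whenever more data arrives:
# lam = dropout_weight_decay(l=1e-2, p=0.1, N=len(dataset))
```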