Ramana Sundararaman
> Lasagne has changed a bit in the meantime such that the tensors should already be broadcastable

Could you please point to the code that you say has changed?...
I tested with the following [code](http://dpaste.com/1VK4K8B) after rebasing my branch onto the current master to mimic my scenario. It looks like I misunderstood your comment. Here, both the tensors have...
> Ah, yes, if you create the input variable yourself instead of letting InputLayer create it, then it will not be made broadcastable.

So, I should be getting the output...
Hi,

> Just using a T.tensor4() should work as well

This is what I have in my code, and it doesn't seem to work. I'm not sure if it is...
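For readers following this thread, the distinction under discussion is that Lasagne's `InputLayer` marks dimensions of size 1 in its `shape` as broadcastable when it creates the input variable itself, whereas a hand-made `T.tensor4()` has no broadcastable dimensions. A minimal sketch contrasting the two (the `(None, 1, 28, 28)` shape is hypothetical):

```python
import theano.tensor as T
from lasagne.layers import InputLayer

# Letting InputLayer create the variable: size-1 dimensions in `shape`
# are marked broadcastable.
l_in = InputLayer(shape=(None, 1, 28, 28))
print(l_in.input_var.broadcastable)  # (False, True, False, False)

# Creating the variable yourself: no dimension is broadcastable,
# regardless of the shapes you later feed in.
x = T.tensor4('x')
print(x.broadcastable)  # (False, False, False, False)

# One way to recover InputLayer's behaviour with a hand-made variable:
x_bc = T.patternbroadcast(x, (False, True, False, False))
l_in2 = InputLayer(shape=(None, 1, 28, 28), input_var=x_bc)
```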
I'll take this up. For the first two (w.r.t. your comment yesterday), we could probably add a `welcome contribution` tag? I know a couple of people who might be interested...
IIRC, before the batch norm PR was merged into Theano, there was a Lasagne layer. Can you explain a bit more about the numerical instability? I think `fast_compile` might...
> I need fast_compile or none in order to get the correct place for the error, but because the Lasagne BatchNorm now gives NaNs

Oh! Is it possible to share...
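For context, switching Theano to the `fast_compile` optimizer keeps the compiled graph close to what was written, so the error traceback points at the right node; `NanGuardMode` can additionally abort at the first NaN. A hedged sketch (the `T.log` here is only a stand-in for whatever op produces the NaNs):

```python
import theano
import theano.tensor as T

x = T.matrix('x')
y = T.log(x)  # stand-in for an op that may produce NaNs

# FAST_COMPILE skips most graph optimizations, preserving tracebacks.
fn = theano.function([x], y, mode='FAST_COMPILE')

# Alternatively, set it globally before compiling:
#   THEANO_FLAGS='optimizer=fast_compile,exception_verbosity=high'

# To stop at the first NaN/Inf instead:
from theano.compile.nanguardmode import NanGuardMode
fn_guarded = theano.function(
    [x], y,
    mode=NanGuardMode(nan_is_error=True, inf_is_error=True, big_is_error=True))
```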
> I wanted to avoid relying on Theano 0.9 before releasing Lasagne 0.2.

If I am not wrong, Theano is getting ready for its 0.10.0 release. Isn't it safe to rely...
So there are two tasks in this: one is to implement GANs in Recipes, and the other is to implement the `WeightNormLayer`, right? For inclusion in Recipes, aren't the models...
A small point of confusion regarding inclusion in Recipes: should the implementation follow the paper "Improved Techniques for Training GANs"?
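For reference, the `WeightNormLayer` discussed above corresponds to the weight-normalization reparameterization w = g · v / ||v|| from Salimans & Kingma (2016). A minimal Theano sketch of the reparameterization itself (the names `v` and `g` are illustrative, not Lasagne's actual layer API):

```python
import numpy as np
import theano
import theano.tensor as T

n_in, n_out = 100, 50
v = theano.shared(np.random.randn(n_in, n_out).astype('float32'), name='v')
g = theano.shared(np.ones(n_out, dtype='float32'), name='g')

# Normalize each column of v to unit norm, then rescale by the learned gain g.
w = g * v / T.sqrt(T.sum(T.sqr(v), axis=0))

x = T.matrix('x')
y = T.dot(x, w)  # a dense layer whose effective weights are weight-normalized
```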