Eben Olson
@benanne I'm going to go ahead and merge this and #13 if you don't mind. I can make a PR later to address f0k's comments if you don't have time.
If you're working on them further, that's great; let's wait till you're done. I just didn't want them to be stuck in limbo too long, since they seem quite good already....
Since pretrained models are nearly all we have in Recipes so far, I think it's a pretty good fit :) But maybe creating a new top-level directory for models...
I was also wondering about the best way to do sampling [here](https://github.com/Lasagne/Recipes/pull/15). Another common thing seems to be beam search; I'm not sure if that can be implemented easily in either...
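For reference, beam search itself is framework-agnostic and can be sketched in plain Python over a per-step scoring function; `step_fn`, `start_token`, and `eos_token` below are hypothetical names, not anything from the Recipes code:

```python
import heapq

def beam_search(step_fn, start_token, eos_token, beam_width=5, max_len=50):
    """Sketch of beam search. `step_fn(sequence)` is a hypothetical callable
    returning a list of (log_probability, token) candidates for the next step."""
    # Each beam entry is (cumulative log-probability, token sequence).
    beams = [(0.0, [start_token])]
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == eos_token:
                candidates.append((score, seq))  # finished sequence, carry over
                continue
            for log_p, token in step_fn(seq):
                candidates.append((score + log_p, seq + [token]))
        # Keep only the top-scoring sequences.
        beams = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
        if all(seq[-1] == eos_token for _, seq in beams):
            break
    return beams
```

The open question is mostly how to expose the per-step network computation cleanly, not the search loop itself.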
Can you elaborate on the shortcomings of using a macro function to create modules? I don't really see the problem.
Ok, I think I see your point now... If a module/block consists of a group of modifications applied to a (perhaps heavily parametrized) Layer, it's not very clean to have...
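To make the macro-function idea concrete, a hypothetical sketch assuming Lasagne's `Conv2DLayer` and the `batch_norm` helper (`conv_bn_block` is a made-up name):

```python
from lasagne.layers import Conv2DLayer, batch_norm
from lasagne.nonlinearities import rectify

def conv_bn_block(incoming, num_filters, **kwargs):
    """Hypothetical 'module' macro: convolution followed by batch normalization.

    Extra keyword arguments are passed straight through to Conv2DLayer, which is
    where this gets awkward for heavily parametrized blocks.
    """
    layer = Conv2DLayer(incoming, num_filters, filter_size=3, pad='same',
                        nonlinearity=rectify, **kwargs)
    return batch_norm(layer)
```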
`Conv2DDNNLayer` with stride 2 should be fine.
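A minimal sketch (the shapes are made up, and `Conv2DDNNLayer` needs cuDNN available):

```python
from lasagne.layers import InputLayer, get_output_shape
from lasagne.layers.dnn import Conv2DDNNLayer  # requires cuDNN

# Strided convolution used for downsampling in place of a pooling layer.
net = InputLayer(shape=(None, 3, 64, 64))
net = Conv2DDNNLayer(net, num_filters=32, filter_size=3, stride=2, pad=1)

print(get_output_shape(net))  # (None, 32, 32, 32): spatial dims halved
```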
While I think it would be amazing if it were possible to compose an arbitrary expression and get a `Layer` back, this seems like a pretty radical departure from the...
Possibly relevant thread on [r/machinelearning](https://www.reddit.com/r/MachineLearning/comments/462p41/pros_and_cons_of_keras_vs_lasagne_for_deep/) yesterday. I haven't looked at Keras' code myself in any detail, but some commenters felt that the dual backend made the source harder to understand....
I've been meaning to learn more about how Ops are implemented, so I'd be up for giving that a try if we want to go that way.
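For anyone else following along, the basic pattern from the Theano docs looks roughly like this (a toy Op that just doubles its input, not the Op we'd actually need):

```python
import theano
import theano.tensor as T

class DoubleOp(theano.Op):
    """Toy Op: multiplies its input by two, element-wise."""
    __props__ = ()

    def make_node(self, x):
        x = T.as_tensor_variable(x)
        return theano.Apply(self, [x], [x.type()])

    def perform(self, node, inputs, output_storage):
        # Plain numpy computation at graph-execution time.
        output_storage[0][0] = inputs[0] * 2

    def infer_shape(self, node, input_shapes):
        return input_shapes

    def grad(self, inputs, output_grads):
        return [output_grads[0] * 2]

x = T.matrix('x')
f = theano.function([x], DoubleOp()(x))
```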