Toy-Neural-Network-JS

Multi Hidden Layer Support for nn.js

Open · AR-234 opened this issue 7 years ago · 5 comments

Tested on MNIST and XOR with default settings and multiple hidden layers. The examples don't need to be rewritten.

These two calls do the same thing: `NeuralNetwork(2, 4, 1)` and `NeuralNetwork(2, [4], 1)`.

For more layers, add node counts to the hidden_nodes array, as in the sketch below.
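For illustration, a usage sketch of the constructor calls described above (the exact merged signature may differ):

```js
// Single hidden layer with 4 nodes: the old form and the new array
// form behave the same
const a = new NeuralNetwork(2, 4, 1);
const b = new NeuralNetwork(2, [4], 1);

// More hidden layers: add node counts to the hidden_nodes array,
// e.g. two hidden layers with 8 and 4 nodes
const c = new NeuralNetwork(2, [8, 4], 1);
```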

Thanks @xxMrPHDxx for helping to fix the serialize method <3

AR-234 · Feb 11 '18

This is really impressive, thank you! I think I'd like to implement this as a tutorial... it's similar to how I was thinking about encapsulating the "Layer" object. I'm trying to decide whether I should merge now or wait until I cover it in a video.

shiffman · Feb 11 '18

Thanks! This is my first GitHub journey; normally I don't release anything because I end up writing hacky command-line applications for myself.

I've updated it to the current branch. You still have the 10.18 version somewhere here; I'd really like to see your approach on that.

AR-234 · Feb 11 '18

This is similar to how I have done it in the past.

I think each layer should keep its working values both before and after activation, i.e. both the weighted sums and the activated outputs. This would allow two things: a wider variety of activation functions per #70, and reuse of the forward-pass code when training. Duplicating the same code in both the predict and train methods makes them very easy to accidentally break.
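To make that concrete, here is a minimal, self-contained sketch of such a Layer (plain arrays rather than the repo's Matrix class; all names here are illustrative, not the actual nn.js API):

```js
const sigmoid = x => 1 / (1 + Math.exp(-x));
const dsigmoid = y => y * (1 - y); // derivative expressed via the output

class Layer {
  constructor(inputCount, nodeCount, activation = sigmoid, derivative = dsigmoid) {
    // Random weights and biases in [-1, 1]
    this.weights = Array.from({ length: nodeCount }, () =>
      Array.from({ length: inputCount }, () => Math.random() * 2 - 1));
    this.bias = Array.from({ length: nodeCount }, () => Math.random() * 2 - 1);
    this.activation = activation;   // each layer can pick its own pair
    this.derivative = derivative;
  }

  forward(input) {
    this.input = input; // cached for backpropagation
    // Pre-activation weighted sums, kept so train() can reuse this pass
    this.sums = this.weights.map((row, i) =>
      row.reduce((acc, w, j) => acc + w * input[j], this.bias[i]));
    // Post-activation outputs, also kept for backpropagation
    this.outputs = this.sums.map(this.activation);
    return this.outputs;
  }
}
```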

Also, if you calculate all of the gradients first, the weights can potentially be updated in parallel.
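A sketch of that two-phase update, where `computeGradient` and `applyGradient` are hypothetical helpers (not functions from this repo):

```js
// Phase 1: compute every layer's gradient while the weights stay fixed
// (computeGradient is a hypothetical helper, not part of nn.js)
const gradients = layers.map(layer => computeGradient(layer));

// Phase 2: apply the updates; each one is independent of the others,
// so this loop could in principle run in parallel
gradients.forEach((grad, i) => applyGradient(layers[i], grad));
```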

I think the advantages outweigh the increased memory use.

Versatilus · Feb 14 '18

@Versatilus Yeah, I know, but since these changes aren't implemented yet, I'll wait for the master branch to update before I push any changes of that kind. Or at least, I don't know how to create a branch with merges from multiple other branches; I haven't used Git that much.

But I like it. Another problem is that the NeuralNetwork class is too tied to supervised learning; I'd like to refactor the training and predict parts as well.

AR-234 · Feb 14 '18

This is really nice. I've been playing with it a bit and have to say it works well.

Now that we have multiple hidden layers, any advice on how to implement dropout and LSTM?

LordCaox · Jul 14 '18
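As context for the dropout half of that question, here is a minimal sketch of inverted dropout on a layer's outputs (illustrative only, not code from this repo):

```js
// Inverted dropout: during training, zero each output with probability
// `rate` and scale the survivors by 1 / (1 - rate) so the expected
// activation stays the same; at predict time, use the outputs unchanged
function dropout(outputs, rate = 0.5) {
  return outputs.map(v => (Math.random() < rate ? 0 : v / (1 - rate)));
}
```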