owl
Freeze neuron weights during training
When constructing a neural network in TensorFlow, the user can choose to initialise a weight as a tf.Variable or a tf.constant. The latter means the weight will not be updated during training. This property is useful in some use cases, such as neural style transfer, so it would be great to have a similar "freeze" mechanism in Owl's NN module.
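To illustrate the requested behaviour, here is a minimal sketch in plain Python (not Owl or TensorFlow code; the `sgd_step` helper and parameter names are hypothetical) of a training step that skips parameters marked as frozen:

```python
# Hypothetical sketch of a "freeze" mechanism: a gradient-descent
# step that updates every parameter except those listed as frozen.

def sgd_step(params, grads, frozen, lr=0.1):
    """Return updated parameters; names in `frozen` keep their value."""
    return {
        name: (value if name in frozen else value - lr * grads[name])
        for name, value in params.items()
    }

params = {"w": 2.0, "b": 1.0}
grads = {"w": 0.5, "b": 0.5}

# Freeze "b": only "w" is moved by the gradient update.
updated = sgd_step(params, grads, frozen={"b"})
print(updated)  # "b" is unchanged, "w" has been updated
```

A freeze flag on a layer or weight in Owl's NN module could work the same way: the optimiser simply skips the update for any weight marked frozen, while gradients still flow through it during backpropagation (as they do for a tf.constant in the neural style transfer use case).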
@jzstark is this still relevant? or implemented already?
Sorry, my bad: this feature is not implemented yet.