kutoga

16 comments by kutoga

Is it already published? :) I would be very interested in reading it.

I am currently traveling and do not have access to the resulting files with the training accuracies for each epoch. Unfortunately, the training for MNIST really took...

This seems to be an issue with the target/loss function that is used. Could you post the last layer of your model and the loss function you use? Do you use binary cross-entropy? If you...
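For illustration only (the poster's actual model is not shown here), a minimal numpy sketch of binary cross-entropy makes the failure mode concrete: if the last layer produces a prediction of exactly 0 or 1, the loss contains log(0) and becomes nan unless predictions are clipped. The function name and epsilon value below are my own choices, not from the discussed code.

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=0.0):
    # Clip predictions away from exactly 0 and 1 to avoid log(0) = -inf.
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([1.0, 0.0, 1.0])  # fully saturated ("overconfident") predictions

print(binary_crossentropy(y_true, y_pred))            # unclipped: nan, from 0 * log(0)
print(binary_crossentropy(y_true, y_pred, eps=1e-7))  # clipped: small finite loss
```

Keras clips internally for exactly this reason, which is why a sigmoid output paired with binary cross-entropy normally stays finite.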

If you really just want to output a single number, then you could fix your network like this:
```
model += GInftlyLayer(
    'dfc1',
    w_regularizer=(c_l2, 1e-3),
    f_regularizer=(c_l2, f_reg),
    reweight_regularizer=False,
    f_layer=[...
```

I never used `sparse_categorical_crossentropy`, but I think it should just work with your data. The alternative is to use `to_categorical`. If you use `to_categorical`, you can just use the MNIST example...
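As a hedged illustration of what `to_categorical` does (a minimal numpy re-implementation for clarity, not Keras's actual code), it one-hot encodes integer class labels so they match a softmax output trained with `categorical_crossentropy`:

```python
import numpy as np

def to_categorical_sketch(labels, num_classes=None):
    """One-hot encode integer class labels, like keras.utils.to_categorical."""
    labels = np.asarray(labels, dtype=int)
    if num_classes is None:
        num_classes = labels.max() + 1
    one_hot = np.zeros((labels.size, num_classes))
    one_hot[np.arange(labels.size), labels] = 1.0
    return one_hot

# Binary yes/no labels become two-column one-hot rows: [1,0], [0,1], [0,1]
print(to_categorical_sketch([0, 1, 1]))
```

With `sparse_categorical_crossentropy` this conversion is unnecessary, since that loss accepts the integer labels directly.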

> to_categorical - so I'd get two weights, for yes/no? I'd rather get weights or distance than a 0/1 choice, so I can sort the matches to a photo. If...

Sorry for the late answer, but I was on holiday.

> Likely this is significant: "loss: nan - categorical_accuracy: 1.0000"
> Changing to binary_accuracy, which gives e.g. "loss: 0.2511...

> Using that, here is the plot:
>
> http://phobrain.com/pr/home/gallery/w_5e-05_1e-08.png

It doesn't look great, but at least the loss is going down. Do you use much data (and validation data)?...

I think the number of pairs should be sufficient. Yes, if you define 1 as the positive and 0 as the negative label, then the network does exactly this. As you mentioned,...

Btw: the code for the symmetric KL divergence is now available online: https://github.com/stdm/ZHAW_deep_voice/tree/master/networks/pairwise_kldiv
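As a rough sketch of the idea (not the linked repository's implementation), the symmetric KL divergence is simply the sum of the two directed divergences, which removes the asymmetry of plain KL; the epsilon smoothing below is my own assumption to keep the logarithm finite:

```python
import numpy as np

def kl(p, q, eps=1e-12):
    # Directed KL divergence D(P || Q) for discrete distributions.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sum(p * np.log((p + eps) / (q + eps)))

def symmetric_kl(p, q):
    # Symmetrized divergence: D(P || Q) + D(Q || P); zero iff P == Q.
    return kl(p, q) + kl(q, p)

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])
print(symmetric_kl(p, q) == symmetric_kl(q, p))  # symmetric by construction
```

This makes it usable as a pairwise distance between output distributions, e.g. for sorting matches as discussed above.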