SoundNet-tensorflow
question about kl loss
Thanks for your efforts! The KL loss is implemented as:

tf.reduce_mean(-tf.nn.softmax_cross_entropy_with_logits(logits=dist_a, labels=dist_b))

I wonder whether the negative sign should really be there. Also, the logits and labels assignments seem to differ from the first answer in http://stackoverflow.com/questions/41863814/kl-divergence-in-tensorflow. Did you intend dist_a / dist_b to serve as the labels? I hope you can help me with this. Thanks!
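For context, the identity behind that Stack Overflow answer is KL(p‖q) = H(p, q) − H(p): cross-entropy alone is not the KL divergence, and negating it flips the sign rather than subtracting the entropy term. A minimal NumPy sketch (illustrative only, not the repository's code; the logits values are made up) showing the relation:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D array of logits.
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical example logits standing in for dist_a / dist_b.
logits_a = np.array([1.0, 2.0, 0.5])
logits_b = np.array([0.2, 1.5, 1.0])
q = softmax(logits_a)  # distribution from the "logits" argument
p = softmax(logits_b)  # distribution from the "labels" argument

cross_entropy = -np.sum(p * np.log(q))  # H(p, q)
entropy = -np.sum(p * np.log(p))        # H(p)
kl = np.sum(p * np.log(p / q))          # KL(p || q)

# KL(p||q) = H(p, q) - H(p): cross-entropy overestimates KL by H(p),
# so -cross_entropy is neither KL(p||q) nor -KL(p||q) in general.
assert np.isclose(kl, cross_entropy - entropy)
```

Under this reading, the labels argument holds the target distribution p and the logits argument holds the unnormalized scores for q, which is why the assignment of dist_a / dist_b matters.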