Arthur

Results: 6 comments of Arthur

Same for me. Removing http.php and standard_8.php worked for me.

No success. I tried adagrad, and I tried 4 different datasets, including Russian LibriSpeech, and tried other methods from sota/2019: TDS, ResNet. In all cases I get `loss has NaN...

@tlikhomanenko, I can see that the am.arch files for LibriSpeech and LibriVox are quite different, e.g. [this](https://github.com/flashlight/wav2letter/blob/master/recipes/sota/2019/am_arch/am_transformer_ctc.arch) and [this](https://github.com/flashlight/wav2letter/blob/master/recipes/sota/2019/am_arch/am_transformer_ctc_librivox.arch). Are there any guides on how to choose values for layers...

I'm able to run training (using GPU) and no longer get `Loss has NaN values`, but my WER doesn't decrease; it stays around 99-101%.

Sorry @tlikhomanenko, the logs are already lost. I'll continue my experiments later (I think one of the reasons may be that the dataset I used is not accurate, and need...

Same error while installing on CentOS 7 ("Failed compiling object src/pre_generated-zmq.nobj.o")