Dāvis Nicmanis
@rafaelvalle is it possible to improve Flowtron's attention? I noticed the `use_cumm_attention` parameter, which enables `AttentionConditioningLayer`, but it is disabled by default. Could enabling it improve attention alignment? There are also...
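The general idea behind conditioning attention on cumulative alignments (which `AttentionConditioningLayer` does with a CNN over previous attention maps) is to discourage the model from re-attending positions it has already covered. A minimal numpy sketch of that simpler underlying heuristic, with an illustrative `strength` parameter; this is not Flowtron's actual layer:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_step(energies, cum_align, strength=1.0):
    """One decoder step of a cumulative-attention heuristic:
    subtract the running sum of past alignments from the energies,
    nudging attention forward over the encoder positions."""
    align = softmax(energies - strength * cum_align)
    return align, cum_align + align

# Toy example: flat energies, but positions 0-1 were attended before,
# so the new alignment should prefer the later positions.
energies = np.zeros(5)
cum = np.array([1.0, 0.8, 0.1, 0.0, 0.0])
align, cum = attention_step(energies, cum)
```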
@rafaelvalle Thank you for the answer. I am interested in trying out both the GMM and DCA attention models with Flowtron, although I'm not exactly sure about the adaptation part. Are...
@rafaelvalle I also noticed that Flowtron does not concatenate the attention context to the decoder input when passing it to the attention RNN, as Tacotron 2 does. Tacotron 2: ```...
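The wiring being described is that Tacotron 2's attention RNN sees the previous prenet output and the previous attention context concatenated into one vector. A minimal numpy sketch of just that step (dimensions and names are illustrative, not taken from either repository):

```python
import numpy as np

# Illustrative dimensions, not the repos' actual sizes.
prenet_dim, context_dim = 4, 6

decoder_input = np.random.randn(prenet_dim)       # previous prenet output
attention_context = np.random.randn(context_dim)  # previous attention context

# Tacotron 2 style: the attention RNN input is the concatenation of both,
# so the query is informed by where attention pointed last step.
cell_input = np.concatenate([decoder_input, attention_context])
assert cell_input.shape == (prenet_dim + context_dim,)
```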
@artificertxj1 did you manage to get the Graves attention working with Flowtron?
@LeoniusChen could you provide a snippet of how you implemented DCA in NVIDIA's Tacotron 2?
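For context, the core of dynamic convolution attention (Battenberg et al., 2020) is that the energies come only from 1-D convolutions of the previous alignment with static (learned) and dynamic (query-predicted) filters, with no content term, so attention stays purely location-based. A rough numpy sketch of one step, with random stand-ins for the learned filters and projection vectors; this is not the NVIDIA Tacotron 2 integration the question asks about:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dca_step(prev_align, static_filters, dynamic_filters, v_static, v_dynamic):
    """One DCA-style step: convolve the previous alignment with each
    1-D filter, project the feature maps to scalar energies, softmax."""
    def conv_feats(filters):
        # (K, T): each row is prev_align convolved with one filter
        return np.stack([np.convolve(prev_align, f, mode="same")
                         for f in filters])
    e = v_static @ conv_feats(static_filters) + v_dynamic @ conv_feats(dynamic_filters)
    return softmax(e)

rng = np.random.default_rng(0)
T, K, width = 10, 3, 5
prev = np.zeros(T); prev[2] = 1.0  # previously attended position 2
weights = dca_step(prev,
                   rng.standard_normal((K, width)),  # static filters (learned in practice)
                   rng.standard_normal((K, width)),  # dynamic filters (predicted from the query in practice)
                   rng.standard_normal(K),
                   rng.standard_normal(K))
```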
> I don't think there is an error. There is only a misleading comment, `# select generator parameters`. I was thinking about the method descriptions, e.g., the one on the `get_optimizer()` method:...
The same error appears with:

```
gyp ERR! node -v v9.5.0
gyp ERR! node-gyp -v v3.6.2
```

While browsing for solutions, I read that realm (which is a dependency of this package)...
The `libED.so` file is compiled for Linux. You should recompile it if you want to use it on Windows.
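One way to make such native-library loading portable is to pick the filename by platform. A minimal sketch using `ctypes`; only `libED.so` is known from the source, and the Windows/macOS names are hypothetical, depending on how you recompile:

```python
import platform

def ed_library_name(system=None):
    """Map the OS to the shared-library filename we expect to load."""
    system = system or platform.system()
    names = {
        "Linux": "libED.so",
        "Windows": "ED.dll",      # hypothetical: a Windows rebuild would be a DLL
        "Darwin": "libED.dylib",  # hypothetical: a macOS rebuild would be a dylib
    }
    try:
        return names[system]
    except KeyError:
        raise OSError(f"no ED build known for {system}") from None

# Loading would then look like:
#   import ctypes
#   lib = ctypes.CDLL(ed_library_name())
```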
> Hey, thanks for replying! > > I did take most of the code from that repo. I am trying to debug why my alignment curve looks like this: ...