Dual-Path-Transformer-Network-PyTorch
Unofficial implementation of Dual-Path Transformer Network (DPTNet) for speech separation (Interspeech 2020)
Training issue
 I'm currently debugging the code with only 4–5 samples. Why is the loss printed during training 0? Is it because I'm not using the full dataset?
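Speech-separation models like DPTNet are commonly trained with negative SI-SNR as the loss, so a printed value of exactly 0 usually points to a logging or averaging bug rather than a perfect model. A minimal SI-SNR sketch (my own hedged re-implementation, not the repository's exact loss function) can be used to sanity-check the loss on a couple of samples:

```python
import torch

def si_snr(est, ref, eps=1e-8):
    """Scale-invariant SNR in dB; higher is better. Training loss is usually -si_snr."""
    # Zero-mean both signals along the time axis.
    est = est - est.mean(dim=-1, keepdim=True)
    ref = ref - ref.mean(dim=-1, keepdim=True)
    # Project the estimate onto the reference to get the scaled target.
    proj = (est * ref).sum(-1, keepdim=True) * ref / (ref.pow(2).sum(-1, keepdim=True) + eps)
    noise = est - proj
    return 10 * torch.log10(proj.pow(2).sum(-1) / (noise.pow(2).sum(-1) + eps) + eps)

# Feeding the reference as its own estimate should give a very large SI-SNR,
# i.e. a strongly negative loss -- if your pipeline prints 0 here, the bug is
# in the loss computation or logging, not in the amount of training data.
x = torch.randn(1000)
print(si_snr(x, x).item())
```

Running this with a few of your own waveform pairs should quickly show whether the loss itself behaves sensibly.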
Hi, thanks for sharing your code. According to your code, the separation layer of DPTNet has 2,800,769 (2.8M) parameters, and my hyperparameters are shown below: ``` N=64 #...
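Parameter counts like the 2.8M figure above are easy to verify in PyTorch with `sum(p.numel() for p in module.parameters())`. A small self-contained sketch (the tiny model here is illustrative, not the DPTNet separation layer):

```python
import torch.nn as nn

def count_params(module: nn.Module) -> int:
    """Total number of trainable parameters in a module."""
    return sum(p.numel() for p in module.parameters() if p.requires_grad)

# Illustrative example: a Linear(8 -> 4) layer has 8*4 weights + 4 biases = 36.
tiny = nn.Linear(8, 4)
print(count_params(tiny))  # → 36
```

Applying `count_params` to just the separation sub-module (rather than the whole network) makes it easy to check which hyperparameters drive the discrepancy.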
Your 'data.py' differs slightly from the official code; I think your reported result was actually obtained with L=2. My validation loss with L=4 is only -19.004.