Dual-Path-Transformer-Network-PyTorch

Unofficial implementation of Dual-Path Transformer Network (DPTNet) for speech separation (Interspeech 2020)

Issues (5)

![image](https://github.com/yoonsanghyu/Dual-Path-Transformer-Network-PyTorch/assets/61000862/d210c7a1-2df4-4150-a8bb-87686443537c) So far I have only used 4-5 data samples to debug the code. Why is the loss printed during the training stage 0? Is it because I am not using the full dataset?

Hi, thanks for sharing your code. According to your code, the number of parameters in the separation layer of DPTNet is 2,800,769 (2.8M), and my hyperparameters are shown below: ``` N=64 #...
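For reproducing a parameter count like the one quoted above, a standard approach in PyTorch is to sum `numel()` over trainable parameters. A minimal sketch (the `count_parameters` helper and the toy `Conv1d` module below are hypothetical illustrations, not the DPTNet separation layer itself):

```python
import torch.nn as nn

def count_parameters(module: nn.Module) -> int:
    """Sum the element counts of all trainable parameters in a module."""
    return sum(p.numel() for p in module.parameters() if p.requires_grad)

# Toy stand-in for an encoder block (hypothetical, not DPTNet's actual layer):
# Conv1d(1 -> 64, kernel_size=2, no bias) has a weight of shape (64, 1, 2).
toy = nn.Sequential(
    nn.Conv1d(1, 64, kernel_size=2, stride=1, bias=False),
    nn.ReLU(),
)

print(count_parameters(toy))  # 64 * 1 * 2 = 128
```

Applying the same helper to the separation module of the repository's model would let you verify the 2.8M figure under your own hyperparameter settings.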

Your 'data.py' differs slightly from the official code. I think your reported result actually corresponds to L=2; my validation loss under L=4 is only -19.004.