DDP backend is not supported
Thanks for your great code.
Environment info
- Python version: 3.8
- PyTorch version (GPU?): 1.6.0 (Tesla V100)
- Using GPU in script?: Yes
- Version : 0.3.0
Information
I am using Conformer with a transducer.
Hoping for support for the DDP backend, which is much more efficient than DP.
I am also very interested in this feature. Do you guys think it is very hard to deal with?
@hasangchun We'll take a look over the weekend.
Can you add the pytorch-lightning DDP option? It shouldn't be hard. @hasangchun
Thanks!
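For reference, a minimal sketch of what switching on DDP on the Lightning side usually looks like. This is an assumption about the setup, not this repo's actual training script, and the exact flag name depends on the pytorch-lightning release in use:

```python
# Hypothetical sketch: enabling DDP in a pytorch-lightning Trainer.
# Flag names vary by version: PL 1.x releases use accelerator="ddp"
# (earlier 0.x/1.0 releases used distributed_backend="ddp"),
# while newer releases use strategy="ddp".
import pytorch_lightning as pl

trainer = pl.Trainer(gpus=2, accelerator="ddp")  # PL 1.x style
trainer.fit(model)  # `model` is the user's LightningModule
```

With this flag Lightning spawns one process per GPU and wraps the model in DistributedDataParallel; the remaining work is making the custom data samplers rank-aware, which is the harder part discussed in this thread.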
The issue for me is supporting DDP with the various data samplers, which is not easy. Directly changing the pytorch-lightning hyperparameter is straightforward, but it requires modifications to the data samplers.
How can the smart-batch/bucket sampler be supported in DDP mode?
I am not familiar with distributed samplers, so any help is welcome :)
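One common approach is to keep the bucketing logic as-is and shard the resulting batches round-robin across ranks, the same way `torch.utils.data.DistributedSampler` shards indices. A minimal sketch of that idea (the class name is mine; a real implementation would subclass `torch.utils.data.Sampler` and read rank/world size from `torch.distributed`, but they are plain arguments here so the sharding logic stands on its own):

```python
class DistributedBucketSampler:
    """Sketch: shard pre-bucketed batches across DDP ranks.

    `batches` is the list of batches produced by an existing
    bucket/smart-batch sampler, e.g. lists of dataset indices
    grouped by similar sequence length.
    """

    def __init__(self, batches, num_replicas, rank):
        # Drop trailing batches so every rank runs the same number of
        # steps; otherwise DDP hangs waiting on the rank with extras.
        usable = (len(batches) // num_replicas) * num_replicas
        # Rank r takes batches r, r + num_replicas, r + 2*num_replicas, ...
        self.batches = batches[rank:usable:num_replicas]

    def __iter__(self):
        return iter(self.batches)

    def __len__(self):
        return len(self.batches)


# Example: 7 length-sorted batches sharded over 2 ranks (last one dropped).
buckets = [[0, 1], [2, 3], [4, 5], [6, 7], [8, 9], [10, 11], [12, 13]]
rank0 = DistributedBucketSampler(buckets, num_replicas=2, rank=0)
rank1 = DistributedBucketSampler(buckets, num_replicas=2, rank=1)
```

Because the bucketing happens before sharding, every rank still sees length-homogeneous batches; a per-epoch shuffle of `batches` (seeded identically on all ranks) can be added on top.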
Any updates on this topic? Otherwise I might take a look tomorrow.
Sorry, I was busy 😭