
Multiple GPU training

Open UCASHurui opened this issue 2 years ago • 0 comments

Hi there, thanks for the excellent work! I am training a large backbone on your EPCDepth network, which takes a long time, and I am wondering whether training can be accelerated with multiple GPUs. I tried using torch.distributed but it failed for some reason. Have you tried multi-GPU training? I really appreciate any help you can provide.
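For context, here is a minimal sketch of the standard PyTorch DistributedDataParallel (DDP) setup I was attempting to adapt. The model is a placeholder `nn.Linear`, not EPCDepth's actual network, and the dataset is synthetic; it falls back to a single `gloo` process when no GPU or `torchrun` environment is present:

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset, DistributedSampler


def main() -> float:
    # torchrun sets RANK/WORLD_SIZE; default to a single process for illustration.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    rank = int(os.environ.get("RANK", 0))
    world_size = int(os.environ.get("WORLD_SIZE", 1))
    backend = "nccl" if torch.cuda.is_available() else "gloo"
    dist.init_process_group(backend, rank=rank, world_size=world_size)

    use_cuda = torch.cuda.is_available()
    device = torch.device(f"cuda:{rank}" if use_cuda else "cpu")

    # Placeholder model standing in for the EPCDepth network.
    model = torch.nn.Linear(10, 1).to(device)
    model = DDP(model, device_ids=[rank] if use_cuda else None)

    # Synthetic dataset; DistributedSampler shards it across ranks.
    dataset = TensorDataset(torch.randn(64, 10), torch.randn(64, 1))
    sampler = DistributedSampler(dataset, num_replicas=world_size, rank=rank)
    loader = DataLoader(dataset, batch_size=8, sampler=sampler)

    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss = torch.tensor(0.0)
    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for x, y in loader:
            opt.zero_grad()
            loss = torch.nn.functional.mse_loss(model(x.to(device)), y.to(device))
            loss.backward()  # DDP all-reduces gradients across ranks here
            opt.step()

    dist.destroy_process_group()
    return loss.item()


if __name__ == "__main__":
    main()
```

With multiple GPUs this would be launched as `torchrun --nproc_per_node=NUM_GPUS script.py`; each process then trains on its own shard while DDP synchronizes gradients during `backward()`.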

UCASHurui · May 16 '23 02:05