block.bootstrap.pytorch

GPU memory

Open · dimvarsamis opened this issue 4 years ago · 0 comments

Hello, and thanks for this very nice work. I want to ask whether the code could be refactored to use Distributed Data Parallel instead of Data Parallel, or if you have any implementation tips so I can do it myself.

Thanks.

dimvarsamis avatar Jun 28 '21 08:06 dimvarsamis
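
For anyone landing on this issue, below is a minimal sketch of what the `nn.DataParallel` to `DistributedDataParallel` switch generally looks like in PyTorch. This is not the block.bootstrap.pytorch engine itself; the model, dataset, and training loop are placeholders standing in for the project's actual wiring, and the script assumes a single-node multi-GPU setup launched with `torchrun`.

```python
# Minimal DataParallel -> DistributedDataParallel sketch.
# Launch with: torchrun --nproc_per_node=<NUM_GPUS> train_ddp.py
# The model, dataset, and loop below are placeholders, not the
# block.bootstrap.pytorch code.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def main():
    # torchrun sets LOCAL_RANK, RANK, and WORLD_SIZE for each process.
    local_rank = int(os.environ["LOCAL_RANK"])
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(local_rank)

    # Placeholder model; each process holds its own replica on one GPU.
    model = torch.nn.Linear(128, 10).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # Placeholder dataset; DistributedSampler shards it across processes,
    # which replaces DataParallel's per-batch scatter.
    dataset = TensorDataset(torch.randn(1024, 128),
                            torch.randint(0, 10, (1024,)))
    sampler = DistributedSampler(dataset)
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    criterion = torch.nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards differently each epoch
        for x, y in loader:
            x, y = x.cuda(local_rank), y.cuda(local_rank)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()  # gradients are all-reduced across processes here
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

The key differences from `DataParallel` are that each GPU runs in its own process (so no Python GIL bottleneck on the scatter/gather), the `DistributedSampler` gives each process a disjoint shard of the data, and gradients are synchronized via all-reduce inside `backward()`. Adapting this to an existing codebase mostly means moving the launch to `torchrun`, wrapping the model in `DDP` instead of `DataParallel`, and attaching a `DistributedSampler` to each data loader.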