gpl
Training on multiple GPUs
Is there a way to train a GPL model with multiple GPUs? If so, can that help with training on larger batches?
The easiest solution would be to wrap the model in nn.DataParallel, although it might not be the most efficient approach. This will definitely help with training on larger batches. I will give it a try and integrate this feature if possible in the near future.
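A minimal sketch of the nn.DataParallel approach mentioned above. The toy model here is a hypothetical stand-in, not GPL's actual architecture; the point is only how DataParallel splits a batch across visible GPUs so a larger total batch fits in memory:

```python
import torch
import torch.nn as nn

# Hypothetical small encoder standing in for the GPL training model.
model = nn.Sequential(nn.Linear(768, 256), nn.ReLU(), nn.Linear(256, 1))

# nn.DataParallel splits each input batch along dim 0 across all visible
# GPUs, runs the replicas in parallel, and gathers outputs on the first
# device. With N GPUs, a batch of size B is split into chunks of ~B/N,
# so the batch size you can fit grows roughly linearly with N.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
if torch.cuda.is_available():
    model = model.cuda()

batch = torch.randn(64, 768)  # a batch larger than one GPU might hold
if torch.cuda.is_available():
    batch = batch.cuda()

scores = model(batch)
print(scores.shape)  # torch.Size([64, 1])
```

Note that gradients are accumulated back on the first GPU, which can become a memory bottleneck; torch.nn.parallel.DistributedDataParallel avoids this and is generally faster, at the cost of a more involved launch setup.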