How to specify the device
Hi, I am using the model for inference. I have several GPUs and would like to split the images across them for inference. How do I move the restorer to a different GPU? `restorer.to(device)` does not work.
@Crestina2001 In PyTorch, try `torch.nn.DataParallel` or `torch.nn.parallel.DistributedDataParallel`. With `DataParallel`, the model is replicated across all available GPUs and each GPU processes a portion of the input batch. Wrap the model like this: `restorer = nn.DataParallel(restorer)`. The per-GPU results are then gathered (concatenated along the batch dimension) automatically on the output device.
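A minimal sketch of the suggestion above. The `nn.Sequential` model here is a hypothetical stand-in for the restorer, since the actual model class isn't shown in the thread; any `nn.Module` is wrapped the same way:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the restorer; any nn.Module works the same way.
restorer = nn.Sequential(nn.Conv2d(3, 3, kernel_size=3, padding=1))

if torch.cuda.is_available():
    # Replicate the model across all visible GPUs. Each forward pass
    # splits the input along the batch dimension, runs the chunks in
    # parallel, and gathers the outputs back on GPU 0.
    restorer = nn.DataParallel(restorer).cuda()

# A batch of images (N, C, H, W); DataParallel scatters along dim 0.
images = torch.randn(8, 3, 64, 64)
if torch.cuda.is_available():
    images = images.cuda()

with torch.no_grad():
    output = restorer(images)

print(output.shape)  # torch.Size([8, 3, 64, 64]) — full batch gathered back
```

Note that the gather happens inside `DataParallel`, so no manual concatenation of the results is needed.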
What about specifying a GPU other than 0? I am trying to get it to use GPU 1, but my attempts have failed.
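Two standard ways to pin PyTorch to a specific GPU, sketched below (the `nn.Sequential` model is again a hypothetical stand-in for the restorer): restrict which devices are visible via the `CUDA_VISIBLE_DEVICES` environment variable, or address the GPU by index with `torch.device`:

```python
import torch
import torch.nn as nn

# Option 1: restrict visibility before the process initializes CUDA:
#   CUDA_VISIBLE_DEVICES=1 python infer.py
# Inside that process, physical GPU 1 then appears as "cuda:0".

# Option 2: address GPU 1 explicitly by index
# (falls back to CPU here so the sketch also runs without a second GPU).
device = torch.device("cuda:1" if torch.cuda.device_count() > 1 else "cpu")

# Hypothetical stand-in model; .to(device) works on any nn.Module.
restorer = nn.Sequential(nn.Conv2d(3, 3, kernel_size=3, padding=1)).to(device)

# Inputs must live on the same device as the model.
images = torch.randn(4, 3, 64, 64).to(device)

with torch.no_grad():
    output = restorer(images)

print(output.device)
```

If `DataParallel` should use only a subset of GPUs, that can be passed as `nn.DataParallel(restorer, device_ids=[1, 2])`; the model must first be moved to the first listed device (`cuda:1` here), which also becomes the output device by default.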