simclr
Is training on multiple GPUs possible?
In the environment setup it's stated that "Our code can also run on a single GPU. It does not support multi-GPUs, for reasons such as global BatchNorm and contrastive loss across cores." However, the comment here clearly says it is for multiple GPUs. These two statements seem to contradict each other.
Can the existing tf2 (pre)training code work on multiple GPUs?
It should in principle be able to train on multiple GPUs (using the TF2 code with MirroredStrategy), but we may not have tested it.
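For reference, a minimal sketch of how multi-GPU training can be set up with `tf.distribute.MirroredStrategy`; the toy model and data here are placeholders, not the SimCLR codebase's own training loop:

```python
import tensorflow as tf

# MirroredStrategy replicates the model on every visible GPU and
# all-reduces gradients across replicas after each step. On a
# CPU-only or single-GPU machine it falls back to one replica.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

# Variables must be created under the strategy scope so each
# replica gets a mirrored copy.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(32,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

# `fit` splits the global batch evenly across replicas.
x = tf.random.normal((256, 32))
y = tf.random.uniform((256,), maxval=10, dtype=tf.int32)
model.fit(x, y, batch_size=64, epochs=1, verbose=0)
```

Note the caveat from the setup docs still applies: with MirroredStrategy, BatchNorm statistics and a contrastive loss computed per replica differ from their global (cross-replica) versions, so results may not exactly match single-accelerator training.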
It works on multiple GPUs, thanks.