
chexpert config file (BYOL)

Mehrdad-Noori opened this issue 4 years ago · 0 comments

Hi, thanks for the great work!

I have some questions about training BYOL on chexpert dataset.

  1. Why is the number of iterations multiplied by 2 in the config file? I ask because, based on the code, there is another multiplication by 2 due to the 'update_interval', which is 2 for BYOL. For example, to train for 50k iterations, the total number of iterations will be 50k x 2 x 2 (update_interval) = 200k.

  2. What is the default number of GPUs for training BYOL? Is it 4, with a batch size of 128 (imgs_per_gpu=32)?

  3. Why is 'lr' set to '4.8/16'? Is it related to the number of GPUs?
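
To make the arithmetic in question 1 concrete, here is a small sketch; the two factors of 2 are my assumptions from reading the config file and the 'update_interval' handling in the code, so please correct me if I have misread either one:

```python
# Sketch of the effective iteration count I arrive at (assumptions noted below).
config_iters = 50_000       # iterations I actually want to train for
config_multiplier = 2       # factor of 2 I see applied in the config file (assumption)
update_interval = 2         # BYOL's update_interval in the code (assumption)

# Total forward/backward passes implied by the two factors of 2.
total_passes = config_iters * config_multiplier * update_interval
print(total_passes)  # 200000
```

If both factors are intentional, it seems a 50k-iteration run actually performs 200k passes, which is what prompted the question.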

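Regarding question 3, my guess is that '4.8/16' comes from the linear learning-rate scaling rule (lr = base_lr x batch / base_batch). This is only a hypothesis: I am assuming a reference recipe with lr=4.8 at a total batch size of 4096, in which case a total batch of 256 gives exactly 4.8/16:

```python
# Hypothetical sketch of linear LR scaling; base_lr=4.8 at batch 4096
# is my assumption about the reference recipe, not taken from this repo.
def scaled_lr(base_lr: float, base_batch: int, batch: int) -> float:
    """Scale the learning rate linearly with the total batch size."""
    return base_lr * batch / base_batch

print(scaled_lr(4.8, 4096, 256))  # 0.3, i.e. 4.8 / 16
```

If that reading is right, the lr depends on the total batch size (imgs_per_gpu x number of GPUs) rather than on the GPU count directly, which is why I asked about the default GPU configuration.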
Mehrdad-Noori · Apr 02 '21 17:04