SupContrast
Loss saturates
I tried training SimCLR on CIFAR10 with the exact config provided in the docs, except with a batch size of 128 and without SyncBN, since I'm running it on Colab.
The loss saturates at around 28.xxx after about 25 epochs. I've tried adjusting the learning rate, but it made no difference. Is this because of the smaller batch size? Could you please help me with this?
You can just let it run and check the performance once training is done. Typically the loss looks saturated, but the linear-evaluation performance will keep increasing.
Or you can add an online linear classifier to monitor the performance.
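To illustrate the suggestion above, here is a minimal sketch (not code from this repo) of an online linear probe: a linear classifier trained on *detached* encoder features in the same loop as the contrastive loss, so it tracks representation quality without influencing the encoder. The encoder, class count, and toy data are placeholder assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical toy encoder standing in for the SimCLR backbone.
encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 16))
# Online linear probe; 10 classes assumed (as in CIFAR10).
probe = nn.Linear(16, 10)

enc_opt = torch.optim.SGD(encoder.parameters(), lr=0.1)
probe_opt = torch.optim.SGD(probe.parameters(), lr=0.1)

# Stand-in batch of features and labels.
x = torch.randn(128, 32)
y = torch.randint(0, 10, (128,))

feats = encoder(x)

# Placeholder for the contrastive objective (SupConLoss / NT-Xent in practice).
contrastive_loss = feats.pow(2).mean()

# The probe sees detached features, so its gradient never reaches the encoder.
probe_loss = F.cross_entropy(probe(feats.detach()), y)

enc_opt.zero_grad()
probe_opt.zero_grad()
(contrastive_loss + probe_loss).backward()
enc_opt.step()
probe_opt.step()

# Probe accuracy is the monitoring signal logged each epoch.
probe_acc = (probe(encoder(x).detach()).argmax(dim=1) == y).float().mean()
```

Logging `probe_acc` over epochs gives a proxy for linear-evaluation accuracy, so you can see the representation improving even while the contrastive loss appears flat.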