SupContrast

Loss saturates

Open tarunn2799 opened this issue 5 years ago • 1 comment

I tried training SimCLR on CIFAR-10 with the exact config you've provided in the docs, except with a batch size of 128 and without SyncBN, since I'm running it on Colab.

The loss saturates at 28.xxx after about 25 epochs. I've tried adjusting the learning rate, but it made no difference. Is this because of the smaller batch size? Can you please help me with this?

tarunn2799 · Oct 22 '20

You can just let it run and check the performance once training is done. Typically the loss looks saturated, but the linear-evaluation performance will keep increasing.

Or you can add an online linear classifier to monitor the performance.
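For concreteness, here is a minimal sketch of what such an online probe could look like in PyTorch: a single linear layer trained on detached encoder features alongside the contrastive objective, so classification accuracy can be logged each epoch without a separate linear-evaluation run. The feature dimension, class count, and the assumption that the model exposes its backbone as `model.encoder` are illustrative, not taken from this repo's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sizes: set feat_dim to your encoder's output dimension and
# num_classes to your dataset (10 for CIFAR-10).
feat_dim, num_classes = 512, 10
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# The probe is a single linear layer with its own optimizer; it never
# sends gradients back into the encoder.
probe = nn.Linear(feat_dim, num_classes).to(device)
probe_opt = torch.optim.SGD(probe.parameters(), lr=0.1, momentum=0.9)


def probe_step(encoder, images, labels):
    """One online-probe update on detached encoder features."""
    with torch.no_grad():
        feats = encoder(images)          # (B, feat_dim), no encoder gradients
    logits = probe(feats)
    loss = F.cross_entropy(logits, labels)

    probe_opt.zero_grad()
    loss.backward()
    probe_opt.step()

    acc = (logits.argmax(dim=1) == labels).float().mean().item()
    return loss.item(), acc
```

Calling `probe_step(model.encoder, images, labels)` once per contrastive iteration and averaging the returned accuracy over an epoch gives a cheap progress signal: if the probe accuracy keeps climbing while the contrastive loss looks flat, the representation is still improving.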

HobbitLong · Dec 15 '20