Supervised contrastive loss saturation from the beginning on custom data
Hi, thank you for the great work.
I am trying to run the supervised contrastive loss on my custom dataset of images. The loss has been stuck at 6.920 from the very beginning, and 30 epochs are already done. The batch size is 64 due to GPU memory constraints. Can you tell me why this might be the case? The dataset has around 17,000 samples and 6 classes (somewhat imbalanced), and I am only on training stage 1 for now, but the loss started at 6.920 and has stayed there. Any help would be appreciated.
Hi, I encountered the same problem. Did you solve it? @zaid478
No, I am still facing this. In fact, I removed the minority classes and tried to train on 3 classes (~3000 samples per class), but the loss only reduced to 5.6 after 800 epochs. The embeddings learned are also very bad.
Any help is appreciated.
When training, please turn on the "--warm" argument or set a larger learning rate (--learning_rate 0.5). That seems to solve this problem; at least it worked for me.
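For context, "--warm" enables a learning-rate warm-up at the start of training. Below is a minimal sketch of what such a schedule typically looks like; the variable names and the 10-epoch linear ramp are illustrative assumptions, not this repo's exact implementation:

```python
# Minimal sketch of a linear learning-rate warm-up: ramp the LR from a small
# value up to the target LR over the first few epochs so the contrastive loss
# does not saturate right away. Names like warm_epochs/warmup_from are
# illustrative, not necessarily this repo's variables.
import torch

model = torch.nn.Linear(128, 128)        # stand-in for the real encoder
base_lr = 0.5                            # target LR (e.g. --learning_rate 0.5)
warmup_from, warm_epochs = 0.01, 10      # start small, ramp over 10 epochs
optimizer = torch.optim.SGD(model.parameters(), lr=warmup_from, momentum=0.9)

def warmup_lr(epoch):
    """Return the LR for this epoch: linear ramp, then the base LR."""
    if epoch < warm_epochs:
        return warmup_from + (base_lr - warmup_from) * epoch / warm_epochs
    return base_lr

for epoch in range(20):
    for group in optimizer.param_groups:
        group["lr"] = warmup_lr(epoch)
    # ... run the usual training loop for this epoch ...
```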
Hi. I use this repo for my project. I need the accuracy for each epoch and a validation step. Have you tried any code to get these metrics?
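In case it helps, here is a minimal sketch of computing per-epoch validation accuracy with plain PyTorch. It assumes a `model` that outputs class logits and a `val_loader` yielding (images, labels); neither name comes from this repo, and accuracy is only meaningful in the linear-evaluation stage, since stage 1 trains embeddings rather than a classifier:

```python
# Minimal sketch of per-epoch validation accuracy with plain PyTorch.
# `model` and `val_loader` are assumed to exist; this is not this repo's code.
import torch

@torch.no_grad()
def validate(model, val_loader, device="cuda"):
    model.eval()
    correct, total = 0, 0
    for images, labels in val_loader:
        images, labels = images.to(device), labels.to(device)
        logits = model(images)               # (batch, num_classes)
        preds = logits.argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.size(0)
    return 100.0 * correct / total

# Call validate(model, val_loader) at the end of every training epoch and log it.
```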