ITM Loss Stuck at 0.63
Hi, I am trying to replicate and pretrain BLIP for distillation purposes. I am using Flickr30K + COCO, and my ITM loss gets stuck at 0.63; on an initial look, all of the ITM predictions are 1. Is this a dataset-size issue or a batch issue? I've tried a smaller learning rate, increasing the size of the model, and more, but nothing seems to work.
It seems to predict 0 for all of the ITM labels no matter what.
So what is the solution to this problem? I am encountering the same issue.
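One sanity check that may explain the specific value: in the standard BLIP pretraining setup, each image-text pair contributes one matched example and two hard negatives to the ITM head, so the labels are imbalanced 1:2. A head that collapses and just predicts the class prior plateaus at the label entropy, which for a 1:2 split is ≈ 0.6365, essentially the 0.63 reported above. A minimal sketch (assuming that batch construction and a 2-way ITM head in PyTorch):

```python
import torch
import torch.nn.functional as F

# Hypothetical BLIP-style ITM batch: per pair, one matched example
# (label 1) and two hard negatives (label 0), i.e. a 1:2 label split.
bs = 64
labels = torch.cat([torch.ones(bs), torch.zeros(2 * bs)]).long()

# A collapsed head that outputs the class prior p(label=1) = 1/3 for
# every input: logits set to the log of the prior.
prior = torch.tensor([2 / 3, 1 / 3])
logits = prior.log().expand(3 * bs, 2)

loss = F.cross_entropy(logits, labels)
preds = logits.argmax(dim=-1)

print(f"loss = {loss.item():.4f}")                    # ~0.6365
print(f"fraction predicted 1 = {preds.float().mean().item():.2f}")  # 0.00
```

If your logged loss sits at this value while every prediction is the majority class, the head is ignoring its input, so it is worth checking that the hard-negative mining actually produces informative negatives and that the positive/negative labels line up with the concatenation order of the ITM inputs.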