
Unexpected increase of the consistency loss

Open lixiang007666 opened this issue 2 years ago • 1 comments

During training, the consistency loss keeps increasing while the other losses keep decreasing. Is this normal? Is it what we should expect?

            # Ramp up the consistency weight over the course of training
            consistency_weight = get_current_consistency_weight(iter_num // len(loader_train_s), max_epoch)
            # Per-pixel distance between the student predictions on the unlabeled
            # part of the batch and the EMA teacher outputs; shape (batch, 3, 256, 256)
            consistency_dist = consistency_criterion(predout_t[train_params['labeled_bs']:], ema_output)
            consistency_dist = torch.mean(consistency_dist)
            consistency_loss = consistency_dist * consistency_weight
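One likely explanation: in most Mean Teacher style implementations, `get_current_consistency_weight` ramps up from near 0 to its maximum over training, so the *weighted* consistency loss can grow even when the raw distance `consistency_dist` is flat or shrinking. Below is a minimal sketch of such a schedule, assuming the common sigmoid ramp-up (the function body and the `max_weight` default are assumptions for illustration, not necessarily what this repo uses):

```python
import math

def sigmoid_rampup(current, rampup_length):
    # Exponential/sigmoid-shaped ramp-up; returns a value in [0, 1]
    # that rises smoothly toward 1 as `current` approaches `rampup_length`.
    if rampup_length == 0:
        return 1.0
    current = min(max(current, 0.0), rampup_length)
    phase = 1.0 - current / rampup_length
    return math.exp(-5.0 * phase * phase)

def get_current_consistency_weight(epoch, max_epoch, max_weight=0.1):
    # NOTE: max_weight=0.1 is a hypothetical default; check the actual
    # hyperparameters in the training config.
    return max_weight * sigmoid_rampup(epoch, max_epoch)
```

If your weight follows a schedule like this, check whether `consistency_dist` alone (before multiplying by the weight) also increases; if only the weighted product grows, that is usually expected behavior rather than a bug.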


lixiang007666 avatar Apr 12 '23 12:04 lixiang007666

I've encountered the same issue where the consistency loss increases while other losses decrease during training. Could anyone provide some insights or suggestions on this?

hkxxxxx avatar Jan 20 '24 11:01 hkxxxxx