SupContrast
PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally)
Hey, I am about to use the [SupConLoss](https://github.com/HobbitLong/SupContrast/blob/331aab5921c1def3395918c05b214320bef54815/losses.py#L11) for my specific application: I am embedding some graphs and would like the embeddings of similar graphs to get...
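For this graph use case, a minimal sketch of how the loss can be fed. The random tensor stands in for a real graph encoder's output, and the single-view reshape and label layout are assumptions, not the repo's training code:

```python
import torch
import torch.nn.functional as F
from losses import SupConLoss  # the loss class from this repo

criterion = SupConLoss(temperature=0.07)

# stand-in for a graph encoder's output: 8 graphs, 128-d embeddings
embeddings = torch.randn(8, 128, requires_grad=True)
labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])  # similar graphs share a label

feats = F.normalize(embeddings, dim=1)  # the loss expects unit-norm features
feats = feats.unsqueeze(1)              # [bsz, n_views, dim] with n_views = 1
loss = criterion(feats, labels)
```

With a single view per graph, all positives come from other same-label samples in the batch, so every label needs at least two samples per batch or the loss can degenerate (see the NaN issue below).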
In the paper it is described that L2 normalization is performed on the output of the encoder as well as the output of the projection head. However, in the code...
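For context, a sketch of what normalizing at both points would look like; the module names and sizes here are illustrative stand-ins, not the repo's actual network code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProjectionModel(nn.Module):
    """Illustrative encoder + projection head, not the repo's classes."""
    def __init__(self, dim_in=2048, dim_hidden=2048, dim_out=128):
        super().__init__()
        self.encoder = nn.Linear(dim_in, dim_in)  # stand-in for a real backbone
        self.head = nn.Sequential(
            nn.Linear(dim_in, dim_hidden),
            nn.ReLU(inplace=True),
            nn.Linear(dim_hidden, dim_out),
        )

    def forward(self, x):
        # paper: L2-normalize the encoder output ...
        r = F.normalize(self.encoder(x), dim=1)
        # ... and the projection-head output that the loss consumes
        z = F.normalize(self.head(r), dim=1)
        return z
```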
In the code, `mask.sum(1)` contains a 0, so the loss is NaN. Is there any problem with my code? Thank you for your answer.
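The NaN typically appears when an anchor has no positives in the batch, so the denominator `mask.sum(1)` is zero. A self-contained illustration with one possible guard; the clamp is a common workaround, not the repo's code:

```python
import torch

# toy positive mask for 5 anchors; anchor 2 has no positives in the batch
mask = torch.tensor([[0., 1., 0., 0., 0.],
                     [1., 0., 0., 0., 0.],
                     [0., 0., 0., 0., 0.],   # row sums to 0 -> division by zero
                     [0., 0., 0., 0., 1.],
                     [0., 0., 0., 1., 0.]])
log_prob = torch.randn(5, 5)  # stand-in for the per-pair log-probabilities

naive = (mask * log_prob).sum(1) / mask.sum(1)               # element 2 is NaN
safe  = (mask * log_prob).sum(1) / mask.sum(1).clamp(min=1)  # element 2 is 0
print(naive)
print(safe)
```

In practice this means every label in a batch should appear at least twice (per view); the clamp only silences anchors that have no positives instead of poisoning the whole loss.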
Hi @HobbitLong. Thank you for the great work. I have one question about your implementation of SupConLoss as I am comparing it with your paper. In the supplementary of...
Hi, I am trying to reproduce the results. May I get the hyperparameters for the ImageNet experiment?
Hi, can you release the MoCo code you used for ImageNet? Thanks.
The purpose of contrastive loss is to minimize the distance between positive samples while maximizing the distance between negative samples. However, I only find the minimization of the positive-sample distance in this...
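For reference, the supervised contrastive loss as given in the paper (restated here, so treat the notation as a paraphrase):

```math
\mathcal{L}^{sup} = \sum_{i \in I} \frac{-1}{|P(i)|} \sum_{p \in P(i)} \log \frac{\exp(z_i \cdot z_p / \tau)}{\sum_{a \in A(i)} \exp(z_i \cdot z_a / \tau)}
```

Here $P(i)$ is the set of positives for anchor $i$ and $A(i)$ is every other sample in the batch. The negatives sit in the denominator, so minimizing the loss pulls positives together and pushes negatives apart at the same time; there is no separate "maximize negative distance" term.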
Hello, thanks for this great work. I am trying to train it on my own dataset. Because of the image size (3x224x224), I used batch_size 256, and here is my training loss:...
I got confused with:

```python
# compute logits
anchor_dot_contrast = torch.div(
    torch.matmul(anchor_feature, contrast_feature.T),
    self.temperature)
# for numerical stability
logits_max, _ = torch.max(anchor_dot_contrast, dim=1, keepdim=True)
logits = anchor_dot_contrast - logits_max.detach()
```
...
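One way to see why the subtraction is harmless: softmax is invariant to a constant shift per row, so subtracting the row-wise max only prevents `exp` overflow without changing the result. A small self-contained check:

```python
import torch

logits = torch.tensor([[100.0, 102.0, 98.0]])  # large values can overflow exp()
shifted = logits - logits.max(dim=1, keepdim=True).values

# the shift cancels between numerator and denominator of softmax,
# so both rows produce identical probabilities
print(torch.softmax(logits, dim=1))
print(torch.softmax(shifted, dim=1))
```

The `.detach()` in the repo's snippet additionally keeps the shift constant out of the backward graph, so gradients are computed as if no shift had been applied.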
Hi, I appreciate your great work! But I am a bit confused about the loss function. Suppose we have a minibatch of data: A, A1, A2, A3, B, C, D, E, where the Ai belong to one class, and B, C,...
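For a batch like the one described, SupConLoss derives the positives purely from the labels. A sketch with assumed label values (0 for the four A-samples, distinct labels for the rest):

```python
import torch

# hypothetical labels for [A, A1, A2, A3, B, C, D, E]
labels = torch.tensor([0, 0, 0, 0, 1, 2, 3, 4]).view(-1, 1)

# the loss marks samples i, j as positives when their labels match
# (self-pairs are masked out later inside the loss)
mask = torch.eq(labels, labels.T).float()
print(mask[0])      # tensor([1., 1., 1., 1., 0., 0., 0., 0.])
print(mask.sum(1))  # the singleton classes B..E only match themselves
```

Note that B through E have no positives besides themselves; once self-pairs are removed, such anchors have zero positives, which connects back to the NaN issue above.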