LearningToCompare_FSL
PyTorch code for CVPR 2018 paper: Learning to Compare: Relation Network for Few-Shot Learning (Few-Shot Learning part)
```
self.train_labels = [labels[self.get_class(x)] for x in self.train_roots]
KeyError: '..\\datas\\omniglot_28x28'
```
I have no idea how to solve this problem. Can you help me? Thanks.
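A KeyError like this usually points to a path-separator mismatch on Windows: the task generator splits image paths on `'/'`, while paths assembled with `os.path.join` on Windows contain `'\'`, so the class-folder key never matches the labels dictionary. A minimal sketch of a separator-agnostic lookup (the `normalize` helper is illustrative, not part of the repo; the same normalization would have to be applied wherever the labels dictionary keys are built):

```python
def normalize(path):
    # use one separator style everywhere so dictionary keys and lookups agree
    return path.replace('\\', '/')

def get_class(sample):
    # map an image path to its class folder, e.g.
    # '../datas/omniglot_28x28/Alphabet/character01/img.png'
    #   -> '../datas/omniglot_28x28/Alphabet/character01'
    return '/'.join(normalize(sample).split('/')[:-1])
```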
I found that there is data leakage during testing, which inflates the model's accuracy. The model contains batch normalization, and the batch...
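If the leakage comes from batch-norm statistics being computed over the whole test batch, one common mitigation is to evaluate with the networks in eval mode. This is only a sketch under that assumption; `feature_encoder`, `relation_network`, `sample_images`, `test_images`, and `GPU` are the names used in the repo's test loop, and whether eval-mode accuracy still matches the numbers reported for the paper is a separate question:

```python
# put both networks in eval mode so BatchNorm uses the running statistics
# accumulated during training instead of statistics of the current test batch
feature_encoder.eval()
relation_network.eval()

sample_features = feature_encoder(Variable(sample_images).cuda(GPU))
test_features = feature_encoder(Variable(test_images).cuda(GPU))
# ... build relation pairs and compute accuracy as in the existing test loop ...

# switch back before resuming episodic training
feature_encoder.train()
relation_network.train()
```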
Changing the images to RGB doesn't solve this problem.
# English

```bash
conda create -n py27 python=2.7
conda deactivate
conda activate py27
```

```bash
pip install https://download.pytorch.org/whl/cu80/torch-0.3.0.post4-cp27-cp27mu-linux_x86_64.whl
pip install torchvision==0.2.1
pip install matplotlib scipy
```

```bash
git...
```
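A quick way to confirm the pinned build is active inside the `py27` environment (just a sanity check, nothing repo-specific):

```python
# run inside the activated py27 environment
import torch

print(torch.__version__)          # expected: 0.3.0.post4
print(torch.cuda.is_available())  # should be True if the cu80 wheel sees a GPU
```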
Does this work if training is on miniImageNet or Omniglot and testing is on a custom dataset? I wonder how it "learns to compare" in this situation. Many implementations use...
How should I train it on my own dataset?
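Regarding the two custom-dataset questions above: the episodic task generator only needs a directory with one sub-folder per class, each containing that class's images, split into disjoint meta-train and meta-test class lists. A rough sketch of preparing such a split (the `my_dataset` path and the 80/20 ratio are assumptions, not something the repo prescribes):

```python
import os
import random

# hypothetical layout: ../datas/my_dataset/<class_name>/<images of that class>
data_root = '../datas/my_dataset'
class_folders = [os.path.join(data_root, c) for c in os.listdir(data_root)
                 if os.path.isdir(os.path.join(data_root, c))]

random.seed(1)
random.shuffle(class_folders)

# few-shot splits are by class, not by image: the classes used at test time
# must be disjoint from the classes used for meta-training
num_train = int(0.8 * len(class_folders))
metatrain_folders = class_folders[:num_train]
metatest_folders = class_folders[num_train:]

print('%d train classes, %d test classes' % (len(metatrain_folders), len(metatest_folders)))
```

If I remember the code correctly, the folder-listing helpers in `task_generator.py` return exactly such a pair of lists, so these two lists can be swapped in for them; `CLASS_NUM` and `SAMPLE_NUM_PER_CLASS` keep their usual N-way K-shot meaning.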
I wanted to check whether we can use a contrastive loss here. I tried, but I am facing some errors. Can anyone confirm and help?
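On the contrastive-loss question: the released code trains the relation scores with MSE against one-hot targets, so a contrastive variant has to be defined over (query, support) pairs. One possible sketch, where the margin value and the use of `1 - relation score` as a distance are my own assumptions rather than anything from the paper:

```python
import torch
import torch.nn.functional as F

def contrastive_relation_loss(relations, one_hot_labels, margin=0.5):
    """relations: relation scores in [0, 1], flattened to shape (num_pairs,).
    one_hot_labels: 1.0 for same-class (query, support) pairs, 0.0 otherwise."""
    # treat (1 - relation score) as a distance between the pair
    dist = 1.0 - relations
    # matching pairs are pulled toward distance 0 (relation score 1)
    pos = one_hot_labels * dist.pow(2)
    # non-matching pairs are penalized only while closer than the margin
    neg = (1.0 - one_hot_labels) * F.relu(margin - dist).pow(2)
    return 0.5 * (pos + neg).mean()

# drop-in usage in place of the MSE loss, with the shapes used in the training loop:
# loss = contrastive_relation_loss(relations.view(-1), one_hot_labels.view(-1))
```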
In calculating the accuracy on the test dataset: https://github.com/floodsung/LearningToCompare_FSL/blob/master/omniglot/omniglot_train_one_shot.py#L237

```
sample_images,sample_labels = sample_dataloader.__iter__().next()
test_images,test_labels = test_dataloader.__iter__().next()
sample_features = feature_encoder(Variable(sample_images).cuda(GPU)) # 5x64
test_features = feature_encoder(Variable(test_images).cuda(GPU)) # 20x64
sample_features_ext = sample_features.unsqueeze(0).repeat(SAMPLE_NUM_PER_CLASS*CLASS_NUM,1,1,1,1)
test_features_ext = test_features.unsqueeze(0).repeat(SAMPLE_NUM_PER_CLASS*CLASS_NUM,1,1,1,1)...
```
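For context on the question (the snippet above is cut off), the accuracy computation in these scripts generally finishes by taking the arg-max relation score for each query and comparing it to the query's label. A rough sketch of that final step, where `relation_pairs` stands for the concatenated support/query features built from the two `*_ext` tensors and the other names follow the snippet:

```python
# relations: one score per (query image, support class) pair, reshaped so
# each row holds one query's scores against all CLASS_NUM support classes
relations = relation_network(relation_pairs).view(-1, CLASS_NUM)

# predicted class = support class with the highest relation score
_, predict_labels = torch.max(relations.data, 1)

# episode accuracy: fraction of queries whose prediction matches the label
rewards = [1 if predict_labels[j] == test_labels[j] else 0
           for j in range(test_labels.size(0))]
accuracy = sum(rewards) / float(len(rewards))
```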
Well, thanks for this code. I get an index "idx" out of range error for the list "self.image_roots" every time I run this code. How could I solve this problem?
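One common cause of an index-out-of-range in `self.image_roots` is a class folder that yields fewer image paths than an episode needs (support plus query images per class). A quick sanity check, assuming an Omniglot-style layout of alphabet/character folders and the one-shot script's default of 1 support + 19 query images per class (adjust `min_needed` to your own settings):

```python
import os

data_root = '../datas/omniglot_resized'  # adjust to your dataset location
min_needed = 1 + 19  # SAMPLE_NUM_PER_CLASS + BATCH_NUM_PER_CLASS, assumed defaults

for alphabet in sorted(os.listdir(data_root)):
    alphabet_path = os.path.join(data_root, alphabet)
    if not os.path.isdir(alphabet_path):
        continue
    for character in sorted(os.listdir(alphabet_path)):
        folder = os.path.join(alphabet_path, character)
        if not os.path.isdir(folder):
            continue
        images = [f for f in os.listdir(folder) if f.endswith('.png')]
        if len(images) < min_needed:
            print('only %d images in %s' % (len(images), folder))
```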