Reproduce results
Hello, I am trying to reproduce the results of SOSNet. I implemented the architecture from the GitHub repository and, training on Notre Dame, I get 1.5% FPR95 on Liberty and Yosemite instead of ~1%. So I wanted to ask you some questions about the hyperparameters used.
- For the Adam optimizer, which performed best: did you use a learning rate scheduler or weight decay?
- How much data did you use to train SOSNet on the PhotoTour dataset and HPatches?
- Did you initialize the weights of the model in a certain way?
- What kind of data augmentation are you performing? Rotation and flipping?
Thank you
Hello, would you like to share your reproduced code? I hope that we can help each other.
Thank you
@valavanisleonidas Hi, 1) I used Adam with no learning rate scheduler, with lr=1e-2 and momentum=0.9. 2) The training data is just the standard three splits of the UBC dataset, i.e. train on one and test on the other two. 3) I used nn.init.orthogonal to initialize the weights. 4) Yes.
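For anyone reproducing this, a minimal PyTorch sketch of the setup described above might look like the following. The two-layer network here is a stand-in, not the actual SOSNet architecture, and since Adam has no explicit momentum argument, I'm assuming "momentum=0.9" maps to Adam's beta_1:

```python
import torch
import torch.nn as nn

# Stand-in network (NOT the real SOSNet architecture; just for illustration).
model = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(32, 128, 3, padding=1),
)

# Orthogonal initialization for the conv weights, as mentioned above.
for m in model.modules():
    if isinstance(m, nn.Conv2d):
        nn.init.orthogonal_(m.weight)
        nn.init.zeros_(m.bias)

# Adam with lr=1e-2 and no scheduler; beta_1=0.9 plays the role of momentum.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2, betas=(0.9, 0.999))
```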
@Ahandsomenaive Hi, I am planning to share my re-implemented code of SOSNet together with a recent work in another repository (https://github.com/yuruntian/HyNet), since there were legal issues.
Sorry if I wasn't clear enough. I understand that you used the three splits of the UBC dataset, but the algorithm requires a certain number of triplets for every epoch. So how many triplets did you use to train the algorithm?
@valavanisleonidas Hi, I didn't sample triplets directly; instead, I sampled batches of matching pairs and formed triplets inside the batch. Each epoch has N batches, and within a batch, B points (two patches per point) are sampled such that NxB covers all the 3D points in the training set. Then you can construct B triplets for each batch with hard mining.
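The in-batch mining described above could be sketched roughly as follows (a NumPy toy example with a hypothetical `mine_triplets` helper; it picks, for each anchor, the closest non-matching descriptor in the batch as the hard negative):

```python
import numpy as np

def mine_triplets(desc_a, desc_p):
    """Given B anchor descriptors and their B matching (positive)
    descriptors, return the index of the hardest in-batch negative
    for each anchor, yielding B triplets (anchor, positive, negative)."""
    B = desc_a.shape[0]
    # Pairwise L2 distances between anchors and all positives: (B, B).
    d = np.linalg.norm(desc_a[:, None, :] - desc_p[None, :, :], axis=-1)
    # Mask out the true match on the diagonal, then take the closest
    # (i.e. hardest) negative for each anchor.
    d[np.arange(B), np.arange(B)] = np.inf
    return d.argmin(axis=1)

# Toy batch of B=4 points with 128-D descriptors (random stand-ins).
rng = np.random.default_rng(0)
anchors = rng.standard_normal((4, 128))
positives = anchors + 0.01 * rng.standard_normal((4, 128))
neg_idx = mine_triplets(anchors, positives)
# neg_idx[i] is the hard negative paired with anchor i; it never
# equals i itself because the diagonal was masked.
```

In practice the descriptors would come from the network's forward pass on the batch, and mining both directions (hardest negative for the anchor and for the positive) is common, but the masking-and-argmin core is the same.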