Question: which part of your code handles proxy_lr vs base_lr?
Hello, in your paper you also tune proxy_lr as a hyperparameter. I was unsure where proxy_lr is defined in your code; could you please guide me? https://github.com/euwern/proxynca_pp/blob/master/train.py
See lines 307 (base_lr) and 311 (proxy_lr) in train.py. These values are set in the config file (e.g. cub.json, line 43 (proxy_lr) and line 46 (base_lr)).
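For context, the two learning rates sit next to the optimizer settings in the config. A rough sketch of the relevant cub.json entries, with placeholder numbers rather than the repo's actual values:

```json
{
  "opt": {
    "type": "torch.optim.Adam",
    "base_lr": 1e-4,
    "proxy_lr": 1e-2
  }
}
```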
Thanks for your response.
Did you mean that line 311 is base_lr, since it has the word base in it?

Also, I am not sure where proxy_lr is used in the code. I found this, but it just says lr. Could you please point me to it?
It is used by the optimizer (train.py, line 290: config['opt']['type']), whose type is defined in the config file (cub.json, line 33: torch.optim.Adam). Do check out the PyTorch documentation on how to use an optimizer with per-parameter options. The proxy layer has a different learning rate than the rest of the layers.
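The underlying mechanism is PyTorch's per-parameter-group options: you pass the optimizer a list of groups, each with its own lr. A minimal sketch of the idea, where the backbone, proxies, and the two learning-rate values are illustrative stand-ins rather than the repo's actual variables:

```python
import torch

# Illustrative stand-ins: a small "backbone" layer and a proxy matrix.
backbone = torch.nn.Linear(8, 4)                   # plays the role of the base network
proxies = torch.nn.Parameter(torch.randn(10, 4))   # plays the role of the proxy layer

# Hypothetical values standing in for base_lr / proxy_lr from the config.
base_lr, proxy_lr = 1e-4, 1e-2

# One optimizer, two parameter groups, each with its own learning rate.
optimizer = torch.optim.Adam([
    {"params": backbone.parameters(), "lr": base_lr},
    {"params": [proxies], "lr": proxy_lr},
])

# Each group keeps its own lr, applied on every optimizer.step().
print(optimizer.param_groups[0]["lr"])  # the backbone's base_lr
print(optimizer.param_groups[1]["lr"])  # the proxies' proxy_lr
```

This is why a single optimizer call can still train the proxy layer faster than the rest of the network.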
I hope it helps.