
Adam Optimizer not working

Open zubairbaqai opened this issue 3 years ago • 1 comment

I have experimented with this code in many ways, including introducing custom schedulers, but what I cannot understand is why SGD works perfectly fine while the Adam optimizer does not. I tried several different learning rates, but none of them even start to decrease the loss. I used both SGD and Adam from torch.optim. Any suggestions or help would be appreciated.

Thanks

zubairbaqai avatar Jun 27 '22 05:06 zubairbaqai

You can check the code related to https://arxiv.org/abs/2103.13413; they trained with Adam and give details in the paper. Adam is much more aggressive, so lower learning rates should be used for fine-tuning compared to SGD.
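As a minimal sketch of that advice (not the exact training setup from the repository or the paper; the model and the specific learning-rate values are illustrative assumptions), swapping SGD for Adam usually means dropping the base learning rate by one or two orders of magnitude:

```python
import torch

# Hypothetical stand-in model; replace with the Segmenter model you are fine-tuning.
model = torch.nn.Linear(10, 2)

# SGD typically tolerates a comparatively large learning rate for fine-tuning.
sgd = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9, weight_decay=1e-4)

# Adam adapts per-parameter step sizes, so a much smaller base learning rate
# is usually needed; with an SGD-sized rate the loss can stall or diverge.
adam = torch.optim.Adam(model.parameters(), lr=1e-5, weight_decay=1e-4)

# An optional schedule (cosine decay here, purely as an example) can still be
# layered on top of the low base rate.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(adam, T_max=100)
```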

rstrudel avatar Jun 27 '22 14:06 rstrudel