
Confusion about the relationship among samples_per_GPU, number of GPUs, and LR

jkd2021 opened this issue · 1 comment

Hi guys, I have a question about the relationship between the learning rate and the number of GPUs. I'm trying to train the network on a single GPU, following the given config, whose original setting is:

samples_per_gpu = 2, number of GPUs = 8 (batch size = 2 × 8 = 16), learning rate = 0.02.

Am I right that I should change it to

samples_per_gpu = 8 (it cannot be 16, due to OOM on my GPU), number of GPUs = 1 (batch size = 8 × 1 = 8), learning rate = 0.01?
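For reference, the adjustment above follows the linear scaling rule: scale the base learning rate by the ratio of your effective batch size to the batch size the config was tuned for. A minimal sketch (the helper function is hypothetical, not part of qdtrack):

```python
def scaled_lr(base_lr, base_batch_size, samples_per_gpu, num_gpus):
    """Scale the learning rate linearly with the effective batch size.

    base_lr / base_batch_size come from the original config
    (here 0.02 and 2 * 8 = 16); the other two describe your setup.
    """
    effective_batch_size = samples_per_gpu * num_gpus
    return base_lr * effective_batch_size / base_batch_size

# Single-GPU run with 8 samples per GPU: batch size 8 -> lr 0.01.
print(scaled_lr(0.02, base_batch_size=16, samples_per_gpu=8, num_gpus=1))
```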

Although it's a naive question, if anyone could give me the answer, I'd really appreciate it!

jkd2021 avatar Sep 05 '22 20:09 jkd2021

Yes, that could work.

OceanPang avatar Sep 06 '22 12:09 OceanPang