SAGPool

why optimizer.zero_grad() after optimizer.step()?

littleWangyu opened this issue 4 years ago • 0 comments

In the train epoch, why is optimizer.zero_grad() called after optimizer.step()? Does it matter? The usual order is optimizer.zero_grad() → loss.backward() → optimizer.step().

littleWangyu · Aug 19 '21 09:08
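For illustration, the two orderings the question compares can be sketched without PyTorch. The toy parameter, `backward`, `step`, and `zero_grad` helpers below are hypothetical stand-ins that mimic autograd's key behavior (gradients accumulate into `.grad`); the point is that zeroing the gradient right after `step()` is equivalent to zeroing it at the top of the next iteration, as long as it happens before the next `backward()`:

```python
# Minimal sketch (no PyTorch needed): a toy parameter with manual
# gradient accumulation. Like autograd, backward() ADDS into .grad,
# so the gradient must be zeroed somewhere between step() and the
# next backward() -- before or after step() both work.

class ToyParam:
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

def backward(p):
    # pretend loss = 0.5 * value**2, so d(loss)/d(value) = value
    p.grad += p.value

def step(p, lr=0.1):
    p.value -= lr * p.grad

def zero_grad(p):
    p.grad = 0.0

def train(order, steps=5):
    p = ToyParam(1.0)
    for _ in range(steps):
        if order == "zero_first":   # usual: zero_grad -> backward -> step
            zero_grad(p)
            backward(p)
            step(p)
        else:                       # SAGPool style: backward -> step -> zero_grad
            backward(p)
            step(p)
            zero_grad(p)
    return p.value

print(train("zero_first") == train("zero_last"))  # → True
```

Both loops perform exactly one zeroing per iteration between `step()` and the next `backward()`, so the updates are identical; the only ordering that would be wrong is zeroing between `backward()` and `step()`, which would discard the freshly computed gradient.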