Traceback (most recent call last):
File "train_test.py", line 454, in <module>
train()
File "train_test.py", line 327, in train
loss.backward()
File "/home/miao/anaconda3/lib/python3.6/site-packages/torch/tensor.py", line 93, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph)
File "/home/miao/anaconda3/lib/python3.6/site-packages/torch/autograd/__init__.py", line 90, in backward
allow_unreachable=True) # allow_unreachable flag
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
When I run python train_test.py, I hit this error. How can I solve it?
The problem has been solved. In layers/modules/l2norm.py, change the in-place division

x /= norm

to the out-of-place form:

x = x / norm

x = torch.div(x, norm) also works.
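The reason the fix works: x /= norm overwrites x's storage, but autograd may have saved that same tensor during the forward pass to compute gradients later, so backward() detects the modification and raises the RuntimeError above. A minimal sketch of the same failure mode, not the repo's actual L2Norm code (the tensor names and use of exp are illustrative assumptions):

```python
import torch

def run(inplace):
    x = torch.randn(4, requires_grad=True)
    y = torch.exp(x)       # autograd saves y: exp's backward is grad * y
    norm = y.norm()
    if inplace:
        y /= norm          # in-place: bumps y's version counter, so the
                           # tensor saved for exp's backward is now stale
        out = y
    else:
        out = y / norm     # out-of-place: new tensor, y left intact
                           # (torch.div(y, norm) is equivalent)
    out.sum().backward()
    return x.grad

run(inplace=False)         # succeeds: gradients flow normally
try:
    run(inplace=True)      # raises the in-place modification RuntimeError
except RuntimeError as e:
    print("RuntimeError:", e)
```

The out-of-place division allocates a fresh tensor for the result, so the tensor autograd saved during the forward pass is untouched and backward() can use it.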