RuntimeError: dim() called on undefined Tensor
Recent PyTorch builds require the dim argument to be specified when calling torch.nn.functional.softmax. I've specified the 0th dimension in both calls to torch.nn.functional.softmax within pytorch_model.py and get this error:
bash-4.3# python pytorch_run.py --nogpu --start
Converting data to one-hot representation
Data Loaded
Dim Training Data (11258, 1995)
Dim Test Data (7487, 1995)
Traceback (most recent call last):
File "pytorch_run.py", line 153, in <module>
train()
File "pytorch_run.py", line 93, in train
loss.backward() # backprop
File "/usr/lib/python2.7/site-packages/torch/autograd/variable.py", line 128, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
File "/usr/lib/python2.7/site-packages/torch/autograd/__init__.py", line 98, in backward
variables, grad_variables, retain_graph)
RuntimeError: dim() called on undefined Tensor
Any help would be appreciated. You may also want to update those two calls to softmax.
Regards, Luke
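For anyone hitting the same deprecation, here is a minimal sketch of the change (the tensor shapes here are made up, not from pytorch_model.py): on newer PyTorch, F.softmax should be given an explicit dim rather than relying on the old implicit default.

```python
import torch
import torch.nn.functional as F

# Hypothetical (batch, classes) scores, just for illustration.
logits = torch.randn(4, 10)

# Old code often relied on the implicit default:
#   probs = F.softmax(logits)        # warns / errors on newer builds
# Newer PyTorch wants the dimension spelled out:
probs = F.softmax(logits, dim=1)     # normalize over the class dimension

print(probs.shape)                   # torch.Size([4, 10])
print(probs.sum(dim=1))              # each row sums to ~1
```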
Is there any resolution to this? I'm facing the same problem. It would really help if there were some documentation on which version of PyTorch the code works with.
It was written before PyTorch 1.0. Since then, PyTorch has upgraded its tensor interface.
I am using torch 0.4.0, and it's failing. Maybe 0.3.0? If you have working code, please provide that version.
I think it was written for 0.1.10 or thereabouts.
Yunqing
Finally! 0.1.12 worked. Thank you very much.
I thought the softmax was done on dim=1. The 0th dimension is the batch dimension.
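To illustrate the point about dim, here's a small sketch (the values are invented): dim=1 normalizes each sample's class scores independently, while dim=0 normalizes down each column, mixing probabilities across the batch.

```python
import torch
import torch.nn.functional as F

x = torch.tensor([[1.0, 2.0, 3.0],
                  [3.0, 2.0, 1.0]])  # shape (batch=2, classes=3)

# dim=1: each row sums to 1 — per-sample class probabilities (the usual choice).
per_sample = F.softmax(x, dim=1)

# dim=0: each column sums to 1 — normalizes across the batch instead.
per_batch = F.softmax(x, dim=0)

print(per_sample.sum(dim=1))  # tensor([1., 1.])
print(per_batch.sum(dim=0))   # tensor([1., 1., 1.])
```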