Help! RuntimeError: a leaf Variable that requires grad is being used in an in-place operation.
```
creating data loader...
creating model and diffusion...
training...
Traceback (most recent call last):
  File "scripts/segmentation_train.py", line 118, in
```
Please, can someone tell me where the problem lies, and why? I hope someone can help me ~ Thanks!!!
Try adding `p = p + 0` in the `sync_params` function within `dist_util.py`, as follows:

```python
def sync_params(params):
    """
    Synchronize a sequence of tensors across ranks from rank 0.
    """
    for p in params:
        with th.no_grad():
            p = p + 0
            dist.broadcast(p, 0)
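For context on why this workaround helps (my own sketch, not from the original thread): autograd forbids in-place writes to *leaf* tensors that require grad, and `dist.broadcast` writes into its argument in place. `p = p + 0` rebinds `p` to a fresh non-leaf copy, so the in-place write no longer trips the check. A minimal standalone reproduction:

```python
import torch

# In-place write to a leaf tensor that requires grad -> RuntimeError.
p = torch.zeros(3, requires_grad=True)
try:
    p.add_(1.0)  # same class of in-place op as dist.broadcast performs
except RuntimeError as err:
    print(err)  # the "leaf Variable ... in-place operation" message

# `p + 0` yields a fresh non-leaf tensor, so in-place writes to it
# are permitted by autograd.
q = p + 0
print(q.is_leaf)  # False

# Inside torch.no_grad(), in-place writes to the leaf itself are allowed.
with torch.no_grad():
    p.add_(1.0)
print(p)  # now updated to ones, still a leaf with requires_grad=True
```

Note the trade-off: on non-zero ranks the broadcast then fills the copy rather than the parameter itself, so whether this still synchronizes weights depends on how the surrounding code uses the result.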
#84
THANKSSSSSS!!