KeepAugment_Pytorch
Is the operation "torch.max" differentiable?
Thank you for your contribution! I have a question: `keep_autoaugment.py` contains the following code:

```python
images_half.requires_grad = True
if self.early:
    preds = model(images_half, True)
else:
    preds = model(images_half)
score, _ = torch.max(preds, 1)
score.mean().backward()
slc_, _ = torch.max(torch.abs(images_half.grad), dim=1)
```
Is the operation `torch.max` differentiable? In my project, it seems to stop the backward pass.
I would appreciate it if you could help me with this question!
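For what it's worth, a quick standalone check (not from this repo) suggests `torch.max` along a dimension *is* differentiable with respect to its values output: the gradient is routed to the argmax entries, while the returned indices carry no gradient. If backward stops in your project, the graph is more likely detached elsewhere (e.g. `requires_grad` set on a non-leaf tensor, or the forward run under `torch.no_grad()`):

```python
import torch

# Minimal sketch: gradients do flow through torch.max(…, dim=…).
x = torch.randn(4, 3, requires_grad=True)
vals, idx = torch.max(x, dim=1)  # vals is differentiable; idx is not
vals.mean().backward()

# x.grad is nonzero only at each row's argmax position,
# with value 1/4 there (the mean over 4 rows).
print(x.grad)
```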