Fix torch.jit.ScriptModule.zero_grad.
TorchSharp 0.105.0 doesn't have `torch.jit.ScriptModule.zero_grad`; calls fall back to `torch.nn.Module.zero_grad` incorrectly and then terminate silently.
Most probably this is because JITModule is not compatible with NNModule in LibTorchSharp.
And as reported in https://github.com/pytorch/pytorch/issues/27144, libtorch itself also doesn't have `torch::jit::Module::zero_grad`.
As a workaround, manually loop over the parameters and zero out their gradients, as an optimizer does.
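For reference, the workaround amounts to something like the following sketch. This is a hedged illustration, not the exact diff in this MR: it assumes TorchSharp's `parameters()` enumerator and the `Tensor.grad` accessor (which may be the `grad()` method in older TorchSharp versions).

```csharp
using TorchSharp;
using static TorchSharp.torch;

public static class ZeroGradWorkaround
{
    // Zero out the gradients of a ScriptModule's parameters manually,
    // mimicking what an optimizer (or torch.nn.Module.zero_grad) does,
    // since torch::jit::Module has no native zero_grad.
    public static void ZeroGrad(torch.jit.ScriptModule module)
    {
        using var _ = torch.no_grad();
        foreach (var p in module.parameters())
        {
            var grad = p.grad;   // may be null if backward() was never run
            if (grad is not null)
            {
                grad.zero_();    // in-place zeroing, like optimizer.zero_grad(set_to_none: false)
            }
        }
    }
}
```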
Notes:
- Intentionally omitting the `RELEASENOTES.md` update ATM, to avoid repeated conflict/rebase annoyance during MR review.
  - I'll update it later, before merging this MR, if upstream prefers.
- I'm not sure whether the `foreach` loop of `ScriptModule.zero_grad` in `src/TorchSharp/JIT/ScriptModule.cs` is actually needed.
  - This just mimics what `Module.zero_grad` in `src/TorchSharp/NN/Module.cs` does.
Hey @hiyuh, this looks okay to me. Can you do two things: merge the latest changes from main, and add a line in the release notes (make it NuGet Version 0.105.2, although we might change that) under an "API Changes" section specifying that you introduced this?
@alinpahontu2912
- Rebased and updated `RELEASENOTES.md` as usual.
- I don't know why only `Windows_x64_NetFX Release_Build` failed.
  - Most probably because of a network failure in the Azure DevOps pipeline?
  - Update: the failure is gone; I don't know why...