TorchSharp
missing features for optimizing the performance of scriptmodule
FYI, based on my understanding of PyTorch 2.5.1 + TorchSharp 0.105.0:
- a frozen `torch.jit.ScriptModule` is loadable through the pipeline PyTorch `torch.jit.script` → PyTorch `torch.jit.freeze` → PyTorch `torch.jit.save` → TorchSharp `torch.jit.load`.
  - however, TorchSharp `torch.jit.freeze` is not implemented yet.
  - that means TorchSharp cannot freeze a trained `torch.jit.ScriptModule` by itself.
- consider, for example, a process like: not-frozen `torch.jit.ScriptModule` → PyTorch `torch.jit.save` → TorchSharp `torch.jit.load` → TorchSharp training → TorchSharp inference.
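The working PyTorch-side pipeline above can be sketched as follows (a minimal sketch; the `AddOne` module and the file name are illustrative, not from the original report). The resulting file is what TorchSharp's `torch.jit.load` would consume on the C# side:

```python
import torch

class AddOne(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + 1

# PyTorch side: script -> freeze -> save.
# freeze requires the module to be in eval mode.
scripted = torch.jit.script(AddOne().eval())
frozen = torch.jit.freeze(scripted)
torch.jit.save(frozen, "addone_frozen.pt")

# The saved file can then be loaded from TorchSharp via torch.jit.load.
# Since TorchSharp has no torch.jit.freeze, freezing has to happen here,
# on the PyTorch side, before the hand-off.
reloaded = torch.jit.load("addone_frozen.pt")
print(reloaded(torch.zeros(2)))  # tensor([1., 1.])
```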
- PyTorch `torch.jit.optimize_for_inference` automatically calls `torch.jit.freeze` if the module is not already frozen.
  - "not already frozen" is guessed from the missing `training` attribute.
  - see also https://github.com/pytorch/pytorch/blob/v2.5.1/torch/jit/_freeze.py#L223
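That freeze-detection behavior can be sketched like this (the `MulTwo` module is illustrative): a scripted-but-not-frozen module still carries the `training` attribute, and `optimize_for_inference` freezes it as part of optimizing, which removes that attribute:

```python
import torch

class MulTwo(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2

scripted = torch.jit.script(MulTwo().eval())
print(hasattr(scripted, "training"))   # True: not yet frozen

# optimize_for_inference sees the `training` attribute and calls
# torch.jit.freeze internally before applying the inference rewrites.
optimized = torch.jit.optimize_for_inference(scripted)
print(hasattr(optimized, "training"))  # False: frozen (and optimized)
```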
- as the PyTorch `torch.jit.optimize_for_inference` documentation explains, an optimized `torch.jit.ScriptModule` cannot be saved: "Accordingly, serialization is not implemented following invoking `optimize_for_inference` and is not guaranteed."
  - see also https://pytorch.org/docs/2.5/generated/torch.jit.optimize_for_inference.html#torch-jit-optimize-for-inference
  - actually, PyTorch `torch.jit.script` → PyTorch `torch.jit.optimize_for_inference` → PyTorch `torch.jit.save` → PyTorch `torch.jit.load` doesn't work.
  - that is also true for PyTorch `torch.jit.script` → PyTorch `torch.jit.optimize_for_inference` → PyTorch `torch.jit.save` → TorchSharp `torch.jit.load`.
  - that means TorchSharp cannot load an optimized `torch.jit.ScriptModule` due to PyTorch's design, and has to optimize it by itself.
  - however, TorchSharp `torch.jit.optimize_for_inference` is not implemented yet.
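A minimal sketch of the serialization limitation (the `Affine` module is illustrative): in-process inference on the optimized module works, but the save/load round trip after `optimize_for_inference` is explicitly not guaranteed — which is why TorchSharp cannot count on loading an already-optimized module and would need its own `optimize_for_inference`:

```python
import io
import torch

class Affine(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(3, 3)

    def forward(self, x):
        return self.linear(x)

optimized = torch.jit.optimize_for_inference(torch.jit.script(Affine().eval()))

# In-process inference on the optimized module is fine:
x = torch.randn(1, 3)
y = optimized(x)

# But serializing after optimize_for_inference is not guaranteed:
# the rewritten graph may contain ops that do not survive save/load,
# so the round trip can fail for either PyTorch or TorchSharp loaders.
buffer = io.BytesIO()
try:
    torch.jit.save(optimized, buffer)
    buffer.seek(0)
    torch.jit.load(buffer)
    print("round trip happened to work for this module")
except RuntimeError as e:
    print("round trip failed:", e)
```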