
missing features for optimizing the performance of scriptmodule

Open lindadamama opened this issue 11 months ago • 1 comment


lindadamama avatar Mar 16 '25 14:03 lindadamama

FYI, based on my understanding of PyTorch 2.5.1 + TorchSharp 0.105.0:

  • a frozen torch.jit.ScriptModule is loadable through PyTorch torch.jit.script → PyTorch torch.jit.freeze → PyTorch torch.jit.save → TorchSharp torch.jit.load.
    • however, TorchSharp torch.jit.freeze is not implemented yet.
    • that means TorchSharp cannot freeze a trained torch.jit.ScriptModule by itself.
    • consider cases with a pipeline like: non-frozen torch.jit.ScriptModule → PyTorch torch.jit.save → TorchSharp torch.jit.load → TorchSharp training → TorchSharp inference.
  • PyTorch torch.jit.optimize_for_inference automatically calls torch.jit.freeze if the module is not already frozen.
    • see also https://github.com/pytorch/pytorch/blob/v2.5.1/torch/jit/_freeze.py#L223
      • "not already frozen" is inferred from the absence of the training attribute.
  • as the PyTorch torch.jit.optimize_for_inference documentation explains, an optimized torch.jit.ScriptModule cannot be saved.
    • see also https://pytorch.org/docs/2.5/generated/torch.jit.optimize_for_inference.html#torch-jit-optimize-for-inference

      Accordingly, serialization is not implemented following invoking optimize_for_inference and is not guaranteed.

    • in practice, PyTorch torch.jit.script → PyTorch torch.jit.optimize_for_inference → PyTorch torch.jit.save → PyTorch torch.jit.load does not work.
    • the same holds for PyTorch torch.jit.script → PyTorch torch.jit.optimize_for_inference → PyTorch torch.jit.save → TorchSharp torch.jit.load.
    • that means TorchSharp cannot load an optimized torch.jit.ScriptModule due to PyTorch's design, and would have to optimize it by itself.
    • however, TorchSharp torch.jit.optimize_for_inference is not implemented yet.
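The two pipelines above can be sketched on the PyTorch side. This is a minimal sketch; `TinyNet` and the file names are illustrative, not from the issue. The freeze → save → load round trip is expected to work, while the optimize_for_inference → save → load round trip is not guaranteed per the PyTorch docs:

```python
import torch

# Illustrative module to script; any eval-mode nn.Module would do.
class TinyNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)

m = TinyNet().eval()

# Pipeline 1: script -> freeze -> save. The frozen ScriptModule can be
# reloaded later (e.g. by TorchSharp's torch.jit.load).
frozen = torch.jit.freeze(torch.jit.script(m))
torch.jit.save(frozen, "frozen.pt")
reloaded = torch.jit.load("frozen.pt")

# Pipeline 2: script -> optimize_for_inference -> save. Per the PyTorch
# docs, serialization after optimize_for_inference is not guaranteed,
# and in PyTorch 2.5.1 this round trip fails in practice.
optimized = torch.jit.optimize_for_inference(torch.jit.script(m))
try:
    torch.jit.save(optimized, "optimized.pt")
    torch.jit.load("optimized.pt")
except Exception as e:
    print("optimize_for_inference round trip failed:", e)
```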

hiyuh avatar Apr 04 '25 01:04 hiyuh