Suhas

17 comments by Suhas

I managed to load the model with `AutoModel.from_pretrained(pretrained_model_name_or_path=model_path)` and it works fine. But when trying to export the model to ONNX, it fails at this line: https://github.com/neuralmagic/sparseml/blob/main/src/sparseml/transformers/export.py#L225 with the error: `ValueError: You...`

If possible, could you provide an example?

After adding `decoder_input_ids`, I got the following error: `exporting model exceed maximum protobuf size of 2gb, please call torch.onnx.export with use_external_data_format=True`. I added `use_external_data_format=True` at this line: https://github.com/neuralmagic/sparseml/blob/main/src/sparseml/pytorch/utils/exporter.py#L473. But still...

I looked into it further and was able to convert the model with the following change at https://github.com/neuralmagic/sparseml/blob/main/src/sparseml/pytorch/utils/exporter.py#L495: adding `onnx.save(onnx_model, file_path, save_as_external_data=True)`. You could look into adding this in a future release. but...

OK, I modified the class (i.e., onnxruntime/onnxruntime/python/tools/transformers/gpt2_helper.py) as:

```python
class MyGPT2LMHeadModel(GPT2LMHeadModel):
    """Here we wrap a class for Onnx model conversion for GPT2LMHeadModel with past state."""

    def __init__(self, config):
        super().__init__(config)

    def...
```
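One hedged way to get a loss out of such a wrapper, independent of the onnxruntime helper above: subclass so that `labels` is a forward argument and the loss becomes an explicit output. The class name and tiny config below are mine, not part of gpt2_helper.py.

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel


class GPT2LMHeadModelWithLoss(GPT2LMHeadModel):
    """Hypothetical wrapper (name is mine): forwards labels to the parent so
    the cross-entropy loss becomes an explicit output, e.g. for ONNX export."""

    def forward(self, input_ids, labels):
        out = super().forward(input_ids=input_ids, labels=labels)
        return out.loss, out.logits


# Tiny random-weight config so the sketch runs quickly; a real use
# would load pretrained weights instead.
config = GPT2Config(n_layer=1, n_head=2, n_embd=8, vocab_size=50, n_positions=16)
model = GPT2LMHeadModelWithLoss(config).eval()
ids = torch.randint(0, 50, (1, 5))
loss, logits = model(input_ids=ids, labels=ids)
```

A tuple return like `(loss, logits)` exports more cleanly than the `ModelOutput` dataclass that `transformers` returns by default.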

Yes, but I need to bring this loss behaviour into onnxruntime, as I mentioned above. @tianleiwu, do I need to edit a function in gpt2_beamsearch_helper.py or gpt2_helper.py? I am a bit confused...

If possible, could @tianleiwu provide a code snippet regarding computing the loss?
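While waiting for a snippet, one alternative that avoids touching the graph at all is to compute the language-modeling loss in post-processing from the logits the exported model already returns. This is my own sketch, not from the onnxruntime tools; it reproduces the usual shift-by-one next-token cross-entropy in plain NumPy:

```python
import numpy as np


def lm_loss_from_logits(logits, labels):
    """Next-token cross-entropy computed outside the ONNX graph.
    logits: (batch, seq, vocab) float array; labels: (batch, seq) int array."""
    # Predict token t+1 from position t, as in GPT2LMHeadModel's loss.
    shift_logits = logits[:, :-1, :]
    shift_labels = labels[:, 1:]
    # Numerically stable log-softmax over the vocab axis.
    m = shift_logits.max(axis=-1, keepdims=True)
    logp = shift_logits - m - np.log(
        np.exp(shift_logits - m).sum(axis=-1, keepdims=True)
    )
    # Negative log-likelihood of each gold next token, averaged.
    nll = -np.take_along_axis(logp, shift_labels[..., None], axis=-1)
    return float(nll.mean())
```

Sanity check: with uniform logits over a vocabulary of size V, the loss is exactly ln V.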

Yes, but I tried with the CPU and it still fails.

Yes, on CPU too, same issue.

I found some more logs: `onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: SystemError : 2`