Ti-Tai Wang
I tested with

```bash
python export.py -m facebook/bart-large-cnn
```

and got:

```bash
pytorch inference ...
params set: {'input_ids', 'return_dict', 'attention_mask', 'use_cache', 'head_mask', 'cross_attn_head_mask', 'past_key_values', 'output_attentions', 'decoder_head_mask', 'decoder_attention_mask', 'decoder_input_ids', 'labels', 'decoder_inputs_embeds', ...
```
> Some functions are similar to the T5 export script. Suggest doing some refactoring later to consolidate the Bart and T5 scripts.

Tracked in #13221.
> Hello everyone, I have used the branch above to export the `facebook/bart-base` PyTorch model to an ONNX model. I used the same optional parameters given in the README file. The summarization results...
8/11 Update:
1. The model has been exported successfully.
2. The model has mismatched performance between PyTorch and ONNX.
3. Validated the encoder and decoder parts of the model, and the results match PyTorch...
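For reference, the encoder/decoder validation in step 3 can be sketched as a small helper that compares paired PyTorch and ONNX Runtime outputs within a tolerance. The helper name and tolerance values here are illustrative assumptions, not taken from the export script:

```python
import numpy as np

def compare_outputs(pt_outputs, ort_outputs, rtol=1e-3, atol=1e-4):
    """Compare paired PyTorch/ORT output tensors (as arrays).

    Returns a list of (index, matches, max_abs_diff) tuples, one per
    output, so a mismatch can be traced to a specific tensor.
    """
    report = []
    for i, (pt, ort) in enumerate(zip(pt_outputs, ort_outputs)):
        pt = np.asarray(pt)
        ort = np.asarray(ort)
        if pt.shape != ort.shape:
            # Shape mismatch: report infinite difference for this output.
            report.append((i, False, float("inf")))
            continue
        matches = bool(np.allclose(pt, ort, rtol=rtol, atol=atol))
        max_diff = float(np.max(np.abs(pt - ort)))
        report.append((i, matches, max_diff))
    return report
```

Running the PyTorch model and the ONNX Runtime session on the same inputs and feeding both result lists through such a helper is one way to localize a mismatch to the encoder or the decoder.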
https://github.com/microsoft/onnxruntime/pull/16752
From PyTorch, you can use [`register_custom_op_symbolic`](https://pytorch.org/docs/stable/onnx.html#custom-operators) to register your own op, but you need to specify which `aten::` op in TorchScript you would like to replace. And if this is not...
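As a minimal sketch of that flow, assuming the TorchScript-based exporter: the choice of `aten::gelu` and the `custom_domain::Gelu` target below are illustrative, not something this thread prescribes.

```python
import torch
import torch.onnx

# Illustrative symbolic function: whenever the TorchScript graph contains
# aten::gelu, emit a node from a custom ONNX domain instead of PyTorch's
# default lowering. The *args absorb version-dependent trailing arguments
# (e.g. gelu's `approximate` flag in newer PyTorch releases).
def gelu_symbolic(g, input, *args):
    return g.op("custom_domain::Gelu", input)

# Register the replacement for the chosen aten op, from opset 13 onward.
torch.onnx.register_custom_op_symbolic("aten::gelu", gelu_symbolic, opset_version=13)
```

For the exported model to actually run, ONNX Runtime then needs a matching kernel registered under that custom domain (e.g. via a custom-op library).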
@borisfom The changes seem to break three CI tests. Any idea?
> > If the PR is approved, I think it makes sense that the other two RNN modes should also be modified in this PR?
>
> Yes, RNN/GRU are...
@kit1980 I am not authorized to merge this; could you help import it?
In favor of #127039