QNN MODEL TRAINED WITH BREVITAS CANNOT BE EXPORTED AS AN ONNX MODEL WITH FINNManager
Hello,
I have trained my model as a QNN with Brevitas. My input shape is:
torch.Size([1, 3, 1024])
I have exported the trained model as a .pt file. When I test the model and generate a confusion matrix, I see exactly what I expect, so I believe there is no problem with the model itself.
On the other hand, when I try to export an .onnx file in order to run this Brevitas-trained model with FINN, I use the code given below:
from brevitas.export import FINNManager
FINNManager.export(my_model, input_shape=(1, 3, 1024), export_path='myfinnmodel.onnx')
But when I do that, I get this error:
torch.onnx.export(module, input_t, export_target, **kwargs)
TypeError: export() got an unexpected keyword argument 'enable_onnx_checker'
I do not think this is related to the versions I am using, but I can check them if needed; a quick version check is sketched below.
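For completeness, here is a minimal check (my own sketch, using only the standard inspect module and the installed torch) to confirm whether the installed torch.onnx.export still accepts the enable_onnx_checker keyword, since that keyword was deprecated and later removed in newer PyTorch releases, which would explain the TypeError above:

import inspect
import torch

# Report the installed PyTorch version and whether torch.onnx.export
# still exposes the 'enable_onnx_checker' keyword argument.
print("torch version:", torch.__version__)
sig = inspect.signature(torch.onnx.export)
print("accepts enable_onnx_checker:", "enable_onnx_checker" in sig.parameters)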
I would really appreciate any help. Sincerely,
Hi,
I also had a problem with FINNManager.export. In my case, the code fix described at this link solved it: https://github.com/Xilinx/brevitas/pull/408/commits/8cfb2265abbe019061d762b88d8f4f08dbf0165a
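In case it helps while waiting for an updated release, here is a minimal sketch of a local workaround, under the assumption that the failure is caused only by Brevitas passing the removed enable_onnx_checker keyword through to torch.onnx.export (the wrapper names below are hypothetical; applying the linked fix or upgrading Brevitas/PyTorch is the proper solution):

import torch

# Hypothetical workaround: wrap torch.onnx.export so the stale
# 'enable_onnx_checker' keyword is dropped before the real export runs.
_original_onnx_export = torch.onnx.export

def _patched_onnx_export(*args, **kwargs):
    kwargs.pop('enable_onnx_checker', None)  # keyword removed in newer PyTorch
    return _original_onnx_export(*args, **kwargs)

torch.onnx.export = _patched_onnx_export

# Then re-run the export as before:
# FINNManager.export(my_model, input_shape=(1, 3, 1024), export_path='myfinnmodel.onnx')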
Thanks @pawel-tumialis for answering this issue! I am closing it due to inactivity; please feel free to reopen it or create a new issue if the problem persists!