[Bug] [FRONTEND][ONNX] Error converting operator LayerNormalization: tvm.error.InternalError: Check failed: (n.defined()) is false: Found null pointer node while traversing AST.
Expected behavior
The ONNX frontend should import the model correctly.
Actual behavior
The ONNX frontend fails to import the model below, producing the following error:
Error converting operator LayerNormalization, with inputs: [X, scale]
Traceback (most recent call last):
  File "/home/carla/Documents/test/test.py", line 36, in <module>
    main()
  File "/home/carla/Documents/test/test.py", line 30, in main
    tvm_model = from_onnx(onnx_model, keep_params_in_input=True)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3692, in from_onnx
    return g.from_onnx(graph, opset)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3323, in from_onnx
    self._construct_nodes(graph)
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3503, in _construct_nodes
    raise err
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3500, in _construct_nodes
    op = self.bb.normalize(op)
         ^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/block_builder.py", line 667, in normalize
    return _ffi_api.BlockBuilderNormalize(self, expr) # type: ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "tvm/ffi/cython/./function.pxi", line 212, in tvm.ffi.core.Function.__call__
tvm.error.InternalError: Check failed: (n.defined()) is false: Found null pointer node while traversing AST. The previous pass may have generated invalid data.
[10:44:58] /home/carla/Documents/tvm/src/relax/ir/block_builder.cc:64: Warning: BlockBuilder destroyed with remaining blocks!
Environment
OS: Ubuntu 20.04
TVM: 0.21.dev0 (bcb68b130)
Steps to reproduce
This bug can be reproduced with the code below and the model in the attachment. As the code shows, the model runs correctly under onnxruntime.
import sys
import numpy as np
import onnx
import onnxruntime
import tvm
from tvm import relax
from tvm.relax.frontend.onnx import from_onnx
import pickle
def main():
    onnx_model = onnx.load("a454.onnx")
    with open("inputs.pkl", "rb") as fp:
        inputs = pickle.load(fp)
    try:
        ort_session = onnxruntime.InferenceSession(
            onnx_model.SerializeToString(), providers=["CPUExecutionProvider"]
        )
        ort_output = ort_session.run([], inputs)
    except Exception as e:
        print(e)
        sys.exit(1)
    print(ort_output)

    # Convert the onnx model into relax through the onnx importer.
    tvm_model = from_onnx(onnx_model, keep_params_in_input=True)


if __name__ == "__main__":
    main()
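Since a454.onnx is only available as an attachment, here is a hypothetical minimal stand-in that matches the operator signature reported in the error message: a single LayerNormalization node with only the X and scale inputs and no optional bias. The shapes, axis, epsilon, and opset version are assumptions, not values taken from the attached model, and I have not verified that this exact graph triggers the same error.

import onnx
from onnx import TensorProto, helper

# Hypothetical stand-in for a454.onnx (shapes/axis/opset are assumptions):
# one LayerNormalization node with only the required X and scale inputs.
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [2, 4, 8])
scale = helper.make_tensor_value_info("scale", TensorProto.FLOAT, [8])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [2, 4, 8])

node = helper.make_node(
    "LayerNormalization",
    inputs=["X", "scale"],  # no optional bias input B
    outputs=["Y"],
    axis=-1,
    epsilon=1e-5,
)
graph = helper.make_graph([node], "layernorm_no_bias", [X, scale], [Y])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 17)])
onnx.checker.check_model(model)
onnx.save(model, "layernorm_no_bias.onnx")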
Triage
- needs-triage
cc @KJlaccHoeUM9l
I've just checked, and this error no longer occurs after some changes I made to the from_onnx() code. I'll double-check and update the code soon.
The fix has been merged into the main branch.
Thanks! I have confirmed that this bug doesn't occur in the latest version of TVM.
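For anyone else re-checking, a minimal verification sketch on a recent TVM build (assuming the same a454.onnx attachment is on disk) would be:

import onnx
from tvm.relax.frontend.onnx import from_onnx

onnx_model = onnx.load("a454.onnx")
# On a build that includes the fix, this should return a Relax IRModule
# without raising tvm.error.InternalError.
tvm_model = from_onnx(onnx_model, keep_params_in_input=True)
tvm_model.show()  # print the imported module as TVMScript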