Error converting conv1d_small_keras.onnx example
Hello,
I tried to create an HLS project using the ONNX 'conv1d_small_keras.onnx' example model, but ran into the following error when executing onnx_to_hls():
2021-10-21 16:00:35.368645: I tensorflow/stream_executor/platform/default/dso_loader.cc:53] Successfully opened dynamic library libcudart.so.11.0
Downloading example model files ...
{'OutputDir': 'my-hls-test', 'ProjectName': 'myproject', 'XilinxPart': 'xcku115-flvb2104-2-i', 'ClockPeriod': 5, 'Backend': 'Vivado', 'IOType': 'io_parallel', 'HLSConfig': {'Model': {'Precision': 'ap_fixed<16,6>', 'ReuseFactor': '1'}}, 'OnnxModel': 'conv1d_small_keras.onnx'}
Input shape: [1, 10, 4]
Topology:
Traceback (most recent call last):
File "conv1d_small_keras_onnx.py", line 10, in <module>
hls_model = hls4ml.converters.onnx_to_hls(config)
File "/home/siorpaed/anaconda3/envs/stm.ai-env/lib/python3.7/site-packages/hls4ml/converters/onnx_to_hls.py", line 252, in onnx_to_hls
layer['out_width'] = int(math.ceil(float(layer['in_width']) / float(layer['stride'])))
KeyError: 'stride'
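For reference, here is a minimal, self-contained sketch of the failing logic (the dict contents and the defensive fix are assumptions for illustration, not the actual hls4ml code). The parser builds a layer dict from the ONNX node attributes, and for this model the Conv node apparently does not produce a 'stride' entry, so the output-width computation raises the KeyError. Since the ONNX Conv operator defaults strides to 1 along each axis when the attribute is absent, a default would avoid the crash:

```python
import math

# Hypothetical layer dict as parsed from the ONNX Conv node:
# 'stride' is missing, mirroring the reported failure.
layer = {'in_width': 10}

try:
    # The computation from onnx_to_hls.py line 252 in v0.5.0
    layer['out_width'] = int(math.ceil(float(layer['in_width']) / float(layer['stride'])))
except KeyError as e:
    print(f"KeyError: {e}")  # prints: KeyError: 'stride'

# Falling back to the ONNX default stride of 1 avoids the crash:
stride = int(layer.get('stride', 1))
layer['out_width'] = int(math.ceil(float(layer['in_width']) / float(stride)))
print(layer['out_width'])  # prints: 10
```

This only illustrates why the traceback ends in KeyError; the real fix belongs in the converter's attribute parsing.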
The Python script used is derived from the one described in the "Getting Started" section:
import hls4ml
# Fetch an example model from our example repository
# This will download the example model to your working directory and return an example configuration file
config = hls4ml.utils.fetch_example_model('conv1d_small_keras.onnx')
print(config)  # You can print the configuration to see some default parameters
# Convert it to an HLS project
hls_model = hls4ml.converters.onnx_to_hls(config)
# Use Vivado HLS to synthesize the model
# This might take several minutes
hls_model.build()
# Print out the report if you want
hls4ml.report.read_vivado_report('my-hls-test')
The environment I am using should be fine, as other examples, such as 'three_layer_bn_pytorch.onnx' and 'KERAS_3layer.json', work correctly. The hls4ml version is 0.5.0.
Thank you,
David S.
The ONNX converter has gone through many changes since 0.5.0; that said, the newest version still fails on this model, though in a different way. We are actively updating the ONNX parser, so we should have something better soon. If there is something that hinders your work, let us know.
Hello, thank you very much for your support. So far this is non-blocking: we have just started playing with this interesting tool and happened to try the ONNX example because we mainly use ONNX for our reference models. We can experiment with Keras/QKeras in the meantime.
Thanks,
David
Hello,
We should probably update the example ONNX model ... The new ONNX converter is not really built to handle ONNX models exported from Keras, since the Keras converter already handles those!
Ideally, if you have a PyTorch-exported ONNX model with a similar architecture, the converter should work in that case.
Duc.