
Why am I getting this error: `AttributeError: 'NoneType' object has no attribute 'shape'`?


Hello, I trained a ResNet model in Keras, then followed the getting-started notebook to use hls4ml for research purposes. First, here is the config output produced by the following code:

config = hls4ml.utils.config_from_keras_model(model, granularity='model')
print_dict(config)
Model
  Precision:         ap_fixed<16,6>
  ReuseFactor:       1
  Strategy:          Latency
-----------------------------------
Interpreting Model
Topology:
Layer name: input_2, layer type: InputLayer, input shapes: [[None, 1024, 2]], output shape: [None, 1024, 2]
Layer name: reshape_1, layer type: Reshape, input shapes: [[None, 1024, 2]], output shape: [None, 1, 1024, 2]
Layer name: ReStk1_conv1, layer type: Conv2D, input shapes: [[None, 1, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: ReStk1_conv2, layer type: Conv2D, input shapes: [[None, 32, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: ReStk1_conv3, layer type: Conv2D, input shapes: [[None, 32, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: add_12, layer type: Merge, input shapes: [[None, 32, 1024, 2], [None, 32, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: activation_13, layer type: Activation, input shapes: [[None, 32, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: ReStk1_conv4, layer type: Conv2D, input shapes: [[None, 32, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: ReStk1_conv5, layer type: Conv2D, input shapes: [[None, 32, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: add_13, layer type: Merge, input shapes: [[None, 32, 1024, 2], [None, 32, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: activation_14, layer type: Activation, input shapes: [[None, 32, 1024, 2]], output shape: [None, 32, 1024, 2]
Layer name: max_pooling2d_6, layer type: MaxPooling2D, input shapes: [[None, 32, 1024, 2]], output shape: [None, 32, 512, 1]
Layer name: ReStk2_conv1, layer type: Conv2D, input shapes: [[None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: ReStk2_conv2, layer type: Conv2D, input shapes: [[None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: ReStk2_conv3, layer type: Conv2D, input shapes: [[None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: add_14, layer type: Merge, input shapes: [[None, 32, 512, 1], [None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: activation_15, layer type: Activation, input shapes: [[None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: ReStk2_conv4, layer type: Conv2D, input shapes: [[None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: ReStk2_conv5, layer type: Conv2D, input shapes: [[None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: add_15, layer type: Merge, input shapes: [[None, 32, 512, 1], [None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: activation_16, layer type: Activation, input shapes: [[None, 32, 512, 1]], output shape: [None, 32, 512, 1]
Layer name: max_pooling2d_7, layer type: MaxPooling2D, input shapes: [[None, 32, 512, 1]], output shape: [None, 32, 256, 1]
Layer name: ReStk3_conv1, layer type: Conv2D, input shapes: [[None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: ReStk3_conv2, layer type: Conv2D, input shapes: [[None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: ReStk3_conv3, layer type: Conv2D, input shapes: [[None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: add_16, layer type: Merge, input shapes: [[None, 32, 256, 1], [None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: activation_17, layer type: Activation, input shapes: [[None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: ReStk3_conv4, layer type: Conv2D, input shapes: [[None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: ReStk3_conv5, layer type: Conv2D, input shapes: [[None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: add_17, layer type: Merge, input shapes: [[None, 32, 256, 1], [None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: activation_18, layer type: Activation, input shapes: [[None, 32, 256, 1]], output shape: [None, 32, 256, 1]
Layer name: max_pooling2d_8, layer type: MaxPooling2D, input shapes: [[None, 32, 256, 1]], output shape: [None, 32, 128, 1]
Layer name: ReStk4_conv1, layer type: Conv2D, input shapes: [[None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: ReStk4_conv2, layer type: Conv2D, input shapes: [[None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: ReStk4_conv3, layer type: Conv2D, input shapes: [[None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: add_18, layer type: Merge, input shapes: [[None, 32, 128, 1], [None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: activation_19, layer type: Activation, input shapes: [[None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: ReStk4_conv4, layer type: Conv2D, input shapes: [[None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: ReStk4_conv5, layer type: Conv2D, input shapes: [[None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: add_19, layer type: Merge, input shapes: [[None, 32, 128, 1], [None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: activation_20, layer type: Activation, input shapes: [[None, 32, 128, 1]], output shape: [None, 32, 128, 1]
Layer name: max_pooling2d_9, layer type: MaxPooling2D, input shapes: [[None, 32, 128, 1]], output shape: [None, 32, 64, 1]
Layer name: ReStk5_conv1, layer type: Conv2D, input shapes: [[None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: ReStk5_conv2, layer type: Conv2D, input shapes: [[None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: ReStk5_conv3, layer type: Conv2D, input shapes: [[None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: add_20, layer type: Merge, input shapes: [[None, 32, 64, 1], [None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: activation_21, layer type: Activation, input shapes: [[None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: ReStk5_conv4, layer type: Conv2D, input shapes: [[None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: ReStk5_conv5, layer type: Conv2D, input shapes: [[None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: add_21, layer type: Merge, input shapes: [[None, 32, 64, 1], [None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: activation_22, layer type: Activation, input shapes: [[None, 32, 64, 1]], output shape: [None, 32, 64, 1]
Layer name: max_pooling2d_10, layer type: MaxPooling2D, input shapes: [[None, 32, 64, 1]], output shape: [None, 32, 32, 1]
Layer name: ReStk6_conv1, layer type: Conv2D, input shapes: [[None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: ReStk6_conv2, layer type: Conv2D, input shapes: [[None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: ReStk6_conv3, layer type: Conv2D, input shapes: [[None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: add_22, layer type: Merge, input shapes: [[None, 32, 32, 1], [None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: activation_23, layer type: Activation, input shapes: [[None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: ReStk6_conv4, layer type: Conv2D, input shapes: [[None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: ReStk6_conv5, layer type: Conv2D, input shapes: [[None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: add_23, layer type: Merge, input shapes: [[None, 32, 32, 1], [None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: activation_24, layer type: Activation, input shapes: [[None, 32, 32, 1]], output shape: [None, 32, 32, 1]
Layer name: max_pooling2d_11, layer type: MaxPooling2D, input shapes: [[None, 32, 32, 1]], output shape: [None, 32, 16, 1]
Layer name: flatten_1, layer type: Reshape, input shapes: [[None, 32, 16, 1]], output shape: [None, 512]
Layer name: dense1, layer type: Dense, input shapes: [[None, 512]], output shape: [None, 128]
Layer name: dense2, layer type: Dense, input shapes: [[None, 128]], output shape: [None, 128]
Layer name: dense3, layer type: Dense, input shapes: [[None, 128]], output shape: [None, 7]
Layer name: activation_25, layer type: Softmax, input shapes: [[None, 7]], output shape: [None, 7]
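
For reference, print_dict above is not part of the hls4ml API; it is a small helper defined in the hls4ml tutorial notebooks. A minimal sketch of such a helper, assuming the config is a nested dictionary, looks like:

def print_dict(d, indent=0):
    # Recursively pretty-print a nested configuration dictionary
    for key, value in d.items():
        print('  ' * indent + str(key), end='')
        if isinstance(value, dict):
            print()
            print_dict(value, indent + 1)
        else:
            print(':', value)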

However, when I call convert_from_keras_model() as follows:

hls_model = hls4ml.converters.convert_from_keras_model(model,
                                                       hls_config=config,
                                                       output_dir='model_1/hls4ml_prj',
                                                       part='xczu7ev-ffvc1156-2-e')

I get the following error:

Creating HLS model

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
/tmp/ipykernel_764/2595051309.py in <module>
     17 print_dict(config)
     18 print("-----------------------------------")
---> 19 hls_model = hls4ml.converters.convert_from_keras_model(model,
     20                                                        hls_config=config,
     21                                                        output_dir='model_1/hls4ml_prj',

/usr/local/lib/python3.8/dist-packages/hls4ml/converters/__init__.py in convert_from_keras_model(model, output_dir, project_name, input_data_tb, output_data_tb, backend, board, part, clock_period, io_type, hls_config, **kwargs)
    218     _check_hls_config(config, hls_config)
    219 
--> 220     return keras_to_hls(config)
    221 
    222 

/usr/local/lib/python3.8/dist-packages/hls4ml/converters/keras_to_hls.py in keras_to_hls(config)
    337 
    338     print('Creating HLS model')
--> 339     hls_model = HLSModel(config, reader, layer_list, input_layers, output_layers)
    340     return hls_model

/usr/local/lib/python3.8/dist-packages/hls4ml/model/hls_model.py in __init__(self, config, data_reader, layer_list, inputs, outputs)
    316         self._make_graph(layer_list)
    317 
--> 318         self._optimize_model(self.config.optimizers)
    319 
    320     def _make_graph(self, layer_list):

/usr/local/lib/python3.8/dist-packages/hls4ml/model/hls_model.py in _optimize_model(self, optimizers)
    334 
    335     def _optimize_model(self, optimizers):
--> 336         optimize_model(self, optimizers)
    337 
    338     def make_node(self, kind, name, attributes, inputs, outputs=None):

/usr/local/lib/python3.8/dist-packages/hls4ml/model/optimizer/optimizer.py in optimize_model(model, passes)
     36         for opt in optimizers:
     37             for node in model.graph.values():
---> 38                 if opt.match(node):
     39                     res = opt.transform(model, node)
     40                     if res:

/usr/local/lib/python3.8/dist-packages/hls4ml/model/optimizer/passes/repack_stream.py in match(self, node)
     95             inp1 = node.get_input_variable(node.inputs[0])
     96             inp2 = node.get_input_variable(node.inputs[1])
---> 97             return inp1.shape != inp2.shape
     98         else:
     99             return False

AttributeError: 'NoneType' object has no attribute 'shape'

Could someone explain to me what the problem is here and how to fix it? Thank you.

AnouarITI · Dec 20, 2021

Hello @AnouarITI

What version of hls4ml are you using?

Could you send a script or gist to reproduce the error?

Some recent updates may help with this, e.g. https://github.com/fastmachinelearning/hls4ml/pull/443 and https://github.com/fastmachinelearning/hls4ml/pull/472.
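
A quick way to check which release you have installed (assuming the package exposes __version__):

# Print the installed hls4ml version
import hls4ml
print(hls4ml.__version__)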

jmduarte · Dec 20, 2021

Hello @jmduarte, please find attached a notebook with a script to reproduce the error. Sorry, I cannot share the dataset, but it should at least give you more visibility into the error. script.txt

AnouarITI · Jan 9, 2022

I have the same error when I use Add layers in my model: https://github.com/fastmachinelearning/hls4ml/issues/578

SteCla0 · Jun 22, 2022

@SteCla0 did you find a resolution?

wilfredkisku · Jul 5, 2022

This should be resolved in the current main branch, but reopen if not.
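
For anyone hitting this later, a minimal re-check is to install hls4ml from the main branch and rerun the same conversion; the sketch below simply reuses the model and part number from the original post and is not an official recipe:

# Sketch: rerun the original conversion after updating hls4ml to the main branch,
# e.g. with: pip install git+https://github.com/fastmachinelearning/hls4ml.git@main
import hls4ml

config = hls4ml.utils.config_from_keras_model(model, granularity='model')
hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir='model_1/hls4ml_prj',
    part='xczu7ev-ffvc1156-2-e',
)
hls_model.compile()  # C simulation build; a quick sanity check before full synthesis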

jmduarte · Apr 9, 2023