Issue with loading quantization aware trained model
Describe the bug
Unable to load the saved model after applying quantization aware training.

System information
TensorFlow version (installed from source or binary): 2.2
TensorFlow Model Optimization version (installed from source or binary): 0.3.0

Code to reproduce the issue
Please find the gist of the code here: https://gist.github.com/peri044/00a477b73d01bd08ef3410c15679a91c#file-sample-py-L47
The error occurs at the tf.keras.models.load_model() call. If I replace it with tf.saved_model.load(), I see the same error. Any suggestions are appreciated. Thank you!
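For reference, here is a minimal sketch of the failing pattern, assuming the MNIST QAT setup from the linked gist (the architecture and training loop shown here are stand-ins, not the exact gist code):

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Small MNIST-style model, stand-in for the one in the gist.
base_model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=(28, 28)),
    tf.keras.layers.Reshape(target_shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(filters=12, kernel_size=(3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

# Wrap the model for quantization aware training.
q_aware_model = tfmot.quantization.keras.quantize_model(base_model)
q_aware_model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'])
# ... q_aware_model.fit(...) as in the gist ...

# Export as a SavedModel, then try to reload it.
q_aware_model.save('saved_model')
model = tf.keras.models.load_model('saved_model')  # raises the KeyError below
```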
Error:

```
    model = tf_load.load_internal(path, loader_cls=KerasObjectLoader)
  File "/home/dperi/Downloads/py3/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py", line 604, in load_internal
    export_dir)
  File "/home/dperi/Downloads/py3/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py", line 134, in _load_all
    self._load_nodes()
  File "/home/dperi/Downloads/py3/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py", line 264, in _load_nodes
    node, setter = self._recreate(proto, node_id)
  packages/tensorflow/python/saved_model/load.py", line 398, in _recreate_function
    proto, self._concrete_functions), setattr
  File "/home/dperi/Downloads/py3/lib/python3.6/site-packages/tensorflow/python/saved_model/function_deserialization.py", line 265, in recreate_function
    concrete_function_objects.append(concrete_functions[concrete_function_name])
KeyError: '__inference_conv2d_layer_call_and_return_conditional_losses_5068'
```
Hello @nutsiepully, can you please provide any suggestions on this issue? The code in the gist is the MNIST QAT example from the docs. Thank you!
@peri044 Can you please try with the changes below?

```python
with tfmot.quantization.keras.quantize_scope():
    model = tf.keras.models.load_model('saved_model')
```
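To elaborate on why this helps: quantize_scope makes the quantization custom objects used by the QAT wrappers visible to the Keras deserializer, so the quantized layers can be rebuilt on load. Here is a small sketch of loading and sanity-checking the model, assuming the 'saved_model' directory produced by the gist and placeholder input data:

```python
import numpy as np
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Load the QAT SavedModel with the quantization custom objects in scope.
with tfmot.quantization.keras.quantize_scope():
    loaded_model = tf.keras.models.load_model('saved_model')

# Sanity check: run inference on placeholder MNIST-shaped input.
dummy_images = np.random.rand(1, 28, 28).astype('float32')
print(loaded_model.predict(dummy_images).shape)  # expected: (1, 10)
```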
@peri044 - The code is working for me. Can you please try the code again with tf-nightly? Also consider using @joyalbin's snippet, though for a SavedModel it should just work.
Closing this for now since I was able to run the code without any issues, and it's likely a versioning issue.
Please feel free to reopen otherwise.
Thanks @nutsiepully. I faced this issue with TF 2.2, but it now works with TF 2.3rc2.
Are you sure? In my environment (TF 2.3rc2), it doesn't work.
@Wangyf46 Yeah, it works for me on 2.3.0-rc2.
Is there any workaround for this on TF 2.2? I am using conda to install TensorFlow, and currently only version 2.2 is available for Linux.
This issue is happening for me on tf 2.4.1:
```python
with tfmot.quantization.keras.quantize_scope():
    model = tf.keras.models.load_model('saved_model')
```

```
KeyError                                  Traceback (most recent call last)
~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/keras/saving/save.py in load_model(filepath, custom_objects, compile, options)
    210   if isinstance(filepath, six.string_types):
    211     loader_impl.parse_saved_model(filepath)
--> 212     return saved_model_load.load(filepath, compile, options)
    213
    214   raise IOError(

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/keras/saving/saved_model/load.py in load(path, compile, options)
    142   for node_id, loaded_node in keras_loader.loaded_nodes.items():
    143     nodes_to_load[keras_loader.get_path(node_id)] = loaded_node
--> 144   loaded = tf_load.load_partial(path, nodes_to_load, options=options)
    145
    146   # Finalize the loaded layers and remove the extra tracked dependencies.

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py in load_partial(export_dir, filters, tags, options)
    763     A dictionary mapping node paths from the filter to loaded objects.
    764   """
--> 765   return load_internal(export_dir, tags, options, filters=filters)
    766
    767

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py in load_internal(export_dir, tags, options, loader_cls, filters)
    888     try:
    889       loader = loader_cls(object_graph_proto, saved_model_proto, export_dir,
--> 890                           ckpt_options, filters)
    891     except errors.NotFoundError as err:
    892       raise FileNotFoundError(

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py in __init__(self, object_graph_proto, saved_model_proto, export_dir, ckpt_options, filters)
    158       self._concrete_functions[name] = _WrapperFunction(concrete_function)
    159
--> 160     self._load_all()
    161     self._restore_checkpoint()
    162

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py in _load_all(self)
    254   def _load_all(self):
    255     """Loads all nodes and functions from the SavedModel and their edges."""
--> 256     self._load_nodes()
    257     self._load_edges()
    258     # TODO(b/124045874): There are limitations with functions whose captures

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py in _load_nodes(self)
    432         # interface.
    433         continue
--> 434       node, setter = self._recreate(proto, node_id)
    435       nodes[node_id] = node
    436       node_setters[node_id] = setter

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py in _recreate(self, proto, node_id)
    550     if kind not in factory:
    551       raise ValueError("Unknown SavedObject type: %r" % kind)
--> 552     return factory[kind]()
    553
    554   def _recreate_user_object(self, proto, node_id):

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py in <lambda>()

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/load.py in _recreate_function(self, proto)
    578   def _recreate_function(self, proto):
    579     return function_deserialization.recreate_function(
--> 580         proto, self._concrete_functions), setattr
    581
    582   def _recreate_bare_concrete_function(self, proto):

~/anaconda3/envs/tf2.4.1/lib/python3.6/site-packages/tensorflow/python/saved_model/function_deserialization.py in recreate_function(saved_function, concrete_functions)
    275   concrete_function_objects = []
    276   for concrete_function_name in saved_function.concrete_functions:
--> 277     concrete_function_objects.append(concrete_functions[concrete_function_name])
    278
    279   for cf in concrete_function_objects:

KeyError: '__inference_expanded_conv_depthwise_layer_call_fn_79246'
```
Thanks @thecosta. We are investigating this.
I have the same issue on TF 2.3.2. I got the error: KeyError: '__inference_expanded_conv_depthwise_layer_call_fn_31750316'
I'm getting the same issue on TF 2.2.0 and 2.7.0.
@Janus-Shiau @nutsiepully I've got the same issue on TF 2.7.0, but the solution proposed by @joyalbin works just fine for me. See the snippet below.

```python
with tfmot.quantization.keras.quantize_scope():
    model = tf.keras.models.load_model('saved_model')
```