pedrofrodenas
I think I found a solution: `GlobalAveragePooling` has a `keepdims` option, so you can avoid using Lambda layers. Here is the edited `convert_global_avg_pool` function:
```
def convert_global_avg_pool(node, params,...
```
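Since the snippet above is cut off, here is a minimal sketch of that edit, assuming the onnx2keras-style converter signature and channels-first inputs (the exact signature and the `layers` dict are assumptions about that project, not verified against it):

```python
import tensorflow.keras as keras

def convert_global_avg_pool(node, params, layers, lambda_func, node_name, keras_name):
    """Convert a GlobalAveragePool node without Lambda layers.

    keepdims=True (available since TF 2.6) keeps the pooled spatial
    dimensions as size 1, which the old Lambda/expand_dims workaround
    reproduced by hand.
    """
    input_0 = layers[node.input[0]]  # assumption: converter inputs live in `layers`
    global_pool = keras.layers.GlobalAveragePooling2D(
        data_format='channels_first', keepdims=True, name=keras_name)
    layers[node_name] = global_pool(input_0)
```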
In my case I had a problem with the Clip layer. I realized that clipping with a lower bound of 0 is equivalent to applying a ReLU activation, so I replaced, in operation_layers.py at line 37, the...
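For reference, a minimal sketch of that kind of replacement, assuming the same converter signature as above and that the Clip bounds land in `params['min']`/`params['max']` (an assumption about how the converter stores them):

```python
import tensorflow.keras as keras

def convert_clip(node, params, layers, lambda_func, node_name, keras_name):
    """Convert a Clip node with ReLU instead of a Lambda.

    clip(x, 0, vmax) == ReLU(max_value=vmax), so the common min == 0
    case needs no custom op.
    """
    input_0 = layers[node.input[0]]
    vmin, vmax = params['min'], params['max']
    if vmin == 0:
        layers[node_name] = keras.layers.ReLU(max_value=vmax, name=keras_name)(input_0)
    else:
        raise AttributeError('Clip with min != 0 still needs a custom layer')
```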
Yes, indeed `keras.src.engine.keras_tensor` was moved to `from keras.src.backend import KerasTensor`. I already tried this, but other errors arise during model conversion, because `keras.backend.placeholder` was also removed in the newer...
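For anyone patching this by hand, the import move can be absorbed with a version-tolerant fallback. Note these are Keras internals (`keras.src.*`) that may move again, and this alone does not bring back the removed `keras.backend.placeholder`:

```python
try:
    from keras.src.engine.keras_tensor import KerasTensor  # Keras 2.x layout
except ImportError:
    from keras.src.backend import KerasTensor  # Keras 3.x layout
```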
Hello, thanks AlexanderLutsenko for raising this issue. Yes, I am facing the same issues with Keras 3.
I was able to apply quantization, but not to all the layers, using:
```
import os
os.environ["TF_USE_LEGACY_KERAS"] = "1"
import tf_keras as keras
import tensorflow as tf
#import keras
#from...
```
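In case it is useful, here is a minimal sketch of selective quantization-aware training with tensorflow_model_optimization, which gives exactly that "not all layers" behaviour; the model and the choice to annotate only Dense layers are made up for illustration:

```python
import os
os.environ["TF_USE_LEGACY_KERAS"] = "1"  # tfmot's QAT API expects the Keras 2 code path

import tf_keras as keras
import tensorflow_model_optimization as tfmot

quantize_annotate_layer = tfmot.quantization.keras.quantize_annotate_layer
quantize_apply = tfmot.quantization.keras.quantize_apply

def annotate(layer):
    # Only annotated layers become quantization-aware after quantize_apply.
    if isinstance(layer, keras.layers.Dense):
        return quantize_annotate_layer(layer)
    return layer

base_model = keras.Sequential([
    keras.layers.Input(shape=(28, 28, 1)),
    keras.layers.Conv2D(8, 3, activation='relu'),
    keras.layers.Flatten(),
    keras.layers.Dense(10),
])

annotated = keras.models.clone_model(base_model, clone_function=annotate)
qat_model = quantize_apply(annotated)  # Conv2D stays float; Dense gets fake-quantized
qat_model.summary()
```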
The problem is that you are using Keras 3. To avoid this problem, install the tf_keras library that matches your tensorflow version, use `import tf_keras as keras`, and set `os.environ["TF_USE_LEGACY_KERAS"] = "1"`.
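A minimal sketch of that setup; the key detail is setting the environment variable before tensorflow is first imported (the PyPI package name is tf-keras):

```python
import os
os.environ["TF_USE_LEGACY_KERAS"] = "1"  # must be set before the first tensorflow import

import tensorflow as tf
import tf_keras as keras  # pip install tf-keras (versioned to match tensorflow)

print(keras.__version__)  # should report a 2.x Keras
```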
I ran into the same problem when I tried to instantiate the model.