How to make my custom model accept base64 images when served with TF Serving
I am able to host a custom TensorFlow model with TF Serving, but that model accepts images in matrix form [None, None, None, 3]. I want to change it so that it can accept base64 strings as input instead. I have searched and tried many things but wasn't successful.
Can anyone help, please?
The result of !saved_model_cli show --dir ./saved_model --all is shown below.
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input_1'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, -1, -1, 3)
        name: serving_default_input_1:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_1'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, -1, -1, 3)
        name: StatefulPartitionedCall:0
  Method name is: tensorflow/serving/predict
Defined Functions:
  Function Name: '_default_save_signature'
    Option #1
      Callable with:
        Argument #1
          input_1: TensorSpec(shape=(None, None, None, 3), dtype=tf.float32, name=u'input_1')
@kunalchamoli, have you tried the approach suggested in #1570?
As mentioned [here](https://github.com/tensorflow/serving/issues/1570#issuecomment-598615900), he was able to download a model that accepts base64 as input, but in my case the model takes a tensor as input.
@nniuzft, can you help, please?
Stuck in the same place. Please provide a working example of how to save a TF2/Keras model that accepts base64 as input. Thanks in advance!
Long story short: I was able to do it myself. Here's a working example with TF 2.4.1 and TF Serving 2.4.1.
import tensorflow as tf

# MODEL_INPUT_SHAPE, MODEL_INPUT_DTYPE and my_custom_model are assumed to be
# defined elsewhere, e.g. MODEL_INPUT_SHAPE = (224, 224), MODEL_INPUT_DTYPE = tf.float32.

def preprocess_input(base64_input_bytes):
    def decode_bytes(img_bytes):
        img = tf.image.decode_jpeg(img_bytes, channels=3)
        img = tf.image.resize(img, MODEL_INPUT_SHAPE)
        img = tf.image.convert_image_dtype(img, MODEL_INPUT_DTYPE)
        return img

    # Flatten to a 1-D batch of serialized image strings, then decode each one.
    base64_input_bytes = tf.reshape(base64_input_bytes, (-1,))
    return tf.map_fn(lambda img_bytes: decode_bytes(img_bytes),
                     elems=base64_input_bytes,
                     fn_output_signature=MODEL_INPUT_DTYPE)

# Wrap the original model with a string input and a decoding Lambda layer.
inputs = tf.keras.layers.Input(shape=(), dtype=tf.string, name='b64_input_bytes')
x = tf.keras.layers.Lambda(preprocess_input, name='decode_image_bytes')(inputs)
x = my_custom_model(x)
serving_model = tf.keras.Model(inputs, x)
tf.saved_model.save(serving_model, './my_serving_model')
As you can see, there's no need to explicitly decode the image bytes from base64; TF Serving does it for us. Also, you shouldn't use urlsafe base64 encoding when sending a POST request. Hope this helps :)
P.S. If there's a better/more efficient way to do this, please leave a snippet here
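For completeness, here's a minimal sketch of the client side: TF Serving's REST predict API decodes binary inputs that are wrapped in a {"b64": ...} object. The image bytes and the server URL below are placeholder assumptions; the input layer name matches the 'b64_input_bytes' Input above.

```python
import base64
import json

# Stand-in for real JPEG bytes read from disk, e.g. open('img.jpg', 'rb').read().
img_bytes = b'\xff\xd8\xff\xe0fake-jpeg-bytes'

# Use standard (NOT urlsafe) base64, as noted above.
b64_str = base64.b64encode(img_bytes).decode('utf-8')

# TF Serving's REST API treats {"b64": "..."} values as binary and
# decodes them back to raw bytes before feeding the model.
payload = {"instances": [{"b64": b64_str}]}
body = json.dumps(payload)

# POST this body to the (assumed) endpoint, e.g. with the requests library:
#   requests.post('http://localhost:8501/v1/models/my_serving_model:predict',
#                 data=body)
print(body)
```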
@kunalchamoli, as mentioned here, TF Serving automatically decodes image bytes from base64 images. Please let us know if this helps. Thank you!
Closing this due to inactivity. Please take a look at the answers provided above; feel free to reopen and post your comments if you still have queries on this. Thank you!