REST API - bool being interpreted as scalar in POST json
Bug Report
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS Big Sur 11.0.1
- TensorFlow Serving installed from (source or binary): docker image
- TensorFlow Serving version: tensorflow/serving:latest
Describe the problem
Attempting to POST to the REST API produces unexpected errors.
{
"error": "The second input must be a scalar, but it has shape [1]\n\t"
}
Based on https://github.com/tensorflow/serving/issues/1199, it seems like the boolean may be treated as a scalar. We have tried several approaches, including messing around with the json serialization (e.g. "false"), but to no avail. If we try sending the data payload without json.dumps(), then we get
{
"error": "JSON Parse error: Invalid value. at offset: 0"
}
So it seems the API forces us to use json.dumps(), but then fails when parsing the boolean values it produces?
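For reference, a minimal sketch of what the json module does with the boolean here: json.dumps is what turns Python's False into the lowercase false literal JSON requires, while passing the raw dict to requests' data= parameter sends a form-encoded (non-JSON) body, which would explain the parse error at offset 0:

```python
import json

payload = {"instances": [{"is_train": False}]}

# json.dumps emits the JSON literal `false` for Python's False
body = json.dumps(payload)
print(body)  # {"instances": [{"is_train": false}]}

# str(payload) is what a non-JSON encoding of the dict looks like;
# note the capitalized False and single quotes -- not valid JSON.
print(str(payload))  # {'instances': [{'is_train': False}]}
```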
Exact Steps to Reproduce
- run tensorflow serving with docker
- make POST request
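For completeness, a typical Docker invocation for this setup (a sketch: the host path ~/model and the model name "model" are assumptions inferred from the request URL used below):

```shell
# Serve the SavedModel under ~/model (which contains version subdirectory 1/)
# on the default REST port 8501; MODEL_NAME=model matches the
# /v1/models/model/... path in the POST request.
docker run -p 8501:8501 \
  -v ~/model:/models/model \
  -e MODEL_NAME=model \
  tensorflow/serving:latest
```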
image variable is an ndarray that conforms to the model's signature definition
import json

import requests

data = {
    'instances': [
        {
            "input_image": image.tolist(),
            "is_train": False,
        }
    ]
}
response = requests.post(
    'http://localhost:8501/v1/models/model/versions/1:predict',
    data=json.dumps(data),
)
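One workaround worth trying (a sketch, not verified against this model): TensorFlow Serving's REST API also accepts the columnar "inputs" format, where each named input is supplied as a full tensor rather than per-instance, so an unbatched scalar boolean can be sent once for the whole request:

```python
import json

# Hypothetical stand-in for the preprocessed image batch; in the real
# request this would be image.tolist() with a leading batch dimension.
image_batch = [[[[0.0] * 3] * 4] * 4]  # toy shape (1, 4, 4, 3)

# Columnar ("inputs") request format: is_train is a bare scalar rather
# than a per-instance value, so it should not pick up a batch dimension.
payload = {
    "inputs": {
        "input_image": image_batch,
        "is_train": False,
    }
}
body = json.dumps(payload)
print('"is_train": false' in body)  # True
```

With requests this is posted the same way, e.g. requests.post(url, data=body). Whether the server then accepts the scalar for the unknown-rank input is exactly what this issue is about, so treat this as something to try rather than a confirmed fix.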
Source code / logs
saved-model-cli output
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input_image'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 256, 256, 3)
        name: image:0
    inputs['is_train'] tensor_info:
        dtype: DT_BOOL
        shape: unknown_rank
        name: Placeholder:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_heatmaps'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 64, 64, 40)
        name: concat_6:0
    outputs['output_logits'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 40)
        name: concat_4:0
    outputs['output_probs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 40)
        name: concat_5:0
  Method name is: tensorflow/serving/predict
Any guidance would be appreciated in addressing this issue. Thank you in advance.
@sam-huang-1223,
Can you please provide the code so that we can get your SignatureDef (Boolean), for us to reproduce the issue? Thanks!
Hi @rmothukuru, thanks for taking the time to respond. What code are you referring to? All the code for the API call is there.
- Code for generating the image -> it is just a regular image that has been passed through pre-processing and standardized to (256, 256, 3)
- Code for producing the SignatureDef output is
saved_model_cli show --dir ~/model/cpu/1 --tag_set serve --signature_def serving_default --all (this is the only SignatureDef defined in the model)
@sam-huang-1223,
The code I am referring to is the model-building code that defines the input and output signatures, especially the boolean input signature. Thanks!
@rmothukuru The code for defining Inputs and Outputs signatures is as follows:
import tensorflow as tf
import tensorflow.saved_model as tfs
from tensorflow.saved_model.utils import build_tensor_info

input_image = tf.placeholder(tf.float32, [None, 256, 256, 3])
is_train = tf.placeholder(tf.bool)
... # build the graph and train the model

inputs = {}
inputs["input_image"] = build_tensor_info(input_image)
inputs["is_train"] = build_tensor_info(is_train)
outputs = {}
... # similar build_tensor_info definitions for outputs

sig_def = tfs.signature_def_utils.build_signature_def(
    inputs=inputs,
    outputs=outputs,
    method_name="tensorflow/serving/predict",
)

saver = tf.train.Saver(set(tf.global_variables()))
builder = tfs.builder.SavedModelBuilder(export_dir)
serving_key = tfs.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY
serving_tag = tfs.tag_constants.SERVING
builder.add_meta_graph_and_variables(
    sess,  # session with the model graph
    [serving_tag],
    signature_def_map={serving_key: sig_def},
    clear_devices=False,
    saver=saver,
)
builder.save()
@rmothukuru @minglotus-6 any update on this issue? It is blocking us at the moment.
@rmothukuru @minglotus-6 for additional reference, the SavedModel in question was produced by an older tensorflow version - 1.13.1
Ummm, I met the same issue on version 1.13.1.
any updates? I have same issue, how do I input a bool?
I am also having this issue, except my signature def is:
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['is_training'] tensor_info:
        dtype: DT_BOOL
        shape: ()
        name: Placeholder_2:0
    inputs['model_input'] tensor_info:
        dtype: DT_FLOAT
        shape: (8, 4096, 9)
        name: Placeholder:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['model_output'] tensor_info:
        dtype: DT_FLOAT
        shape: (8, 4096, 13)
        name: Softmax:0
  Method name is: tensorflow/serving/predict
Predicting on the model in the session produces the correct output:
# for batch_idx in range(num_batches):
batch_idx = 0
start_idx = batch_idx * BATCH_SIZE
end_idx = (batch_idx + 1) * BATCH_SIZE
cur_batch_size = end_idx - start_idx
feed_dict = {
    ops['pointclouds_pl']: current_data[start_idx:end_idx, :, :],
    ops['is_training_pl']: is_training,
}
pred_val = sess.run(
    [ops['pred_softmax']],
    feed_dict=feed_dict,
)
print(pred_val[0].shape)
(8, 4096, 13)
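Translating that feed_dict into a REST call, the columnar "inputs" format seems like the closest match for a scalar DT_BOOL input (a sketch: the batch data here is a toy placeholder, not the real point clouds):

```python
import json

# Dimensions taken from the SignatureDef above
BATCH_SIZE, NUM_POINTS, NUM_FEATURES = 8, 4096, 9

# Toy stand-in for current_data[start_idx:end_idx, :, :]
batch = [[[0.0] * NUM_FEATURES] * NUM_POINTS] * BATCH_SIZE

payload = {
    "inputs": {
        "model_input": batch,   # shape (8, 4096, 9), as in the signature
        "is_training": False,   # bare scalar, matching shape ()
    }
}
body = json.dumps(payload)
# e.g. requests.post('http://localhost:8501/v1/models/<name>:predict', data=body)
```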
@sam-huang-1223,
DT_BOOL is supported by the RESTful API. Please try bare true or false values as shown in JSON Mapping and let us know if you still face any issues. Thank you!
Closing this due to inactivity. Please take a look into the answers provided above, feel free to reopen and post your comments(if you still have queries on this). Thank you!