
REST API - bool being interpreted as scalar in POST json

sam-huang-1223 opened this issue · 9 comments

Bug Report

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS Big Sur 11.0.1
  • TensorFlow Serving installed from (source or binary): docker image
  • TensorFlow Serving version: tensorflow/serving:latest

Describe the problem

Attempting to POST to the REST API produces an unexpected error:

{
    "error": "The second input must be a scalar, but it has shape [1]\n\t"
}

Based on https://github.com/tensorflow/serving/issues/1199, it seems the boolean input is expected to be a scalar but arrives at the model with shape [1]. We have tried several approaches, including varying the JSON serialization (e.g. "false" as a string), but to no avail. If we send the data payload without json.dumps(), we get

{
    "error": "JSON Parse error: Invalid value. at offset: 0"
}

So it seems the API forces us to serialize with json.dumps(), yet mis-parses the boolean value once we do?
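For reference, Python's json module already emits the lowercase JSON literal the API expects, so the serialization step itself appears correct:

import json

print(json.dumps({"is_train": False}))  # prints {"is_train": false}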

Exact Steps to Reproduce

  1. Run TensorFlow Serving with Docker.
  2. Make the POST request below.

The image variable is an ndarray that conforms to the model's signature definition:

import json

import requests

data = {
    "instances": [
        {
            "input_image": image.tolist(),
            "is_train": False,
        }
    ]
}

response = requests.post(
    "http://localhost:8501/v1/models/model/versions/1:predict",
    data=json.dumps(data),
)
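For completeness, the REST API also accepts a columnar "inputs" format alongside the row-oriented "instances" format. A sketch of the same request in that form, reusing the image variable from above (we have not verified whether this sidesteps the scalar/[1] mismatch; note that the batch dimension must be written out explicitly):

import json

import requests

data = {
    "inputs": {
        "input_image": [image.tolist()],  # batch dimension is explicit in columnar format
        "is_train": False,                # a bare scalar, not one value per instance
    }
}

response = requests.post(
    "http://localhost:8501/v1/models/model/versions/1:predict",
    data=json.dumps(data),
)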

Source code / logs

saved_model_cli output

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input_image'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 256, 256, 3)
        name: image:0
    inputs['is_train'] tensor_info:
        dtype: DT_BOOL
        shape: unknown_rank
        name: Placeholder:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_heatmaps'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 64, 64, 40)
        name: concat_6:0
    outputs['output_logits'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 40)
        name: concat_4:0
    outputs['output_probs'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 40)
        name: concat_5:0
  Method name is: tensorflow/serving/predict

Any guidance in addressing this issue would be appreciated. Thank you in advance.

sam-huang-1223 · Dec 05 '20

@sam-huang-1223, can you please share the code that builds your SignatureDef, in particular the boolean input, so that we can reproduce the issue? Thanks!

rmothukuru · Dec 07 '20

Hi @rmothukuru, thanks for taking the time to respond. What code are you referring to? All the code for the API call is there.

  • Code for generating the image: it is just a regular image that has been pre-processed and standardized to (256, 256, 3).
  • Command for producing the SignatureDef output: saved_model_cli show --dir ~/model/cpu/1 --tag_set serve --signature_def serving_default --all. This is the only SignatureDef defined in the model.

sam-huang-1223 · Dec 07 '20

@sam-huang-1223, the code I am referring to is the model-building code that defines the input and output signatures, especially the boolean-type input signature. Thanks!

rmothukuru · Dec 08 '20

@rmothukuru The code for defining the input and output signatures is as follows:

import tensorflow as tf
import tensorflow.saved_model as tfs
from tensorflow.saved_model.utils import build_tensor_info

input_image = tf.placeholder(tf.float32, [None, 256, 256, 3])
is_train = tf.placeholder(tf.bool)  # no shape given, so the signature exports unknown_rank

...  # build the graph and train the model

inputs, outputs = {}, {}
inputs["input_image"] = build_tensor_info(input_image)
inputs["is_train"] = build_tensor_info(is_train)
...  # similar definitions for outputs

sig_def = tfs.signature_def_utils.build_signature_def(
                inputs=inputs,
                outputs=outputs,
                method_name="tensorflow/serving/predict"
                )

saver = tf.train.Saver(set(tf.global_variables()))
builder = tfs.builder.SavedModelBuilder(export_dir)

serving_key = tfs.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY
serving_tag = tfs.tag_constants.SERVING

builder.add_meta_graph_and_variables(
            sess,  # session with the model graph
            [serving_tag],
            signature_def_map={serving_key: sig_def},
            clear_devices=False,
            saver=saver)

builder.save()
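One hypothetical variation, untested here: giving the boolean placeholder an explicit empty shape pins the exported signature to shape: () rather than unknown_rank, which matches what the error message says the downstream op expects:

# declare the flag as a true scalar so the SignatureDef exports shape: ()
is_train = tf.placeholder(tf.bool, shape=[], name="is_train")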

suryatejadev · Dec 08 '20

@rmothukuru @minglotus-6 Any update on this issue? It is blocking us at the moment.

sam-huang-1223 · Dec 11 '20

@rmothukuru @minglotus-6 For additional reference, the SavedModel in question was produced by an older TensorFlow version, 1.13.1.

sam-huang-1223 · Dec 16 '20

Hmm, I hit the same issue on version 1.13.1.

icelighting · Jan 26 '21

Any updates? I have the same issue; how do I pass a bool input?

SurferZergy · Mar 17 '21

I am also having this issue, except my signature def is:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['is_training'] tensor_info:
        dtype: DT_BOOL
        shape: ()
        name: Placeholder_2:0
    inputs['model_input'] tensor_info:
        dtype: DT_FLOAT
        shape: (8, 4096, 9)
        name: Placeholder:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['model_output'] tensor_info:
        dtype: DT_FLOAT
        shape: (8, 4096, 13)
        name: Softmax:0
  Method name is: tensorflow/serving/predict

Predicting on the model in the session produces the correct output:

# ops, current_data, is_training, BATCH_SIZE and sess come from the surrounding training script
# for batch_idx in range(num_batches):
batch_idx = 0
start_idx = batch_idx * BATCH_SIZE
end_idx = (batch_idx + 1) * BATCH_SIZE
cur_batch_size = end_idx - start_idx

feed_dict = {
    ops['pointclouds_pl']: current_data[start_idx:end_idx, :, :],
    ops['is_training_pl']: is_training
}

pred_val = sess.run(
    [ops['pred_softmax']],
    feed_dict=feed_dict
)
print(pred_val[0].shape)

(8, 4096, 13)
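For what it's worth, the gRPC API avoids JSON parsing entirely, since the request carries TensorProtos with explicit shapes. A minimal sketch against this signature, assuming the container also publishes the default gRPC port 8500 and serves the model under the name "model" (both assumptions), and reusing current_data from the snippet above:

import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "model"  # assumed model name
request.model_spec.signature_name = "serving_default"
request.inputs["model_input"].CopyFrom(
    tf.make_tensor_proto(current_data[0:8, :, :], dtype=tf.float32))
request.inputs["is_training"].CopyFrom(
    tf.make_tensor_proto(False, dtype=tf.bool))  # a true scalar, shape ()

response = stub.Predict(request, 10.0)  # 10-second timeout
print(response.outputs["model_output"].tensor_shape)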

GdMacmillan · Mar 26 '21

@sam-huang-1223,

DT_BOOL is supported in the RESTful API. Please try the JSON literals true or false as shown in the JSON Mapping documentation and let us know if you face any issues. Thank you!
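A minimal sketch of applying that suggestion with the payload from the original report, letting requests serialize the body so the bare false literal is emitted automatically:

import requests

data = {"instances": [{"input_image": image.tolist(), "is_train": False}]}

response = requests.post(
    "http://localhost:8501/v1/models/model/versions/1:predict",
    json=data,  # requests serializes Python False to the JSON literal false
)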

singhniraj08 · Jan 31 '23

Closing this due to inactivity. Please take a look at the answers provided above, and feel free to reopen and post your comments if you still have queries on this. Thank you!

singhniraj08 · Feb 17 '23