Add Wide & Deep model support for TensorFlow Serving?
When I tried to serve the model with TensorFlow Serving, I found it is not supported, per the link.
So I tried to write the client myself and encountered a similar issue after deploying the model to TF Serving.
Are there any plans to add Wide & Deep model support for TensorFlow Serving?
@0400H We have a TF serving client for Wide & Deep large dataset here: run_tf_serving_client.py.
It's used as part of our Kubernetes pipeline that does Wide & Deep large dataset FP32 training (which exports a saved_model.pb) and serves the model, but it should work outside of kubernetes as well if you already have a saved model that you're serving. Some documentation on running the client can be found here. Does this help?
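As a side note, besides the gRPC client above, TF Serving also exposes a REST predict endpoint, which can be exercised with nothing but the Python standard library. A minimal sketch of building the request body is below; the feature names and model name are hypothetical placeholders, not the actual Wide & Deep input signature — inspect your exported saved_model.pb with `saved_model_cli show --dir <export_dir> --all` to find the real input names.

```python
import json

def build_predict_payload(instances):
    """Build the JSON body for TF Serving's REST predict endpoint.

    The resulting string would be POSTed to
    http://<host>:8501/v1/models/<model_name>:predict
    (model name and port are deployment-specific).
    """
    return json.dumps({"instances": instances})

# Hypothetical example records -- replace the keys with the input
# names from your saved model's serving signature.
payload = build_predict_payload([
    {"age": 35, "education": "Bachelors"},
    {"age": 50, "education": "Masters"},
])

# To actually send it (requires a running TF Serving instance):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:8501/v1/models/wide_deep:predict",
#       data=payload.encode("utf-8"),
#       headers={"Content-Type": "application/json"},
#   )
#   response = json.load(urllib.request.urlopen(req))
```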
@dmsuehir
I got it working through the following links: wide_deep and wide_deep_fp32_pretrained_model.tar.gz.
However, it references a lot of old source code from other sites, which makes for a rather ugly implementation.
@0400H: do you still need help with this issue?