Zhenxi Zhou
/kind bug

Opening this issue based on a conversation in Slack: https://kubeflow.slack.com/archives/CH6E58LNP/p1659980003450359

**What steps did you take and what happened:**
When trying to create a gRPC InferenceService following the https://kserve.github.io/website/0.8/modelserving/v1beta1/tensorflow/#create-the-grpc-inferenceservice guide, and...
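For context, a minimal sketch of the kind of gRPC Predict call the linked guide walks through. The model name, input tensor name, signature, host, port, and host-header routing below are placeholder assumptions for a typical KServe TensorFlow setup, not details taken from this (truncated) report; the real values depend on the exported SavedModel and the cluster's ingress configuration.

```python
# Sketch of a gRPC Predict call against a KServe TensorFlow predictor.
# MODEL_NAME, INGRESS_HOST, INGRESS_PORT, SERVICE_HOSTNAME, and the input
# tensor name "image_bytes" are assumptions; adjust them to your deployment.
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

MODEL_NAME = "flower-grpc"
INGRESS_HOST = "localhost"
INGRESS_PORT = 8080
SERVICE_HOSTNAME = "flower-grpc.default.example.com"

# Open a plaintext channel to the ingress gateway.
channel = grpc.insecure_channel(f"{INGRESS_HOST}:{INGRESS_PORT}")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build a TF Serving PredictRequest carrying the raw image bytes.
with open("flower.jpg", "rb") as f:
    image_bytes = f.read()

request = predict_pb2.PredictRequest()
request.model_spec.name = MODEL_NAME
request.model_spec.signature_name = "serving_default"
request.inputs["image_bytes"].CopyFrom(
    tf.make_tensor_proto([image_bytes], shape=[1])
)

# A host metadata entry is commonly used so the gateway can route the call
# to the right InferenceService; adapt this to your ingress setup.
result = stub.Predict(request, timeout=10, metadata=(("host", SERVICE_HOSTNAME),))
print(result)
```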
### 📚 The doc issue

https://github.com/pytorch/serve/tree/master/kubernetes/kserve#running-torchserve-inference-service-in-kserve-cluster

Under this heading, the sentence

> When we make an Inference Request, in Torchserve it makes use of port 8080, whereas on the KServe...
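For reference, a small sketch contrasting the two request paths the quoted sentence touches on: TorchServe's native inference API (default port 8080, `/predictions/<model>`) versus the KServe v1 protocol endpoint (`/v1/models/<model>:predict`) exposed through the cluster ingress. The model name, host names, and ingress port here are placeholder assumptions, not values from the linked README.

```python
# Hedged sketch of the two inference paths; "mnist", the hostnames, and the
# ingress port are placeholders and will differ in a real deployment.
import json
import requests

MODEL_NAME = "mnist"
payload = json.dumps({"instances": [{"data": "<base64-encoded input>"}]})
headers = {"Content-Type": "application/json"}

# 1) Standalone TorchServe: the native inference API listens on port 8080.
torchserve_url = f"http://localhost:8080/predictions/{MODEL_NAME}"
resp = requests.post(torchserve_url, headers=headers, data=payload)
print("TorchServe:", resp.status_code)

# 2) Through KServe: the request goes to the cluster ingress on its own port
#    and uses the KServe v1 protocol path; the Host header (the assumed
#    InferenceService hostname) routes it to the right predictor.
INGRESS_HOST, INGRESS_PORT = "localhost", 80  # depends on the gateway setup
kserve_url = f"http://{INGRESS_HOST}:{INGRESS_PORT}/v1/models/{MODEL_NAME}:predict"
kserve_headers = dict(headers, Host="torchserve.default.example.com")
resp = requests.post(kserve_url, headers=kserve_headers, data=payload)
print("KServe:", resp.status_code)
```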
Please make sure that this is a bug. As per our [GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md), we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:bug_template

**Describe the...