load model error
**What deployment mode are you using?** Kubernetes.
**What KubeFATE and FATE version are you using?** fate_version=v1.9.0 kubefate_version=v1.4.5
**What OS are you using for docker-compose or Kubernetes? Please also state the OS version.**
OS: CentOS 7.8
How to reproduce your problem: I used fate_flow_client.py to train a model; the error occurred when I tried to load the trained model.
The unexpected response:
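For anyone trying to reproduce this, here is a rough sketch of the train-then-load workflow described above, assuming the legacy `fate_flow_client.py` CLI; the file names are placeholders and the exact flags may differ between FATE releases, so check the docs for your version:

```python
# Illustrative sketch only: config file names are placeholders and the CLI
# flags may vary between FATE releases.
import subprocess

# 1. Submit the training job with a DSL and a runtime conf.
subprocess.run(
    ["python", "fate_flow_client.py", "-f", "submit_job",
     "-d", "train_dsl.json", "-c", "train_conf.json"],
    check=True,
)

# 2. After the job succeeds, load the trained model into FATE-Flow with a
#    publish config that references the job's model_id / model_version.
subprocess.run(
    ["python", "fate_flow_client.py", "-f", "load",
     "-c", "publish_load_model.json"],
    check=True,
)
```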

How do I deploy the model? Is there any relevant documentation? Thanks!
Please follow this document: https://github.com/FederatedAI/KubeFATE/tree/master/docker-deploy#deploy-model
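As a rough sketch of what that document describes: deployment is a "load" step followed by a "bind" step, each driven by a small JSON config. All party IDs, the model_id/model_version, and the Serving address below are placeholders, and the exact key names can vary between FATE versions, so prefer the config templates shipped with your release.

```python
# Sketch of publish_load_model.json / publish_bind_model.json contents,
# written as Python dicts and dumped to JSON. All values are placeholders.
import json

load_conf = {
    "initiator": {"party_id": "9999", "role": "guest"},
    "role": {"guest": ["9999"], "host": ["10000"], "arbiter": ["10000"]},
    "job_parameters": {
        # Take these two values from the output of the training job.
        "model_id": "arbiter-10000#guest-9999#host-10000#model",
        "model_version": "202209010000000000000",
    },
}

bind_conf = {
    "service_id": "my_service",      # the ID you will query FATE-Serving with
    "initiator": {"party_id": "9999", "role": "guest"},
    "role": {"guest": ["9999"], "host": ["10000"], "arbiter": ["10000"]},
    "job_parameters": {
        "model_id": "arbiter-10000#guest-9999#host-10000#model",
        "model_version": "202209010000000000000",
    },
    "servings": ["serving-ip:8000"],  # address of your FATE-Serving instance
}

with open("publish_load_model.json", "w") as f:
    json.dump(load_conf, f, indent=2)
with open("publish_bind_model.json", "w") as f:
    json.dump(bind_conf, f, indent=2)

# Then run, e.g.:
#   python fate_flow_client.py -f load -c publish_load_model.json
#   python fate_flow_client.py -f bind -c publish_bind_model.json
```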
@owlet42 When I train a hetero federated model and bind the service, the result returned during online inference is always `"score": 0.0`.

When I train a homo federated model, no score is returned during inference.

Same question! It always gives me 0 for hetero prediction and nothing for homo prediction.
Solved by:
For anyone who always gets a prediction score of 0:
"need_run" should be set to false for the evaluation module on the host side.