Sakari Poussa
When you try to run the OVMS on the container:

```bash
docker run --rm -v /opt/models/:/opt/ml:ro -p 9001:9001 -p 8001:8001 clearlinux/stacks-dlrs-mkl:v0.4.0 /workspace/scripts/serve.sh ie_serving model --model_path /opt/ml/resnet_V1_50 --model_name resnet50 --port 9001...
```

ie_serving fails with an import traceback:

```bash
root@0d038e7669f4/workspace # echo $PYTHONPATH
/usr/local/lib/openvino/inference_engine/:
root@0d038e7669f4/workspace # ie_serving
Traceback (most recent call last):
  File "/usr/bin/ie_serving", line 6, in <module>
    from ie_serving.main import main
  File "/usr/lib/python3.7/site-packages/ie_serving/main.py", line 22, in <module>
    from ie_serving.models.model_builder...
```
Please update to the latest released OpenVINO toolkit. The clearlinux/stacks-dlrs-mkl:v0.4.0 image is using the old version, 2019_R1.1.
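For reference, one way to check which OpenVINO release an image ships is to query the inference engine from inside the container. This is a minimal sketch, assuming the image lets you run an arbitrary command and that the openvino Python bindings import cleanly (which the traceback above suggests may not always hold):

```bash
# Minimal sketch: print the inference engine version reported for the CPU plugin.
# Assumes python3 and the openvino Python bindings are importable inside the image.
docker run --rm clearlinux/stacks-dlrs-mkl:v0.4.0 \
  python3 -c "from openvino.inference_engine import IECore; print(IECore().get_versions('CPU'))"
```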
The scripts in `./installer` could be enhanced / fixed with the following:

- [x] On Mac, don’t update brew every time the script is run (see the sketch below)
- [x] Add option to...
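As a rough illustration of the first item, the installer script could guard its `brew update` call behind an opt-out. `SKIP_BREW_UPDATE` is a hypothetical variable used only for this sketch, not an existing option of the scripts:

```bash
# Hypothetical opt-out: skip the slow `brew update` unless the caller wants it.
# SKIP_BREW_UPDATE is an illustrative name, not something the installer defines today.
if [ "${SKIP_BREW_UPDATE:-0}" != "1" ]; then
  brew update
fi
```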
You can only monitor cpu and memory (and maybe other) resources.requests. Can you add a feature to monitor the resources.limits as well?
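For context, both fields sit side by side in the pod spec, so a monitoring feature could read `.resources.limits` the same way it already reads `.resources.requests`. The sketch below just prints both for each container of a pod; `my-pod` is a placeholder name:

```bash
# Print requests and limits for every container in a pod; "my-pod" is a placeholder.
kubectl get pod my-pod -o jsonpath='{range .spec.containers[*]}{.name}{": requests="}{.resources.requests}{", limits="}{.resources.limits}{"\n"}{end}'
```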