GeneralJing

Results: 28 issues by GeneralJing

I am currently working on model conversion and deployment. When I get the ONNX model, I want to use it for inference. I know that I can use onnxruntime for...
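For reference, a minimal onnxruntime inference sketch. The names `run_onnx` and `model.onnx` are illustrative, and it assumes the exported model takes a single float32 input:

```python
import numpy as np

try:
    import onnxruntime as ort  # pip install onnxruntime
except ImportError:            # keep the sketch importable without the package
    ort = None

def run_onnx(model_path, input_array):
    """Run one inference pass and return the list of output arrays."""
    session = ort.InferenceSession(model_path, providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name  # assumes a single graph input
    return session.run(None, {input_name: input_array})

if __name__ == "__main__":
    # Hypothetical NCHW image input; replace the path with your own model.
    x = np.random.rand(1, 3, 224, 224).astype(np.float32)
    # outputs = run_onnx("model.onnx", x)
```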

question

Hello. Recently I saw a blog you wrote a long time ago. It is about converting a Keras model to a TensorFlow Serving model and deploying it. When I follow the blog, the...
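As background, a hedged sketch of exporting a Keras model in the SavedModel layout that TensorFlow Serving loads. The `export/1` path is illustrative, not from the blog; TF Serving expects a numeric version subdirectory under the model base path:

```python
try:
    import tensorflow as tf  # the sketch assumes TF 2.x with tf.keras
except ImportError:          # keep the sketch importable without the package
    tf = None

def export_for_serving(model, export_dir="export/1"):
    """Write a Keras model as a SavedModel under a version directory
    (here "1"), the layout TensorFlow Serving scans for."""
    tf.saved_model.save(model, export_dir)
    return export_dir
```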

Could you provide a Docker environment? When I compile locally, I get errors from environment mismatches, and after some fiddling I still couldn't get it to work.

```
File "examples/torchpruner/prune_by_class_bisenetv2.py", line 39, in <module>
    model, context = torchpruner.set_cut(model, result)
File "site-packages/torchpruner-0.0.1-py3.8.egg/torchpruner/model_pruner.py", line 71, in set_cut
File "site-packages/torchpruner-0.0.1-py3.8.egg/torchpruner/module_pruner/pruners.py", line 188, in set_cut
File "site-packages/torchpruner-0.0.1-py3.8.egg/torchpruner/module_pruner/pruners.py", line 60, in set_cut
File...
```

```
Traceback (most recent call last):
  File "train.py", line 4, in <module>
    from models.model_stages import BiSeNet
  File "/home/xxx/work/STDC-Seg/models/model_stages.py", line 12, in <module>
    from modules.bn import InPlaceABNSync as BatchNorm2d
  File "/home/xxxx/work/STDC-Seg/modules/__init__.py", line 1, in...
```

I ran the demo on an iPhone 6s and measured the model's inference time at about 120 ms. Is that expected?

```
let startTime = CFAbsoluteTimeGetCurrent()
try imageRequestHandler.perform(self.requests)
let...
```
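120 ms on an iPhone 6s is plausible for a vision model, but a single CFAbsoluteTimeGetCurrent sample is noisy; averaging over warmed-up runs gives a steadier number. A language-neutral sketch of that idea in Python (`infer` stands for whatever callable runs the model):

```python
import statistics
import time

def time_inference(infer, warmup=3, runs=20):
    """Return (mean_ms, stdev_ms) wall-clock latency of `infer`,
    after `warmup` untimed calls to let caches and threads settle."""
    for _ in range(warmup):
        infer()
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        infer()
        samples.append((time.perf_counter() - t0) * 1000.0)  # seconds -> ms
    return statistics.mean(samples), statistics.stdev(samples)
```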

Recently, I wanted to analyze model inference time. I can see the most time-consuming node ops in the layer details with some tools, but the node name...
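If the model runs through onnxruntime, its built-in profiler writes a JSON trace with each node's name, op type, and duration, which can be matched back to the graph. A hedged sketch (`model.onnx` and `profile_model` are illustrative; assumes onnxruntime is installed):

```python
try:
    import onnxruntime as ort  # pip install onnxruntime
except ImportError:            # keep the sketch importable without the package
    ort = None

def profile_model(model_path, feed, runs=10):
    """Run the model with profiling enabled and return the path of the
    JSON trace file (viewable in chrome://tracing), which records each
    node's name, op type, and per-run duration."""
    opts = ort.SessionOptions()
    opts.enable_profiling = True
    session = ort.InferenceSession(model_path, opts)
    for _ in range(runs):
        session.run(None, feed)  # feed maps input names to arrays
    return session.end_profiling()
```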

question