Francisco Facioni

Results: 25 issues by Francisco Facioni

[Bower](http://twitter.github.com/bower/) uses tags to include packages; please create one so this plugin can be used with that manager. It is probably a good idea to avoid the 1.0 folder too.

It needs some refactoring, but this is good enough already.

I'm trying to follow the documentation for when Magnum is externally built and installed, but I can't get it to work. Magnum compiles just fine, but then it doesn't copy all...

### Description

When TBB_USE_EXCEPTIONS is disabled, task_dispatcher fails to link.

```
ld.lld: error: undefined symbol: tbb::detail::r1::do_throw_noexcept(void (*)())
>>> referenced by task_dispatcher.h:358 (/tmp/apg/external/tbb/src/tbb/task_dispatcher.h:358)
>>> arena.cpp.o:(tbb::detail::d1::task* tbb::detail::r1::task_dispatcher::local_wait_for_all(tbb::detail::d1::task*, tbb::detail::r1::outermost_worker_waiter&)) in archive ../../gnu_9.3_cxx17_64_debug/libtbb_debug.a
>>>...
```

**Description**

Using the onnxruntime backend with TensorRT and the engine cache via `load_model` causes the TensorRT cache to be regenerated every time.

**Triton Information**

Triton container nvcr.io/nvidia/tritonserver:22.06-py3

**To Reproduce**

config file...

investigating
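
For context on the engine-cache issue above, explicit loading through Triton's model-control API looks roughly like the sketch below. This is a hedged illustration assuming the standard `tritonclient` HTTP client and a hypothetical model name, not the exact reproduction steps from the issue.

```python
# Minimal sketch (assumed setup): Triton started with explicit model control,
# then a model is loaded through the model-control API. This is the path where
# the TensorRT engine cache is reportedly rebuilt on every load.
# The model name "my_onnx_trt_model" is hypothetical.
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Each explicit load re-initializes the onnxruntime/TensorRT backend for this
# model; the report is that the cached engine is not reused across loads.
client.load_model("my_onnx_trt_model")
client.unload_model("my_onnx_trt_model")
client.load_model("my_onnx_trt_model")  # expected: reuse engine cache; observed: rebuild
```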

**Description**

When using dynamically loaded models via the load model API, the ensemble will not pick them up.

```
I0712 13:09:16.608657 1 model_repository_manager.cc:843] AsyncUnload() 'resize'
I0712 13:09:16.608668 1 model_repository_manager.cc:1136] TriggerNextAction()...
```

bug
investigating
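
A rough sketch of the ensemble scenario described above, assuming the standard `tritonclient` HTTP client: the composing models are loaded explicitly, but the ensemble reportedly does not resolve them. Only "resize" appears in the original log; the other model names here are hypothetical.

```python
# Sketch of the reported scenario: composing models are loaded through the
# model-control API, then the ensemble that references them is loaded.
# "resize" comes from the issue's log; "classifier" and "ensemble_pipeline"
# are hypothetical placeholders.
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Load the models the ensemble depends on.
client.load_model("resize")
client.load_model("classifier")

# Loading the ensemble afterwards reportedly fails to pick up the
# dynamically loaded composing models.
client.load_model("ensemble_pipeline")
```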

**Describe the solution you'd like**

Be able to just use an ONNX file in the tensorrt backend, given that TensorRT has an ONNX parser. It would build the engine on...
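
For reference, TensorRT already exposes an ONNX parser, so building an engine from an ONNX file is straightforward outside Triton. A minimal sketch of roughly what the requested backend behaviour would automate; file names are assumed and error handling is kept minimal.

```python
# Minimal sketch: build a serialized TensorRT engine from an ONNX file using
# the TensorRT Python API. "model.onnx" and "model.plan" are placeholder names.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

# Parse the ONNX model into the TensorRT network definition.
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError("failed to parse ONNX model")

# Build and serialize the engine.
config = builder.create_builder_config()
engine_bytes = builder.build_serialized_network(network, config)

with open("model.plan", "wb") as f:
    f.write(engine_bytes)
```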

The adaptor gets moved around and the inner adaptor is no longer valid.