shimen

Results: 7 comments by shimen

Same problem here:

```
/ib/external/home/ilyas/torchTest/install/bin/luajit: ...lyas/torchTest/install/share/lua/5.1/threads/threads.lua:179: [thread 2 endcallback]
...lyas/torchTest/install/share/lua/5.1/threads/threads.lua:183: [thread 1 callback]
...e/ilyas/torchTest/install/share/lua/5.1/nn/Container.lua:67:
In 14 module of nn.Sequential:
In 3 module of nn.DepthConcat:
In 2 module of nn.Sequential:
...ome/ilyas/torchTest/install/share/lua/5.1/cudnn/init.lua:162: Error...
```

Any update on the labels for the CASIA-webface data set?

@himansh1314 @irexyc @grimoire I had the same issue in mmdeploy 0.13. You are right, the preprocessing conversion is wrong: the keep_ratio option is not converted correctly. This is a...
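
For context, here is a minimal sketch (plain NumPy/OpenCV, not mmdeploy or mmcv code; the function name and target sizes are purely illustrative) of the behavior keep_ratio is supposed to control in the Resize step. If the converted pipeline drops keep_ratio=True, it effectively falls into the else branch below and feeds the model stretched inputs.

```python
# Illustrative sketch only, not mmdeploy/mmcv code.
import cv2
import numpy as np


def resize(img: np.ndarray, target_h: int, target_w: int, keep_ratio: bool) -> np.ndarray:
    h, w = img.shape[:2]
    if keep_ratio:
        # Aspect-preserving rescale: fit inside (target_h, target_w) without distortion.
        scale = min(target_h / h, target_w / w)
        new_h, new_w = int(h * scale + 0.5), int(w * scale + 0.5)
    else:
        # Plain resize: stretch to exactly (target_h, target_w).
        new_h, new_w = target_h, target_w
    return cv2.resize(img, (new_w, new_h))  # cv2 takes (width, height)


img = np.zeros((480, 640, 3), dtype=np.uint8)
print(resize(img, 800, 1333, keep_ratio=True).shape)   # (800, 1067, 3) - ratio kept
print(resize(img, 800, 1333, keep_ratio=False).shape)  # (800, 1333, 3) - stretched
```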

@RunningLeon @irexyc @grimoire Can you confirm there is a bug in 0.13? Please see my previous comment.

I believe I have a similar problem.

**This is the original message:**

**This is after reconverting with telegramify_markdown:**

```python
import telegramify_markdown

text_markdownify = telegramify_markdown.markdownify(
    message.text,
    max_line_length=None,
    normalize_whitespace=False,
    latex_escape=True,
)
```

Here is...

@zerollzeng I have the same issue: https://github.com/open-mmlab/mmdeploy/issues/2204 **[06/21/2023-12:58:13] [TRT] [E] 2: [weightConvertors.cpp::quantizeBiasCommon::337] Error Code 2: Internal Error (Assertion getter(i) != 0 failed. )** Here is my full TRT log...