kaierlong
I built the .so in a Docker container; the Docker image is nvcr.io/nvidia/tritonserver:21.07-py3.
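In case it helps others reproduce this, here is a minimal sketch of starting a build container from that image; the volume mount and working directory are assumptions, not part of the original report:

```sh
# Start an interactive container from the Triton 21.07 image and mount the
# current checkout so the .so can be built inside it (paths are assumptions)
docker run --gpus all -it --rm \
    -v "$(pwd)":/workspace -w /workspace \
    nvcr.io/nvidia/tritonserver:21.07-py3 \
    /bin/bash
```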
@Taka152 Do you have a plan to support Triton Inference Server 2.x?
@neopro12 OK. Thank you for the reply.
@neopro12 How is the development of Triton 2.x support going? If it's already finished, I will give it a try and test it.
When I set NO_ICONV=1 and rebuild, I get the same error as above. Any help?
@nuance1979
1. First I ran `make cleanest` in srilm and `make clean` in srilm-python.
2. Then I added `NO_ICONV=1` to the file `common/Makefile.machine.macosx` and rebuilt srilm with `make NO_ICONV=1 World`...
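For reference, a minimal sketch of the rebuild sequence described above, assuming the `srilm` and `srilm-python` checkouts sit side by side (the directory layout is an assumption):

```sh
# Clean both trees, then rebuild SRILM without iconv support
cd srilm && make cleanest
cd ../srilm-python && make clean
# after adding NO_ICONV=1 to common/Makefile.machine.macosx in the srilm tree:
cd ../srilm && make NO_ICONV=1 World
```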