n0thing233
@chrisella Did your use case involve a single model or multiple models? i.e., did you use a SageMaker multi-model endpoint or not? I'm facing exactly the same issue as @dean-cpi, what...
Just want to give an update so that it can help other people. I made it work by (see the sketch below):
1. setting model_store to "/"
2. setting preload_model to false
3. being careful when...
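For context, here is a minimal sketch of what those two settings might look like, assuming they correspond to TorchServe config.properties keys as used by the SageMaker inference toolkit; the file contents and placement are illustrative, not the commenter's actual setup.

```properties
# Hypothetical config.properties sketch (assumed TorchServe settings)
# Point the model store at the filesystem root, as described above
model_store=/
# Disable model preloading, per the update above
preload_model=false
```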
It would be helpful to show how these Triton images are created (the Dockerfiles). Not all ML models can use these images out of the box, so people might need...
Any update on this? I'm trying to achieve a similar thing. It would be great to have this toolkit support multiple models, with each model having its own inference code.
Any plan to work on this in your 2022 roadmap?
Never mind, it is working after I disabled my VPN. Hmm, curious where this server is hosted..
@siddvenk, are the Dockerfiles or the image-building process for the LMI containers open-sourced? I'd appreciate it if you could share them. If not, I think I'll need to build an image on top of...