inv1s10n
Is there a solution to this problem? I am currently experiencing this issue in the latest version as well.
I found that there is no mention of vLLM in the official docs. Is vLLM not well supported?
An A100, on Linux. I'm not sure whether this is a problem with the model or with the startup parameters, or whether it needs other plugins.
This is bad news. Is there a link to this issue, and is there a plan to address it?
Thank you, but I am not using Ollama. When I started the model with vLLM, I set max-model-len to 40960.
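For context, a minimal sketch of that setting via vLLM's offline Python API (the model name below is a placeholder, not the actual model from this thread; `max_model_len` mirrors the `--max-model-len` startup flag):

```python
# Minimal sketch, assuming vLLM's offline inference API.
# "<model-name>" is a placeholder for the model being served.
from vllm import LLM, SamplingParams

llm = LLM(model="<model-name>", max_model_len=40960)  # same limit as --max-model-len
out = llm.generate(["Hello"], SamplingParams(max_tokens=64))
print(out[0].outputs[0].text)
```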
I tried the following methods, and they seem to be effective.
I have had no issues using it recently. My vLLM version is 0.14.0rc1.
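If anyone wants to compare against their own setup, a quick way to confirm the installed version (assuming vLLM is installed):

```python
# Print the installed vLLM version to compare against 0.14.0rc1.
import vllm
print(vllm.__version__)
```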