
Ollama API error

Open · Mrship12138 opened this issue 11 months ago · 2 comments

Dear Sir:

```
LiteLLM completion() model= deepseek-coder-v2:16b; provider = ollama
2025-02-18 17:29:13,281 - LiteLLM - INFO - LiteLLM completion() model= deepseek-coder-v2:16b; provider = ollama
2025-02-18 17:29:26,456 - httpx - INFO - HTTP Request: POST http://localhost:11434/api/generate "HTTP/1.1 500 Internal Server Error"

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
```

(screenshot of the error omitted)
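For context, the failing request can be reproduced outside Data Formulator with a minimal LiteLLM sketch; the model name and endpoint below are taken from the log above, and this assumes `litellm` is installed and Ollama is serving locally:

```python
# Minimal repro of the call that fails above: LiteLLM routes "ollama/..." models
# to the local Ollama server's /api/generate endpoint.
import litellm

response = litellm.completion(
    model="ollama/deepseek-coder-v2:16b",   # model name from the log above
    messages=[{"role": "user", "content": "Say hello."}],
    api_base="http://localhost:11434",      # default local Ollama endpoint
)
print(response.choices[0].message.content)
```

If this standalone call also returns a 500, the problem is between LiteLLM and Ollama rather than in Data Formulator itself.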

Mrship12138 · Feb 18, 2025

Hello! I'm able to run the 16b version of deepseek-coder-v2 with the following configuration:

(screenshot of the model configuration omitted)

I loaded the 16b model with Ollama using `ollama run deepseek-coder-v2`, so I use `deepseek-coder-v2` in the model configuration.
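If you are unsure which name to enter, one quick way to list the model tags your local Ollama server actually has is to query its `/api/tags` endpoint; a small sketch (requires the `requests` package):

```python
# List the model names the local Ollama server knows about, so the name in
# Data Formulator's model configuration matches one of them exactly.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])   # e.g. "deepseek-coder-v2:latest"
```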

Chenglong-MS · Feb 18, 2025

Actually, I also encounter this issue every now and then. I debugged it a little and found that it is an issue with Ollama itself:

```
Error: an error was encountered while running the model: unexpected EOF
```

This looks like an out-of-memory (OOM) issue during inference. I think the solution is either to use a smaller model or to run inference on a GPU.
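If the 500 really is Ollama running out of memory, one stopgap is to fall back to a smaller model when the large one fails. A minimal sketch, not part of Data Formulator; the fallback model name is only an example and must be pulled with Ollama first:

```python
# Retry with a smaller local model when the 16b model crashes (e.g. OOM -> 500).
import litellm

API_BASE = "http://localhost:11434"

def complete_with_fallback(messages,
                           primary="ollama/deepseek-coder-v2:16b",
                           fallback="ollama/qwen2.5-coder:7b"):  # example smaller model
    try:
        return litellm.completion(model=primary, messages=messages, api_base=API_BASE)
    except Exception as err:  # LiteLLM raises when /api/generate returns the 500
        print(f"{primary} failed ({err}); retrying with {fallback}")
        return litellm.completion(model=fallback, messages=messages, api_base=API_BASE)

resp = complete_with_fallback([{"role": "user", "content": "Say hello."}])
print(resp.choices[0].message.content)
```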

Chenglong-MS · Feb 18, 2025