Hi, when I tried to connect to Ollama with version 0.9.3, I got the following error:
litellm.APIConnectionError: Unable to parse ollama chunk - {'model': 'gpt-oss:20b', 'created_at': '2025-08-10T08:23:05.6826255Z', 'response': '', 'thinking': 'We', 'done': False}
Traceback (most recent call last):
  File "/home/jintao/miniconda/envs/AZ093/lib/python3.12/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1659, in __anext__
    async for chunk in self.completion_stream:
  File "/home/jintao/miniconda/envs/AZ093/lib/python3.12/site-packages/litellm/llms/base_llm/base_model_iterator.py", line 128, in __anext__
    chunk = self._handle_string_chunk(str_line=str_line)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jintao/miniconda/envs/AZ093/lib/python3.12/site-packages/litellm/llms/ollama/completion/transformation.py", line 424, in _handle_string_chunk
    return self.chunk_parser(json.loads(str_line))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/jintao/miniconda/envs/AZ093/lib/python3.12/site-packages/litellm/llms/ollama/completion/transformation.py", line 465, in chunk_parser
    raise e
  File "/home/jintao/miniconda/envs/AZ093/lib/python3.12/site-packages/litellm/llms/ollama/completion/transformation.py", line 463, in chunk_parser
    raise Exception(f"Unable to parse ollama chunk - {chunk}")
Exception: Unable to parse ollama chunk - {'model': 'gpt-oss:20b', 'created_at': '2025-08-10T08:23:05.6826255Z', 'response': '', 'thinking': 'We', 'done': False}
This seems to be a LiteLLM issue: https://github.com/BerriAI/litellm/issues/13333
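For reference, a minimal sketch of how to trigger it (assumes a local Ollama server on the default port 11434 with gpt-oss:20b pulled; adjust `api_base` and the model name for your setup):

```python
import litellm

# Streaming completion against a local Ollama instance.
response = litellm.completion(
    model="ollama/gpt-oss:20b",
    messages=[{"role": "user", "content": "Hello"}],
    api_base="http://localhost:11434",
    stream=True,
)

# Iterating the stream is what invokes LiteLLM's chunk parser; with the
# affected version it raises as soon as Ollama emits a chunk whose
# 'response' is empty but whose 'thinking' field has content.
for chunk in response:
    print(chunk.choices[0].delta.content or "", end="")
```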
Hey guys, it looks like LiteLLM has fixed this issue upstream. Could you check whether it works with the new version and report back? https://github.com/BerriAI/litellm/pull/13375
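After upgrading (e.g. `pip install -U litellm`), you can confirm which LiteLLM release you're actually running with something like this (the exact release that includes the PR isn't named in the thread, so check its changelog):

```python
# Print the installed LiteLLM version; it needs to be a release that
# contains the fix from PR #13375.
from importlib.metadata import version

print(version("litellm"))
```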
It seems the Docker container needs to be updated with the latest LiteLLM.