I can't fix this issue.
Issue
Traceback (most recent call last):
  File "C:\Users\ignac\AppData\Local\Programs\Python\Python310\lib\site-packages\aider\coders\base_coder.py", line 863, in send_new_user_message
    yield from self.send(messages, functions=self.functions)
  File "C:\Users\ignac\AppData\Local\Programs\Python\Python310\lib\site-packages\aider\coders\base_coder.py", line 1124, in send
    yield from self.show_send_output_stream(completion)
  File "C:\Users\ignac\AppData\Local\Programs\Python\Python310\lib\site-packages\aider\coders\base_coder.py", line 1198, in show_send_output_stream
    for chunk in completion:
  File "C:\Users\ignac\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\llms\ollama.py", line 370, in ollama_completion_stream
    raise e
  File "C:\Users\ignac\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\llms\ollama.py", line 329, in ollama_completion_stream
    status_code=response.status_code, message=response.text
  File "C:\Users\ignac\AppData\Local\Programs\Python\Python310\lib\site-packages\httpx\_models.py", line 576, in text
    content = self.content
  File "C:\Users\ignac\AppData\Local\Programs\Python\Python310\lib\site-packages\httpx\_models.py", line 570, in content
    raise ResponseNotRead()
httpx.ResponseNotRead: Attempted to access streaming response content, without having called read().
Version and model info
Model: llama3.1:70b (via Ollama)