Shibo Zhang
Hello team, I tried using the local llava model on the pre-release branch, but unfortunately got this error. Did I miss anything here? Thanks @Mac0q. This is my model version:

```json
{
  "name": "llava:latest",
  "model": "llava:latest",
  "modified_at": "2024-05-20T13:50:45.2323374-07:00",
  "size": 4733363377,
  "digest": "8dd30f6b0cb19f555f2c7a7ebda861449ea2cc76bf1f44e262931f45fc81d081",
  "details": {
    "parent_model": "",
    "format": "gguf",
    "family": "llama",
    "families": ["llama", "clip"],
    "parameter_size": "7B",
    "quantization_level": "Q4_0"
  },
  "expires_at": "0001-01-01T00:00:00Z"
}
```
@oldlastman Has your issue been resolved? I ran into a connection error when using ollama:

```
warnings.warn(
Traceback (most recent call last):
  File "C:\Users\zhanshib\AppData\Roaming\Python\Python312\site-packages\httpx\_transports\default.py", line 69, in map_httpcore_exceptions
    yield
  File "C:\Users\zhanshib\AppData\Roaming\Python\Python312\site-packages\httpx\_transports\default.py",...
```
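In case it helps with debugging: since the traceback comes from httpx failing to open a connection, a quick sanity check is to hit the Ollama server directly, independent of this project. A minimal sketch, assuming Ollama is running locally on its default port 11434 (adjust the host/port if you changed them):

```python
# Sanity check: can we reach the local Ollama server at all?
# /api/tags is Ollama's endpoint for listing locally available models.
import httpx

try:
    resp = httpx.get("http://localhost:11434/api/tags", timeout=5.0)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is reachable; local models:", models)
except httpx.ConnectError:
    print("Could not connect -- is `ollama serve` running on this port?")
```

If this also raises a connection error, the Ollama server itself isn't reachable (e.g. `ollama serve` isn't running, or it's bound to a different port), and the problem is upstream of the llava configuration.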