Marvin
Is it still not fixed? I am using react 18.2.0 and react-dom 18.2.0 with this line in my package.json: `"proxy": "http://localhost:3000"`, and I am getting the same error as OP....
No, the slow loading is due to WSL. Now I am using Ollama on native Windows. The loading time has improved, but it is still not very fast. I would...
My fault. Now in Postman I can successfully call the models by using the IP and port. But I am still facing the same problems with the WebUI. Tried different...
Yes, I am using `OLLAMA_ORIGINS=*`; also tried with http. The browser console logs once in a while: `401 (Unauthorized)`
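For reference, a sketch of how the env variables might be set so the WebUI can reach Ollama from another machine (running Ollama natively on Windows, as mentioned above; the host/port values are examples, not taken from this thread):

```shell
# Hypothetical setup: make Ollama listen on all interfaces (default port 11434)
# and allow cross-origin requests from the WebUI.
setx OLLAMA_HOST "0.0.0.0:11434"
setx OLLAMA_ORIGINS "*"
# Restart Ollama afterwards so the new variables are picked up.
```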
OMG, now it's working. Changed `http://192.168.178.23/api` to `http://192.168.178.23:11434/api`. I did that once before, but back then I hadn't set the env variables. Thanks a lot. I should have gone to bed earlier.
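A quick way to verify that kind of fix (the IP is the one from the thread; 11434 is Ollama's default port, and `/api/tags` just lists the installed models):

```shell
# Without the port, the request goes to port 80, where nothing answers.
# With the port, Ollama should respond with a JSON list of models:
curl http://192.168.178.23:11434/api/tags
```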
Now it's working with `docker run`, but with Compose I get no answer when asking the model a question (it loads forever), and in the browser console I am getting: `Uncaught...
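A common cause is a Compose file that doesn't expose the same port the `docker run` command did. A minimal sketch of what the working setup might look like as a Compose service (service name, image tag, and values are assumptions, not from this thread):

```yaml
# Hypothetical docker-compose.yml fragment for the Ollama backend.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"   # must match the port the WebUI calls
    environment:
      - OLLAMA_ORIGINS=*
    volumes:
      - ollama:/root/.ollama

volumes:
  ollama:
```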
Same for me. When the context gets too large from appending the history with every request, I get an empty response from time to time, in roughly 30% of responses. edit:...
Same. But it seems it is not caused by the length. I think some characters are causing this, but I am not sure.
After some days of idle I am still seeing that the GPU is not used anymore, and when entering `nvidia-smi` in the Ollama container I am facing: `Failed to initialize NVML: Unknown...
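In case it helps others: a common workaround sketch for this NVML error in long-running containers (the container name is an assumption) is simply restarting the container so the NVIDIA devices are re-initialized:

```shell
# Restart the affected container, then check the GPU is visible again.
docker restart ollama
docker exec ollama nvidia-smi
```

If it keeps recurring after idle periods, it may be the known NVIDIA Container Toolkit issue where a systemd daemon-reload drops the GPU device references; the toolkit's documentation describes cgroup-related workarounds for that.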