Publishing the ollama service port through a cloudflared Tunnel: client receives no messages
I use a cloudflared Tunnel to publish the ollama service port, and the client uses the enchanted-llm app for chat, but no messages come back. Without the cloudflared Tunnel everything works fine. Does ollama stream the dialogue over WebSocket?
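A direct curl through the tunnel takes the app out of the loop and shows whether the API itself responds; a minimal sketch, where the tunnel hostname and model name are placeholders:

```sh
# /api/generate streams its reply as JSON objects, one per line, in the
# HTTP response body; -N disables curl's buffering so the chunks print
# as they arrive. Replace the hostname and model with your own.
curl -N https://your-tunnel-hostname.example.com/api/generate \
  -d '{"model": "llama3", "prompt": "Hello"}'
```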
Did you use localhost/127.0.0.1, or did you expose 0.0.0.0 (your machine's local IPs) in the OLLAMA_HOST env variable?
For me, setting OLLAMA_HOST="0.0.0.0:PORT" with a desired port (5000 in my case) worked fine.
It took me a while to notice that I was exposing only localhost, not my machine's IP, so the redirection wasn't working.
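A minimal sketch of that setup, assuming ollama is launched from a shell (if it runs as a systemd service, set the variable in the unit's environment instead):

```sh
# Bind ollama to all interfaces on port 5000 instead of the default
# 127.0.0.1:11434, so the tunnel (or any non-local client) can reach it.
export OLLAMA_HOST="0.0.0.0:5000"
ollama serve

# Sanity check from another machine on the LAN (placeholder IP):
curl http://192.168.1.50:5000/api/tags
```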
In the Cloudflare config for the tunnel I had to set the service rule to HTTP. Now it works fine, though I still have to query the API over HTTPS instead of HTTP.
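A sketch of the equivalent quick-tunnel invocation; for a named tunnel, the ingress service in config.yml must likewise point at plain http://:

```sh
# The origin service is plain HTTP on localhost; Cloudflare terminates
# TLS at its edge, which is why clients still reach the API over HTTPS
# even though the tunnel's origin rule is HTTP.
cloudflared tunnel --url http://localhost:5000
```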
What settings did you use in Cloudflare? I'm running into a 524 (origin timeout) error.
As commented above, I just served it on 0.0.0.0 using port 5000.
It works normally after routing through a litellm proxy; a temporary workaround for now.
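A minimal sketch of that workaround, assuming litellm's proxy CLI and ollama on its default port (flags and the proxy's default port can differ between litellm versions):

```sh
# Put litellm's OpenAI-compatible proxy in front of ollama; clients
# then talk to the proxy instead of ollama's native API.
pip install 'litellm[proxy]'
litellm --model ollama/llama3 --api_base http://127.0.0.1:11434

# Query the OpenAI-style endpoint the proxy exposes (port 4000 is the
# default in recent litellm releases; adjust if yours differs):
curl http://127.0.0.1:4000/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"model": "ollama/llama3", "messages": [{"role": "user", "content": "Hello"}]}'
```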