
Using cloudflared Tunnels to publish the ollama service port, the client gets no messages back

Open online2311 opened this issue 2 years ago • 4 comments

I use cloudflared Tunnels to publish the ollama service port, and the client (the enchanted-llm app) gets no messages back from the dialogue. Without cloudflared Tunnels, everything works fine. Does ollama stream dialogue over WebSocket?

online2311 avatar Feb 29 '24 08:02 online2311

Did you use localhost/127.0.0.1, or did you expose 0.0.0.0 (all interfaces of your machine) in the OLLAMA_HOST env variable?

For me, setting OLLAMA_HOST="0.0.0.0:PORT" with a desired port (5000 in my case) worked fine.

It took me a while to notice that I was exposing only localhost, not my machine's IP, so the redirection wasn't working.
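A minimal sketch of the binding described above. Port 5000 and the LAN IP 192.168.1.10 are examples from/for illustration; by default ollama listens only on 127.0.0.1:11434, which a tunnel running against the machine's external address cannot reach.

```shell
# Bind ollama to all interfaces on port 5000 instead of the
# default 127.0.0.1:11434, so cloudflared can reach it.
OLLAMA_HOST="0.0.0.0:5000" ollama serve

# From another shell, verify the API answers on the machine's
# LAN IP (replace 192.168.1.10 with your machine's address):
curl http://192.168.1.10:5000/api/tags
```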

In the Cloudflare config for the tunnel I had to set the service type to HTTP. Now it works fine, but I still need to query the API with HTTPS instead of HTTP.
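A hedged sketch of what that tunnel config might look like. The tunnel UUID placeholder, hostname `ollama.example.com`, and port 5000 are assumptions for illustration, not values from this thread; the key point is that the local service is plain `http://`, while cloudflared terminates TLS at the edge, which is why clients must still call the public hostname over HTTPS.

```yaml
# ~/.cloudflared/config.yml -- hypothetical tunnel UUID and hostname
tunnel: <TUNNEL-UUID>
credentials-file: /home/user/.cloudflared/<TUNNEL-UUID>.json

ingress:
  # Route the public hostname to the locally bound ollama port over
  # plain HTTP; TLS is handled by Cloudflare in front of the tunnel.
  - hostname: ollama.example.com
    service: http://localhost:5000
  # Catch-all rule required by cloudflared
  - service: http_status:404
```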

sanlega avatar Mar 31 '24 18:03 sanlega

What settings did you use in Cloudflare? I'm getting a 524 (origin timeout) error.

thedavc avatar Apr 17 '24 07:04 thedavc

As mentioned above, I just served it on 0.0.0.0 using port 5000.

sanlega avatar Apr 17 '24 07:04 sanlega

As a temporary workaround, routing requests through a LiteLLM proxy works normally.
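A minimal sketch of that workaround, assuming a local ollama instance and the LiteLLM proxy CLI; the model name and port are illustrative, not taken from the thread. The tunnel would then point at the proxy port instead of ollama directly.

```shell
# Install the LiteLLM proxy and put it in front of the local
# ollama instance (model name and port are examples).
pip install 'litellm[proxy]'
litellm --model ollama/llama3 --port 8000
```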

online2311 avatar Apr 25 '24 02:04 online2311