KuhnChris

Results 61 comments of KuhnChris

Hey there, I took the liberty of putting this up on PyPI: https://pypi.org/project/mobi-python/#description

> Did you try playing around with these buffering options? https://caddyserver.com/docs/caddyfile/directives/reverse_proxy#streaming > > You might want to try setting the flush interval to -1. Yes, I played around with flush_interval,...

Hello @mholt - thanks for replying. I tried to fill in your template as well as possible; please let me know if I should investigate the images from Docker Hub...

You're welcome, and thanks for taking a peek. I'm not really a Go programmer, so please excuse my lack of knowledge here, but I followed that error pretty much down...

Hmm, so would it be a viable option to create a sort of mini app that keeps sending data to an endpoint (uwsgi) with the standard Go http lib...

OK, so I simplified all of this far more, so we have the following setup now:
- Caddy server with file_server on port 8881
- Caddy server reverse proxy 8880...
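As a Caddyfile, that setup might look roughly like this (ports taken from the comment; the root path and the upstream address are placeholders, since the rest of the comment is truncated):

```caddyfile
# Static files served directly
:8881 {
	root * /srv/static
	file_server
}

# Reverse proxy in front of the streaming app
:8880 {
	reverse_proxy localhost:9000
}
```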

Well, for low-latency streaming, that's the approach ffmpeg actually takes, so we could change it in our example script, but I doubt we can get ffmpeg itself to change this. There...

No problem, I'll try to explain: Low Latency Streaming basically means we try to push the data to the server as fast as technically possible. The response isn't really anything...

Yes, exactly, that would make sense. If there is no, or a negative, flush_interval, it would make sense that there is no buffering and it flushes directly through to...
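For reference, a minimal Caddyfile sketch of that behavior (port and upstream address are placeholders): setting flush_interval to -1 disables the periodic flush timer, so each chunk read from the upstream is written to the client immediately.

```caddyfile
:8880 {
	reverse_proxy localhost:9000 {
		# -1 = flush every chunk to the client immediately (no buffering delay)
		flush_interval -1
	}
}
```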

> @kuhnchris Could you please try the commit pushed to branch `ctxcanceled`? https://github.com/caddyserver/caddy/tree/ctxcanceled - [3ffc391](https://github.com/caddyserver/caddy/commit/3ffc39190d01d528562299fd97c3a75d2b3c1794) > > I was able to get the error to go away using the send_app.py...