Shavit
You need to implement it. See these lines for streaming and saving the video: https://github.com/infusion/jQuery-webcam/blob/master/src/jscam.as#L232 https://github.com/infusion/jQuery-webcam/blame/master/README.md#L38
You can use `bow.SetTransport` ([link](https://github.com/headzoo/surf/blob/master/browser/browser.go#L543)); see my [pull request](https://github.com/headzoo/surf/pull/56) or this example: https://github.com/shavit/surf/commit/0c66b3812d11ec568d753e0515c7fdc459791652#diff-fc939672748c63d9d3c671163467f8e5R556
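Rough Go sketch of what that looks like (the exact `SetTransport` parameter type is an assumption here; check browser.go in the links above for the real signature):

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"

	"github.com/headzoo/surf"
)

func main() {
	bow := surf.NewBrowser()

	// Swap in a custom transport, e.g. to tune TLS or timeout behavior.
	// An *http.Transport satisfies http.RoundTripper, so it should fit
	// either possible signature of SetTransport.
	bow.SetTransport(&http.Transport{
		TLSHandshakeTimeout:   10 * time.Second,
		ResponseHeaderTimeout: 15 * time.Second,
		TLSClientConfig:       &tls.Config{MinVersion: tls.VersionTLS12},
	})

	if err := bow.Open("https://example.com"); err != nil {
		panic(err)
	}
	fmt.Println(bow.Title())
}
```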
No, I just saw that this is missing from `RobertaProcessing`.
@dadatawajue I just added a pull request with this feature: https://github.com/headzoo/surf/pull/56
@zeozeozeo ~~This is the default model, but you can use other models with `-m MODEL_FILE` instead. The server will not start if the path is incorrect.~~
Sorry, I got this error from `server`, not `nitro`. The `nitro` server will start without a model argument; just make sure to use the absolute path to the file in...
The remote server backends will need an API key field, added as an authorization header, and a selected model name from a separate list. Since the model ID is used...
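FreeChat itself is Swift, but as a rough Go sketch of where those two pieces go in the request (the helper name, endpoint path, and model ID below are just placeholders):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// buildChatRequest is a hypothetical helper showing where the API key and
// the selected model name would go for an OpenAI-compatible remote backend.
func buildChatRequest(baseURL, apiKey, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(map[string]any{
		"model": model, // model ID picked from the separate model list
		"messages": []map[string]string{
			{"role": "user", "content": prompt},
		},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	// The API key from the new field goes into the Authorization header.
	req.Header.Set("Authorization", "Bearer "+apiKey)
	return req, nil
}

func main() {
	req, err := buildChatRequest("https://api.example.com", "sk-...", "example-model", "Hello")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL, req.Header.Get("Authorization"))
}
```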
They are similar but not the same (rough sketch of the difference after this list):

* In https://github.com/psugihara/FreeChat/pull/60/commits/543303a37c680f1d5434ff485019ed5f0e717114 the parameters are similar to llama.cpp and OpenAI, except they have `options` nested (see https://github.com/ollama/ollama/blob/main/docs/api.md and https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values).
* OpenAI and...
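Rough Go sketch of the difference in request shape (the field names are taken from the linked docs, so double-check them there):

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// ollama-style body: sampling parameters nested under "options"
	// (see the api.md / modelfile.md links above).
	ollamaBody := map[string]any{
		"model":  "llama2",
		"prompt": "Hello",
		"options": map[string]any{
			"temperature": 0.7,
			"num_predict": 256,
		},
	}

	// llama.cpp server / OpenAI-style body: the same parameters sit at the
	// top level (llama.cpp uses n_predict, OpenAI uses max_tokens).
	flatBody := map[string]any{
		"model":       "llama2",
		"prompt":      "Hello",
		"temperature": 0.7,
		"n_predict":   256,
	}

	a, _ := json.MarshalIndent(ollamaBody, "", "  ")
	b, _ := json.MarshalIndent(flatBody, "", "  ")
	fmt.Println(string(a))
	fmt.Println(string(b))
}
```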
Yes, I don't remember why I used the other endpoint.
There are a few more changes to make, such as initializing the backend to ensure it is not nil, and resolving the conflicts. Currently the local version of llama.cpp doesn't work, but...
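For the nil check, something along these lines, as a Go sketch only to illustrate the idea (the `Backend` type and `initBackend` helper are placeholders, not the project's actual code):

```go
package main

import (
	"errors"
	"fmt"
)

// Backend is a hypothetical stand-in for the inference backend discussed above.
type Backend struct {
	baseURL string
}

// initBackend returns an error instead of handing back a nil backend,
// so callers never dereference an uninitialized value.
func initBackend(baseURL string) (*Backend, error) {
	if baseURL == "" {
		return nil, errors.New("backend: missing base URL")
	}
	return &Backend{baseURL: baseURL}, nil
}

func main() {
	b, err := initBackend("http://localhost:8080")
	if err != nil {
		panic(err)
	}
	fmt.Println("backend ready:", b.baseURL)
}
```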