cua
How to use "Computer-Use Web" with LM Studio models and Docker
The documentation is not clear on how to use models from other LLM providers, nor on how to connect "Computer-Use Web" to the Docker container (cua-ubuntu:latest) when using such models.
@rodrigoandrigo are you following this quickstart? https://docs.cua.ai/docs/quickstart-devs
Yes, but it doesn't work with local models served by LM Studio, and I couldn't find this interface in the documentation.
Try prompting Sonnet 4.5; it gave me code to interface with Ollama models. I suggest using a smaller model like Granite 3. But I was not happy with the latency of chat responses and agent actions.
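For what it's worth, LM Studio's local server speaks the OpenAI-compatible chat completions protocol (by default at http://localhost:1234/v1), so any client that can point at a custom base URL can reach it. Below is a minimal stdlib-only sketch of calling that endpoint directly; the model name and endpoint are assumptions that depend on your LM Studio setup, and this does not cover wiring the response into cua's agent loop, which isn't documented in this thread.

```python
import json
import urllib.request

# LM Studio's default local server endpoint (assumption; check the Server tab in LM Studio)
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for a local LM Studio server."""
    body = json.dumps({
        "model": model,  # model identifier as listed by the LM Studio server
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }).encode("utf-8")
    return urllib.request.Request(
        LMSTUDIO_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(model: str, prompt: str) -> str:
    """Send the request and return the assistant's reply text.

    Only works while LM Studio is running with its local server enabled.
    """
    req = build_chat_request(model, prompt)
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

# Example (requires a running LM Studio server; model name is hypothetical):
# print(ask("granite-3.1-8b-instruct", "Say hello"))
```

Smaller models like Granite reduce the per-step latency mentioned above, but agent loops still make one round trip per action, so some delay is unavoidable with local inference.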