
Plugin in PyCharm and local model on Windows.

Open SrVill opened this issue 2 years ago • 3 comments

Is it possible to connect the plugin to a locally running model (Refact-1.6b, StarCoder, etc.) via the oobabooga-webui or koboldcpp API? If so, how? Or can local models only be used as described here?

SrVill avatar Sep 07 '23 11:09 SrVill

Yes, we want CPU support, and a small inference server with few dependencies would be great. The current work is in #77

olegklimov avatar Sep 07 '23 17:09 olegklimov

I had something else in mind. It doesn't matter whether the model runs locally on GPU or CPU; what matters is that the plugin can work with a local model that is not limited to a Docker container in WSL. Why set that up when oobabooga already exists and can run models locally in a variety of formats? Refact also launches in oobabooga, but it's not clear how to connect the plugin to it via the API.

SrVill avatar Sep 09 '23 15:09 SrVill

We'll actually solve this! The new plugins, with a Rust binary, will use a standard API (HF or OpenAI style).

olegklimov avatar Oct 03 '23 11:10 olegklimov
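For context, the "OpenAI style" API mentioned above refers to the `POST /v1/completions` request shape that oobabooga's API extension and similar local servers also expose. Below is a minimal sketch of building such a request against a local endpoint. The endpoint URL and port (`127.0.0.1:5000`) are assumptions — check the flags your server was launched with — and this is not the plugin's actual client code, just an illustration of the request format.

```python
import json
import urllib.request

# Assumed local endpoint; oobabooga's OpenAI-compatible extension commonly
# serves on port 5000, but verify against your own launch configuration.
ENDPOINT = "http://127.0.0.1:5000/v1/completions"

def build_completion_request(prompt, max_tokens=64, temperature=0.2):
    """Build an OpenAI-style completion request for a local code model."""
    payload = {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
        # Stop sequences keep a single-snippet completion from running on.
        "stop": ["\n\n"],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("def fibonacci(n):")
print(req.get_method())                 # prints "POST"
print(json.loads(req.data)["prompt"])   # prints "def fibonacci(n):"
# To actually query a running server:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["text"])
```

Because the request shape is the standard one, the same sketch applies whether the backend is oobabooga, koboldcpp's OpenAI-compatible endpoint, or any other local server speaking that protocol.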