
Run non-OpenAI models

Open irgolic opened this issue 2 years ago • 3 comments

So far we've only used GPT-4 and GPT-3.5; the next step is to try locally hosted models.

I'm not sure exactly how to go about this. Since this is a GitHub Action, do GitHub's runners have GPUs? How do we write it to work properly with custom runners? Could we rent GPUs on something like vast.ai? Are there any grants available for free compute to run AutoPR on?

I'd love to run a custom GitHub runner with my own GPU and run tests with it.

Essentially, these two methods need to use a `completion_func` that's decoupled from OpenAI's functions: https://github.com/irgolic/AutoPR/blob/main/autopr/services/rail_service.py#L53-L125
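A minimal sketch of what that decoupling could look like (names here are illustrative, not AutoPR's actual API): the service takes any prompt-to-text callable at construction time, so an OpenAI wrapper, a locally hosted model, or a test stub can be swapped in without touching the service itself.

```python
from typing import Callable

# Hypothetical provider-agnostic completion function type:
# takes a prompt string, returns the model's completion.
CompletionFunc = Callable[[str], str]


class RailService:
    """Sketch of a service that delegates all LLM calls to an injected backend."""

    def __init__(self, completion_func: CompletionFunc):
        self.completion_func = completion_func

    def run_rail(self, prompt: str) -> str:
        # No openai import anywhere in here -- whatever backend was
        # injected does the actual completion.
        return self.completion_func(prompt)


# Any callable with the right shape works, e.g. a stub for tests:
def echo_backend(prompt: str) -> str:
    return f"echo: {prompt}"


service = RailService(echo_backend)
```

The same constructor could be handed a function that wraps a local GGML model or an HTTP client for a self-hosted inference server.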

irgolic avatar Apr 01 '23 18:04 irgolic

This seems relevant here, though it's more of a separate issue: https://github.com/go-gitea/gitea/issues/13539 https://blog.gitea.io/2023/03/hacking-on-gitea-actions/ https://blog.gitea.io/2022/12/feature-preview-gitea-actions/ https://gitea.com/gitea/tea/actions

Also: https://github.com/nektos/act (linked from https://gitea.com/gitea/act_runner)

ghost avatar Apr 03 '23 23:04 ghost

What I did was create an API that follows the OpenAI API standard; that way I only needed to set `openai.api_base = 'my_url'` and it worked fine. There is also the LocalAI project, which follows the same pattern: https://github.com/go-skynet/LocalAI
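For context, here is roughly what that override looks like with the legacy (pre-1.0) `openai` Python library, which exposes a module-level `api_base`. The URL and model name below are placeholders for whatever OpenAI-compatible server (e.g. a LocalAI instance) is running; since the call needs a live server, it's shown commented out as a config fragment.

```python
import openai

# Point the client at any OpenAI-compatible server instead of api.openai.com.
# Placeholder URL -- use your own server's address.
openai.api_base = "http://localhost:8080/v1"
openai.api_key = "not-needed-for-local"  # local servers typically ignore the key

# From here on, the usual calls hit the local server, e.g.:
# response = openai.ChatCompletion.create(
#     model="ggml-gpt4all-j",  # whatever model the local server exposes
#     messages=[{"role": "user", "content": "Hello"}],
# )
```

Because only the base URL changes, code written against the `openai` library keeps working unmodified against any server that mirrors the API.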

guilhermeaddr avatar May 03 '23 06:05 guilhermeaddr

Hey, thanks for the suggestion!

I think I'd rather move away from using the openai library in the long run (except potentially in its own self-contained repo implementation). Last time I talked to @ShreyaR she pointed me to the manifest library, which looks like a really clean provider-agnostic solution for specifying LLM models. I believe this would interface really well with our config yaml, too.

irgolic avatar May 03 '23 08:05 irgolic