
Add local LLM support

Open semanser opened this issue 1 year ago • 6 comments

https://ollama.com/

semanser avatar Mar 25 '24 21:03 semanser

+1

KhaoDoesDev avatar Mar 26 '24 16:03 KhaoDoesDev

Did this get added?

Frankystew1 avatar Mar 27 '24 00:03 Frankystew1

Did it get added?

Frankystew1 avatar Mar 27 '24 19:03 Frankystew1

Not yet

semanser avatar Mar 27 '24 19:03 semanser

Given that the current version is expected to use OPEN_AI_API_KEY and OPEN_AI_MODEL, you could also expose an OPEN_AI_BASE_URL environment variable.

That way, users could point to any OpenAI-compatible endpoint to run these models, which would open up compatibility with backends such as Ollama, Text Generation WebUI, LM Studio, etc.

So, if a user wants to use Ollama, they would just set the base URL to http://127.0.0.1:11434, pass any string as the API key, and pick the model of their choice.

Note that the base_url parameter is supported natively by the OpenAI Python library, so this probably will not require any extra configuration down the road: https://github.com/openai/openai-python/blob/0470d1baa8ef1f64d0116f3f47683d5bf622cbef/src/openai/_base_client.py#L328
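For illustration, here is a minimal sketch of how that could look with the official openai Python library. This is not Codel's actual code: the OPEN_AI_* variable names follow the proposal above, the fallback model name is a placeholder, and it assumes Ollama's OpenAI-compatible API, which is served under /v1.

```python
import os

from openai import OpenAI

# Hypothetical env vars from the proposal above; the defaults target a
# local Ollama instance, whose OpenAI-compatible endpoint lives at /v1.
# Ollama ignores the API key, so any non-empty string works.
client = OpenAI(
    base_url=os.getenv("OPEN_AI_BASE_URL", "http://127.0.0.1:11434/v1"),
    api_key=os.getenv("OPEN_AI_API_KEY", "ollama"),
)

response = client.chat.completions.create(
    model=os.getenv("OPEN_AI_MODEL", "llama2"),  # placeholder default
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```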

elsatch avatar Apr 01 '24 12:04 elsatch

> you could also expose an OPEN_AI_BASE_URL environment variable.

@elsatch this is pretty much what @nilbot is working on in #41

semanser avatar Apr 01 '24 12:04 semanser