
Please add option to connect to a standard OpenAI endpoint

Open dxcore35 opened this issue 1 year ago • 3 comments

Please don't limit this amazing app to just the Ollama endpoint. Simply adding support for an OpenAI-compatible endpoint would let the vast majority of users run the app against a local / offline server, for example LM Studio: https://lmstudio.ai/docs/basics/server
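For context, "OpenAI-compatible" here means the server exposes the same `/v1/chat/completions` path and JSON request shape as the OpenAI API, so one client works against many backends. Below is a minimal sketch of that request shape; the base URL `http://localhost:1234/v1` is LM Studio's documented local-server default, and the model name is a placeholder:

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str):
    """Return (url, headers, body) for an OpenAI-style chat completion call."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Local servers typically ignore the key; the real OpenAI API requires one.
        "Authorization": "Bearer not-needed",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# Example against a hypothetical local server (LM Studio default port):
url, headers, body = build_chat_request(
    "http://localhost:1234/v1", "local-model", "Hello!"
)
print(url)  # http://localhost:1234/v1/chat/completions
```

Because the path and payload are identical across backends, pointing the app at Ollama, LM Studio, LocalAI, or distributed-llama would only require making the base URL configurable.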

dxcore35 avatar Nov 03 '24 13:11 dxcore35

I think it would also allow you to use distributed-llama in API mode (another local LLM solution whose API mode is compatible with part of the OpenAI API).

jkeegan avatar Dec 04 '24 01:12 jkeegan

I am using LocalAI rather than Ollama, and I second this request. It would be great to be able to connect to an OpenAI-compatible server.

wilcomir avatar Dec 21 '24 09:12 wilcomir

Duplicates #25

olegshulyakov avatar Mar 12 '25 15:03 olegshulyakov