enchanted
Please add option to connect to a standard OpenAI endpoint
Please don't limit this amazing app to just the Ollama endpoint. Simply adding support for a standard OpenAI endpoint would let perhaps 80% of local / offline server users use your app. For example, like the server in LM Studio: https://lmstudio.ai/docs/basics/server
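For reference, here is a minimal sketch in Swift of what a request to an OpenAI-compatible chat completions endpoint could look like. The base URL and model name are assumptions (LM Studio's local server defaults to `http://localhost:1234/v1`, and local servers typically accept a placeholder model name), so treat this as an illustration rather than the app's actual networking code:

```swift
import Foundation

// Request/response shapes for the OpenAI-compatible /v1/chat/completions route.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

struct ChatResponse: Codable {
    struct Choice: Codable {
        let message: ChatMessage
    }
    let choices: [Choice]
}

func sendChat(prompt: String) async throws -> String {
    // Assumed base URL: LM Studio's default local server address.
    // LocalAI, distributed-llama, etc. would just use a different host/port.
    let url = URL(string: "http://localhost:1234/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    // "local-model" is a placeholder; the accepted model name depends on the server.
    let body = ChatRequest(
        model: "local-model",
        messages: [ChatMessage(role: "user", content: prompt)]
    )
    request.httpBody = try JSONEncoder().encode(body)

    let (data, _) = try await URLSession.shared.data(for: request)
    let response = try JSONDecoder().decode(ChatResponse.self, from: data)
    return response.choices.first?.message.content ?? ""
}
```

Since the request/response format is the same across LM Studio, LocalAI, and other OpenAI-compatible servers, a single configurable base URL setting would cover all of them.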
I think it would also allow you to use distributed-llama in API mode (another local LLM solution, whose API mode is compatible with part of the OpenAI API).
I am using LocalAI rather than Ollama, and I second this request: it would be great to be able to connect to an OpenAI-compatible server.
Duplicates #25