SMRY
[feature] Support Ollama
Would it be possible to integrate with Ollama, in addition to OpenAI, for those of us who would like a self-hosted solution?
I'm no JS dev, but looking at the code, the use of OpenAI seems fairly localized, and the Ollama API isn't overly complex either. You would need an extra config option for which model is used, though.
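For illustration, a minimal sketch of what calling Ollama's native `/api/generate` endpoint could look like, with the model made configurable. `OLLAMA_URL`, `OLLAMA_MODEL`, `buildGeneratePayload`, and `summarize` are hypothetical names for this sketch, not existing code in this repo:

```typescript
// Assumed defaults; in practice these would come from the app's config.
const OLLAMA_URL = "http://localhost:11434"; // Ollama's default port
const OLLAMA_MODEL = "llama3";

// Pure helper so the request shape is easy to inspect without a server.
function buildGeneratePayload(model: string, prompt: string) {
  return { model, prompt, stream: false }; // stream: false → single JSON reply
}

async function summarize(text: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(
      buildGeneratePayload(OLLAMA_MODEL, `Summarize the following:\n\n${text}`)
    ),
  });
  const data: { response: string } = await res.json();
  return data.response; // Ollama returns the completion text in `response`
}
```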
...also happy to chip in, if you have a Patreon or something
Seconding this. Also, I think the OpenAI API is used as a common interface by things like LiteLLM? Not sure.
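If that's the case, the simplest route might be to keep the existing OpenAI-style request code and only make the base URL configurable, since Ollama (and proxies such as LiteLLM) expose an OpenAI-compatible `/v1/chat/completions` endpoint. A rough sketch, where `buildChatRequest` is a hypothetical helper rather than existing code:

```typescript
// Build an OpenAI-style chat request against any compatible backend.
function buildChatRequest(baseURL: string, model: string, content: string) {
  return {
    url: `${baseURL}/v1/chat/completions`,
    method: "POST" as const,
    body: {
      model,
      messages: [{ role: "user" as const, content }],
    },
  };
}

// Same code path, different endpoint: hosted OpenAI or a local Ollama server.
const viaOpenAI = buildChatRequest(
  "https://api.openai.com", "gpt-4o-mini", "Summarize the article text"
);
const viaOllama = buildChatRequest(
  "http://localhost:11434", "llama3", "Summarize the article text"
);
```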