amphetamarina
Anyone working on this? Any progress?
This is becoming necessary for our company. I could take this issue, write the code for it and open a PR. Would just need to check contributing guidelines.
Isn't Ollama OpenAI-compatible? Just out of curiosity, is a LiteLLM proxy really necessary here? [ollama/docs/openai.md at main · ollama/ollama · GitHub](https://github.com/ollama/ollama/blob/main/docs/openai.md) Could you provide a link for a WIP Pull...
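To illustrate the point: since Ollama exposes an OpenAI-compatible API under `/v1` (per the linked docs), an existing OpenAI-style client mostly just needs its base URL swapped. A minimal sketch below; the base URL is Ollama's documented default, but the model name and the helper are hypothetical, not part of Devon.

```python
import json

# Ollama's default OpenAI-compatible endpoint (see ollama/docs/openai.md).
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt: str, model: str = "llama3") -> dict:
    """Build an OpenAI-style chat-completion payload.

    The same JSON body works against both the OpenAI API and Ollama's
    /v1/chat/completions route, which is why a proxy may be unnecessary.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Hello")
# This payload would be POSTed to f"{OLLAMA_BASE_URL}/chat/completions".
print(json.dumps(payload))
```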
How would we proceed with the CLI? I was thinking it would be nice to be able to run `devon configure system_prompt.txt (optional) last_prompt (optional)` and have it infer the correct class from that,...
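Something like the following could work for the argument parsing side. This is only a sketch of the proposal above; the `configure` subcommand and its two optional positional arguments mirror the suggested invocation and do not exist in Devon today.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Hypothetical `devon configure` CLI with two optional positionals."""
    parser = argparse.ArgumentParser(prog="devon")
    sub = parser.add_subparsers(dest="command")
    cfg = sub.add_parser("configure", help="configure agent prompts")
    # nargs="?" makes each positional optional, matching the proposal.
    cfg.add_argument("system_prompt", nargs="?", default=None,
                     help="optional path to a system prompt file")
    cfg.add_argument("last_prompt", nargs="?", default=None,
                     help="optional path to a last prompt file")
    return parser

args = build_parser().parse_args(["configure", "system_prompt.txt"])
print(args.command, args.system_prompt, args.last_prompt)
# → configure system_prompt.txt None
```

The class-inference step would then dispatch on which of the two paths were supplied.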
As of now, Devon can't be run in headless mode, or am I wrong here? So wouldn't a requirement for this issue be a headless mode for Devon? Idea: spin...
I *think* with #53 it should be possible to use Gemini, but might need to check LiteLLM support for it.
On it, will report back here with updates and request for discussions 🫡
PR updated here: #53
So, I might have gone a bit overboard with the refactorings. I also pulled from main and added changes so as not to break Ollama support. I'm doing more comprehensive regression testing...
Hey @ObjectJosh ! Thanks for the feedback. Yes, that makes sense; I will move to a table of contents and see how it looks.