Results: 3 comments of Kuldeep Luvani
> ollama now provides an OpenAI-compatible API, so you don't need to use litellm any longer.
>
> That said, using local models comes with a different problem, i.e....
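The quoted point can be sketched with a minimal example: ollama serves an OpenAI-compatible endpoint at `/v1` on its default port (11434), so a plain OpenAI-style chat-completions request works without a litellm shim. This is a stdlib-only sketch; the model name `llama3.2` and the prompt are illustrative, and actually sending the request assumes a locally running ollama server.

```python
import json
import urllib.request

# ollama's default local address; /v1 is its OpenAI-compatible prefix.
OLLAMA_BASE = "http://localhost:11434/v1"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request aimed at ollama."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # ollama ignores the token locally, but OpenAI clients send one.
            "Authorization": "Bearer ollama",
        },
    )


if __name__ == "__main__":
    # Requires a pulled model and a running server, e.g. `ollama run llama3.2`.
    req = build_chat_request("llama3.2", "Say hello in one word.")
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
```

Since the request shape is the standard OpenAI one, the official `openai` Python client also works by pointing its `base_url` at the same address.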
@handrew Do you think this works with llama3.2?
> @kuldeepluvani haven't tried it! You wanna give it a shot?

Yeah, I will try this out and will keep you guys posted.