Phillip Lindsay
It would be nice to be able to use Ollama with local LLMs instead of GitHub Copilot.
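For context, a locally running Ollama server already exposes a small HTTP API on its default port (11434). A minimal sketch of querying it from Python is below; the model name `llama3` is just an example and assumes that model has already been pulled.

```python
# Minimal sketch: ask a locally running Ollama server for a completion.
# Assumes Ollama is installed, serving on the default port 11434,
# and that the example model "llama3" has been pulled locally.
import json
import urllib.request


def ollama_generate(prompt: str, model: str = "llama3") -> str:
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ollama_generate("Write a one-line docstring for a bubble sort function."))
```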
### Prerequisites

- [X] I am running the latest code. Mention the version if possible as well.
- [X] I carefully followed the [README.md](https://github.com/ggerganov/llama.cpp/blob/master/README.md).
- [X] I searched using keywords...
### What would you like to see?

| Model code | Model | Description | Release date |
| --- | --- | --- | --- |
| gemini-1.5-pro-exp-0827 | Gemini 1.5 Pro | Quality improvements for Gemini 1.5 Pro | August 27, 2024 |
| gemini-1.5-flash-exp-0827 | Gemini 1.5 Flash | Quality... | |