[Feature]: Add support for local llama.cpp
Please add explicit support for the llama.cpp server API during first start. Its API is OpenAI-compatible, so everything is already implemented on your side - but not everybody knows that it is OpenAI-compatible...
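For context, llama.cpp's `llama-server` exposes an OpenAI-style endpoint (by default on port 8080, under `/v1`), so any OpenAI client works once the base URL is overridden. A minimal stdlib-only sketch of what such a request looks like - the `local` model name is a placeholder, since a single-model `llama-server` instance doesn't require a specific one:

```python
import json

# Assumed default: llama-server started with no flags listens on
# http://localhost:8080 and serves the OpenAI-compatible API under /v1.
BASE_URL = "http://localhost:8080/v1"

def chat_request(prompt: str, model: str = "local"):
    """Build an OpenAI-style chat completion request for a local llama.cpp server.

    Returns the endpoint URL and the JSON body to POST to it. The "model"
    field is required by the OpenAI schema, so a placeholder is sent.
    """
    url = f"{BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = chat_request("Hello")
print(url)  # http://localhost:8080/v1/chat/completions
```

In other words, supporting llama.cpp explicitly mostly means letting the user set a custom base URL (and skip the API key) in the first-start dialog.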
Good idea - open to having someone implement this if we don't get around to it!