Timmo
> Instead of running oobabooga to serve your local LLM, you can use LocalAI instead. It gives you an OpenAI-compatible API based on whatever model you choose to run....
Start oobabooga with `call python server.py --auto-devices --chat --wbits 4 --groupsize 128 --api --listen --extension openai`, then enter the API URL http://127.0.0.1:5001/v1 for the model in the UI. However a...
> > However, LLMs might have different keywords like "### Instruction:" or "User:" > > Where do I find these keywords? Any template suggestions for some known LLMs? 🙏🏽 This might...
Meanwhile I built my own using Go and the `pack.ag/tftp` library. It's basically a one-liner and works on every OS.
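For reference, a minimal sketch of what that "one-liner" server can look like, closely following the `pack.ag/tftp` library's README example. The port `:6900` and the served directory `files/` are placeholders, and the API names are taken from the library's documentation, not verified here:

```go
package main

import (
	"log"

	"pack.ag/tftp" // third-party TFTP library
)

func main() {
	// Listen for TFTP requests on UDP port 6900.
	server, err := tftp.NewServer(":6900")
	if err != nil {
		log.Fatal(err)
	}
	// Serve read (download) requests straight out of the local "files/" directory.
	server.ReadHandler(tftp.FileServer("files/"))
	log.Fatal(server.ListenAndServe())
}
```

Because the library handles the protocol details, the whole server reduces to constructing it, attaching a file-serving read handler, and listening.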
I had the same issue with Ollama. Then I entered a valid default model and it worked. I think in LocalAI they only return the model you requested but not...
Try using mp3 as the output format. It seems like the WAV header is corrupted (but the data of the whole text2audio output is in the WAV file).
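For anyone who wants to salvage such a file rather than switching formats: one common cause of "header corrupted but data intact" is that the RIFF and data chunk size fields were written before the stream length was known. A minimal sketch that patches those two size fields in place, assuming a standard 44-byte PCM WAV header (`fix_wav_header` is a hypothetical helper, not part of any project mentioned above):

```python
import struct

def fix_wav_header(path):
    """Patch the RIFF and data chunk sizes of a PCM WAV file whose
    header was written before the final length was known, assuming
    the canonical 44-byte header layout."""
    with open(path, "r+b") as f:
        f.seek(0, 2)          # seek to end to get the file size
        total = f.tell()
        # RIFF chunk size = file size minus the 8-byte "RIFF" + size prefix
        f.seek(4)
        f.write(struct.pack("<I", total - 8))
        # data chunk size = file size minus the 44-byte header
        f.seek(40)
        f.write(struct.pack("<I", total - 44))
```

After patching, the file should open normally in players that reject the zero-length sizes. This only applies if the audio data itself is complete, as it appears to be here.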