Tried to install; not sure what to do with this error
Describe the bug
I pip installed it and that went fine. Then I tried running it and got the error below. Is there a config I'm missing?

ValidationError: 1 validation error for LlamaCpp root Could not load Llama model from path: ./models/mixtral-8x7b-instruct-v0.1.Q2_K.gguf. Received error Model path does not exist: ./models/mixtral-8x7b-instruct-v0.1.Q2_K.gguf (type=value_error)
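The error says the model file does not exist at the path relative to the current working directory. A quick way to confirm that (the filename is taken from the error message above):

```shell
# Check whether the model file is where LlamaCpp expects it,
# relative to the directory you launched the app from.
ls -l ./models/mixtral-8x7b-instruct-v0.1.Q2_K.gguf || echo "model file missing"
```

If this prints "model file missing", the file has to be downloaded and placed there first.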
Reproduction
No response
Operating System
ubuntu, aarch64
Libre Chat version
??latest??
Additional context
No response
For some reason it does not download the model automatically. You have to download it yourself, as specified in chat.yml:

model_path: ./models/mixtral-8x7b-instruct-v0.1.Q2_K.gguf
model_download: https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF/resolve/main/mixtral-8x7b-instruct-v0.1.Q2_K.gguf

and put it in a ./models folder inside the directory from which you run libre-chat start.
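For reference, the relevant chat.yml fragment as quoted above; note that whether these keys sit at the top level or under a subsection may depend on your Libre Chat version, so compare against the example config shipped with your install:

```yaml
# chat.yml fragment: local model path plus the download URL from this thread
model_path: ./models/mixtral-8x7b-instruct-v0.1.Q2_K.gguf
model_download: https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF/resolve/main/mixtral-8x7b-instruct-v0.1.Q2_K.gguf
```

The path is resolved relative to the working directory, which is why the models/ folder must exist next to wherever libre-chat start is run.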