Faizan Bashir
Did anyone find a way to fix this? I am facing the same issue.
Could someone from the Kong team please respond?
Hi @pmalek, good to know. I'm looking forward to the documentation. Could you update this issue once it's ready?
I downloaded a GGUF model, but it still fails with a bare `Error: ` message and no further detail.

```shell
llm llama-cpp download-model \
  https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGUF/resolve/main/llama-2-7b-chat.Q8_0.gguf \
  --alias llama2-chat --alias l2c --llama2-chat
```
...