Results: 2 comments by LudoSGitHub
Hi. Trying to make it work. First of all, the getting-started instructions say to launch chat.exe, but it does not come with the source. I tried to compile with...
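For readers hitting the same missing-executable problem, here is a minimal build sketch, assuming alpaca.cpp's standard CMake setup on Windows; the clone URL and output path are assumptions and may differ per repository revision:

```
:: Hypothetical build steps for a CMake-based checkout of alpaca.cpp
git clone https://github.com/antimatter15/alpaca.cpp
cd alpaca.cpp
cmake .
cmake --build . --config Release
:: chat.exe should then appear under the build output directory (e.g. Release\), path is an assumption
```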
I downloaded the .zip with the executable, but I have the same issue as others: D:\01 - PROJETS\Python\alpaca.cpp2>chat.exe -m ggml-alpaca-13b-q4.bin main: seed = 1680512416 llama_model_load: loading model from 'ggml-alpaca-13b-q4.bin' - please...
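For reference, a minimal invocation sketch matching the command quoted above, assuming the quantized model file sits in the same directory as the executable (the 13B filename is taken from the comment, not verified):

```
:: Run the prebuilt chat binary against the quantized Alpaca model via the -m model-path flag
chat.exe -m ggml-alpaca-13b-q4.bin
```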