Denis Spasyuk
The old convention is not a problem; we can easily promisify those functions. I have already done this for some of them. As for ib-tws-api, I tried it and I do...
I have this issue too when using setOption: I can add more series, but I cannot unset the old ones. How do I fix that?
The model seems to be working fine on my end; it just endlessly generates text. `../llama.cpp/main --model /home/denis/Downloads/phi-3-mini-4k-instruct.Q8_0.gguf --n-gpu-layers 35 -ins --interactive --keep -1 --n-predict -1 --simple-io -b 2048 --ctx_size 0 --temp 0.1 --top_k...
@h9-tect Did you figure it out?
@Revmagi @mapleroyal Try the guide (script piper_install_mac.sh) here, and let me know if you have any issues: https://github.com/dspasyuk/llama.cui
Hi @ruleset, have you tried running the script I provided in the https://github.com/dspasyuk/llama.cui repo? If you want to redo the compilation manually, you will need to set paths to all the brew...
I ended up using just llama.cpp; it works very well on the GPU. You can write a simple wrapper in Node.js without Rust. I can share the code if you want.
@shaileshminsnapsys no problem, the code is here: https://github.com/deonis1/llcui
Let me know if you have any issues
@shaileshminsnapsys no problem, there is a new version if you are interested