SidK2003 commented:
I haven't installed the CUDA toolkit, and this worked for me:
pip install --no-cache-dir llama-cpp-python==0.3.2 --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu/
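
If it helps, here is a minimal sketch of running CPU-only inference after that install. The model path and prompt are placeholders, not from the original post; substitute any local GGUF file you have.

```python
# Minimal sketch: CPU inference with llama-cpp-python (no CUDA toolkit needed).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/model.gguf",  # placeholder path to a local GGUF model
    n_ctx=2048,                        # context window size
    n_threads=4,                       # number of CPU threads to use
    verbose=False,
)

output = llm(
    "Q: What is the capital of France? A:",  # example prompt
    max_tokens=32,
    stop=["\n"],
)
print(output["choices"][0]["text"].strip())
```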