Chad Canning
The newest version of torch works fine here.
I'm not sure why it isn't being fixed, but this worked for me on Windows: https://github.com/abetlen/llama-cpp-python/discussions/871 `set "CMAKE_ARGS=-DLLAMA_CUBLAS=on" && pip install llama-cpp-python`
I now have the same extra-newlines issue out of nowhere, running the latest version.
Still having the same issue with 2.5.5