opsec-ai
I tried changing it up using `uv` and pulling everything fresh.

```
uv pip uninstall llama-cpp-python
uv pip install llama-cpp-python --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu126
Using Python 3.12.10 environment at: /home/k/Downloads/src/chatterbox/.venv
Resolved 6...
```
It's still using libllama.so from git. Solution? Rename the llama-cpp-python git directory and run the same commands yet again.

```
mv llama-cpp-python llamacp
uv pip uninstall llama-cpp-python
CMAKE_ARGS="-DGGML_CUDA=on" uv pip install...
```
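One way to confirm which copy Python is actually importing (the installed wheel vs. a local git checkout shadowing it on `sys.path`) is to ask importlib where the module resolves from. A minimal sketch; nothing here is specific to llama-cpp-python, the module name is just the one in question:

```python
import importlib.util

def module_origin(name: str):
    """Return the file path Python would import `name` from,
    or None if the module can't be found at all."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# If this prints a path inside the git checkout rather than the
# venv's site-packages, the local directory is shadowing the wheel:
# print(module_origin("llama_cpp"))
```

Running this from the directory that contains the checkout (before and after the rename) makes it obvious whether the `mv` actually changed what gets imported.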
The git version is still bugged. I just re-cloned everything from adam.
I got it to load by removing all the debugging lines that were causing errors with ctypes. But it's far from working. Need to find where the exports moved to....
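To see whether a given `libllama.so` still exports the entry points the ctypes bindings expect, the library can be probed directly; accessing a missing symbol through ctypes raises `AttributeError`. A sketch; the path and symbol names in the comment are illustrative, not a claim about the current llama.cpp API:

```python
import ctypes

def exported(lib_path, symbols):
    """Report which C symbols a shared library exports.
    ctypes raises AttributeError when a symbol is absent."""
    lib = ctypes.CDLL(lib_path)
    found = {}
    for name in symbols:
        try:
            getattr(lib, name)
            found[name] = True
        except AttributeError:
            found[name] = False
    return found

# e.g. exported("./llama.cpp/build/bin/libllama.so",
#               ["llama_backend_init", "llama_model_load_from_file"])
# (illustrative path and symbol names)
```

Comparing the result against the names the bindings try to bind is a quick way to find where the exports moved.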
It's not working for CUDA 12.8 either. I had to edit CMakeLists.txt to remove references to `llava`, and that was just the beginning of the adventure. I tried putting LLAMA_LLAVA=OFF in...
Llama.cpp is a bit of a hot target right now with daily changes. The stock llama-cpp-python compiled out of the box for me without errors last month, but "it didn't...
CUDA: Since Visual Studio 2022 currently doesn't support BMI2 on Haswell+, and versions newer than 2022 don't work with CUDA 13... The resulting binaries could be twice as...
Uhhh, new guy here. Just stumbled onto this repo. Maybe they don't know how to answer because they don't know what API you're using. Better yet, a code snippet that...
Need to separate arguments with a space: `--diarize --convert`. The error message:

```
Received request: /tmp/tmp8y9c3cb9.wav
Couldn't open input file RIFF$
error: failed to ffmpeg decode 'RIFF$'
error: failed to read audio...
```
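The general point is that each flag has to be its own argv entry; `--diarize --convert` passed as one quoted string reaches the program as a single unrecognized argument. A quick way to see the difference, using the Python interpreter itself as a stand-in target program:

```python
import subprocess
import sys

# A program sees one argv entry per separately-passed token.
# "--diarize", "--convert"  -> two flags
# "--diarize --convert"     -> one fused, unrecognized argument
count = subprocess.run(
    [sys.executable, "-c", "import sys; print(len(sys.argv) - 1)",
     "--diarize", "--convert"],
    capture_output=True, text=True,
).stdout.strip()  # "2"

fused = subprocess.run(
    [sys.executable, "-c", "import sys; print(len(sys.argv) - 1)",
     "--diarize --convert"],
    capture_output=True, text=True,
).stdout.strip()  # "1"
```

The same rule applies whether the flags are typed in a shell or built into an argument list programmatically.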