Иван
The current template, when used through Ollama, allowed the model to radically change the response embedded in the context, up to the point of completely refusing to answer on the...
I encountered this error. I created a [Dockerfile](https://github.com/oobabooga/text-generation-webui/blob/main/docker/amd/Dockerfile) following the suggested instructions, with the libraries installed via the start_linux.sh file. Analyzing the code, I came to the conclusion...
I updated the [rocm drivers to version 6.2](https://github.com/HardAndHeavy/transformers-rocm-docker/blob/2020e0e2f392c4ba584bcd120e4fb8a37c3f06f7/Dockerfile#L5). After that, [llama-cpp-python 0.2.90](https://github.com/HardAndHeavy/llama-rocm-docker/blob/099a2630f8ebc46f75671498d1f248cc77c52ab9/Dockerfile#L5) was installed without problems.
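For anyone reproducing this, a minimal sketch of the approach described above: a ROCm 6.2 base image with llama-cpp-python 0.2.90 built against the hipBLAS backend. This is not the linked Dockerfile verbatim; the base image tag and the `CMAKE_ARGS` flag are assumptions and may need adjusting for your GPU and driver setup.

```dockerfile
# Assumed ROCm 6.2 development base image (adjust tag to your setup)
FROM rocm/dev-ubuntu-22.04:6.2

RUN apt-get update && \
    apt-get install -y python3-pip cmake build-essential && \
    rm -rf /var/lib/apt/lists/*

# Build llama-cpp-python 0.2.90 against ROCm's hipBLAS backend.
# CMAKE_ARGS is passed through to the llama.cpp CMake build.
RUN CMAKE_ARGS="-DGGML_HIPBLAS=on" pip3 install llama-cpp-python==0.2.90
```

With the older ROCm drivers the compile step of this build is where the failure occurred; updating the base image to 6.2 is what made the wheel build cleanly.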
Thanks a lot for the correction; it really helped me with the installation.