ganakee
Thank you for the reply. No, I do not see any option. On Linux, the system theme (GNOME 40 and above) should apply to the app. I don't think that...
Wow. Thanks @53R3n17Y-serenity
@marianoarga I looked at your ollama.service file. In the service section, there is one line: `Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin OLLAMA_MODELS=/opt/ai/models"`

```
[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3...
```
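For readability, the two variables can also be split onto separate `Environment=` lines; systemd accepts either form. A sketch of the equivalent section, using only the paths quoted above:

```
[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
Environment="OLLAMA_MODELS=/opt/ai/models"
```

Either way, the value of `OLLAMA_MODELS` only takes effect after `systemctl daemon-reload` and a restart of the service.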
Would like to see this.
I am having this same issue. After compiling ollama for AMD GPUs, I used the manual install method. I put the ollama.service file in /etc/systemd/system.

```
[Unit]
Description=Ollama Service
After=network-online.target...
```
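After placing the unit file, the usual sequence to activate the service and inspect any failures is roughly the following (standard systemd commands, not specific to this report):

```
sudo systemctl daemon-reload
sudo systemctl enable --now ollama
systemctl status ollama
journalctl -u ollama -e
```

`journalctl -u ollama` is generally the quickest way to see why a custom-compiled binary fails to start under systemd.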
I did several more hours of work on this. The issue seems to be connected somehow with copying the custom-compiled file to /usr/bin/local/ollama.gpu . No matter what I do, if I...
OK. If anyone else gets this issue, the problem for me was with the custom-compiled version of ollama and a missing override environment variable in the systemd config file. I...
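One common way to supply a missing environment variable without editing the installed unit file is a systemd drop-in override. A sketch, where the variable name and value are taken from this thread and may need adjusting for your setup:

```
# /etc/systemd/system/ollama.service.d/override.conf
# (created via: sudo systemctl edit ollama)
[Service]
Environment="OLLAMA_MODELS=/opt/ai/models"
```

Apply it with `sudo systemctl daemon-reload && sudo systemctl restart ollama`. The drop-in survives package upgrades, unlike edits to the main unit file.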
I worked on this issue some more without resolution. I am trying to use the new Ollama Cloud OpenAI-compatible models. I verified (there was a bug in the Ollama...
**The compile still fails even with the newly released 0.4.1.** I just tried (2024-11-10) with the newly released 0.4.1. It was not clear to me whether this includes @dhiltgen 's updates. 1. I downloaded...
Report Date: 2024-11-17 **The error still occurs.** I downloaded the 0.4.2 version from 2024-11-14. I first run: `ROCM_PATH=/opt/rocm CLBlast_DIR=/usr/lib/cmake/CLBlast AMDGPU_TARGETS="gfx1030" /usr/local/go/bin/go generate -tags rocm ./...` I then run where the...
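For context, the build flow these reports appear to be using looks roughly like this. This is a sketch assuming an older ollama source tree that uses the `go generate` ROCm path; the ROCm/CLBlast paths and the `gfx1030` target are taken from the command above, and the final build step may differ between source trees:

```
# environment for the ROCm code-generation step (values from the report above)
export ROCM_PATH=/opt/rocm
export CLBlast_DIR=/usr/lib/cmake/CLBlast
export AMDGPU_TARGETS="gfx1030"

# generate the GPU runner code, then build the binary
/usr/local/go/bin/go generate -tags rocm ./...
/usr/local/go/bin/go build .
```

If the generate step succeeds but the resulting binary still ignores the GPU, checking `journalctl -u ollama` for library-loading errors is a reasonable next step.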