Ki-Hwan Oh
After creating the environment, install PyTorch and the CUDA toolkit with the following line: `conda install pytorch==1.8.0 torchvision==0.9.0 torchaudio==0.8.0 cudatoolkit=11.1 -c pytorch -c conda-forge`. It will take some time for conda to resolve the environment...
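Once the install finishes, a quick sanity check can confirm the environment is usable. The sketch below is an assumption on my part (not from the original instructions): it reports whether `torch` is importable and whether the CUDA 11.1 runtime actually sees a GPU, degrading gracefully if either is missing.

```python
import importlib.util

def torch_cuda_status():
    """Report whether torch is importable and, if so, whether CUDA is usable.

    Returns a short human-readable string rather than raising, so it is
    safe to run even in an environment where the install failed.
    """
    # find_spec avoids an ImportError crash if the conda install was incomplete
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch
    if torch.cuda.is_available():
        return f"torch {torch.__version__} with CUDA (GPU visible)"
    return f"torch {torch.__version__}, CPU only (no GPU visible)"

print(torch_cuda_status())
```

If the last line prints "CPU only", the usual culprits are a driver/toolkit version mismatch or the wrong conda environment being active.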
I was able to run Ollama with my 9070 XT. Do you have the latest ROCm 6.4.0 installed?
I had no issues with the mainline Ollama build, and it recognized my GPU during installation. Perhaps the forked version hasn't been updated for the 9070?
@doomaholic I'm running Ubuntu 24.04 LTS with ROCm 6.4.0 installed. I'll also try on Windows to see if that's the issue.
I’ve tried several setups: running Ollama on Windows, building Ollama with ROCm on Windows, and installing both Ollama and ROCm on WSL (Ubuntu 24). Unfortunately, none of them were able...