robik72
### Describe the bug I am using Open Interpreter on a macOS 12.7.3 notebook. After installing Ollama and downloading tinyllama and phi, I launched it with the --model flag. I...
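In case it helps reproduce the setup described above, here is a minimal sketch of pointing Open Interpreter at a locally served Ollama model from Python instead of the CLI `--model` flag. The `ollama/tinyllama` model string, the `llm.model` / `llm.api_base` attributes, and the prompt are assumptions based on Open Interpreter's documented local-model usage, not code taken from this issue.

```python
# Sketch: drive Open Interpreter against a local Ollama model (assumptions noted above).
from interpreter import interpreter

interpreter.offline = True                            # avoid routing through a hosted provider
interpreter.llm.model = "ollama/tinyllama"            # assumes `ollama pull tinyllama` was run
interpreter.llm.api_base = "http://localhost:11434"   # default Ollama endpoint (assumption)

interpreter.chat("List the files in the current directory.")
```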
### Describe the bug Hi, I am getting the infamous "OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root." when I try to load...
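The usual workaround for this error is to point CUDA_HOME at the installed CUDA toolkit before anything that builds or loads CUDA extensions is imported. A minimal sketch follows; the path `/usr/local/cuda-12.1` is a placeholder assumption, so substitute the directory on your machine that actually contains `bin/nvcc`.

```python
# Sketch: set CUDA_HOME before importing libraries that need it (path is a placeholder).
import os

os.environ.setdefault("CUDA_HOME", "/usr/local/cuda-12.1")

# Mirror the check the loader is complaining about: CUDA_HOME should contain nvcc.
nvcc = os.path.join(os.environ["CUDA_HOME"], "bin", "nvcc")
print("nvcc found at" if os.path.exists(nvcc) else "nvcc NOT found at", nvcc)
```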
I am trying to use AirLLM on my PC (Windows 11, 32 GB RAM, RTX 3080 with 10 GB VRAM) to run Llama 3 70B. After downloading Llama 3 quantized at 4-bit...
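For reference, a minimal sketch of what a 4-bit AirLLM load-and-generate loop can look like on a small-VRAM GPU. The `AutoModel` class, the `compression="4bit"` argument, and the `meta-llama/Meta-Llama-3-70B-Instruct` repo id are assumptions based on AirLLM's published examples, not the exact code from this issue.

```python
# Sketch: layer-by-layer 70B inference with AirLLM and 4-bit compression (assumptions noted above).
from airllm import AutoModel

model = AutoModel.from_pretrained(
    "meta-llama/Meta-Llama-3-70B-Instruct",  # assumed Hugging Face repo id (gated)
    compression="4bit",                      # assumed block-wise 4-bit compression option
)

input_tokens = model.tokenizer(
    ["What is the capital of France?"],
    return_tensors="pt",
    truncation=True,
    max_length=128,
)

output = model.generate(
    input_tokens["input_ids"].cuda(),
    max_new_tokens=20,
    use_cache=True,
)
print(model.tokenizer.decode(output[0]))
```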