Srijan Srivastava

4 comments by Srijan Srivastava

I get this warning even if I set CUDA_HOME. But I am using the CUDA Docker image, so I am not sure if anyone else gets it inside Docker...
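For reference, this is roughly how I set it inside the container (a minimal sketch; the paths assume the standard `/usr/local/cuda` symlink that the `nvidia/cuda` images provide, so adjust them for your image):

```shell
#!/bin/sh
# Point CUDA_HOME at the toolkit root. The nvidia/cuda images symlink
# /usr/local/cuda to the installed toolkit version, so we fall back to
# that when CUDA_HOME is not already set in the environment.
CUDA_HOME="${CUDA_HOME:-/usr/local/cuda}"
export CUDA_HOME

# Make nvcc and the CUDA runtime libraries discoverable.
export PATH="$CUDA_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$CUDA_HOME/lib64${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

echo "CUDA_HOME=$CUDA_HOME"
```

Despite this, the warning still appears for me, so the build may be probing for CUDA before these exports take effect.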

> I experience the same issue with the CUDA Docker image. Did you find a solution? I run this script for my backend inside the container: ```sh #!/bin/bash if [...

I am getting the same error. I can chat with the model through the CLI by running `ollama run `. However, when I try to connect via the API, I am getting...
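For context, the request I was attempting looks roughly like this (a sketch, not my exact call: the model name here is a placeholder, and `11434` is Ollama's default port; `/api/tags` and `/api/generate` are the server's list-models and completion endpoints):

```shell
#!/bin/sh
# Check that the Ollama server is reachable before requesting a completion.
HOST="${OLLAMA_HOST:-http://localhost:11434}"
MODEL="llama2"   # placeholder; use whatever you pulled with `ollama pull`

if curl -sf "$HOST/api/tags" > /dev/null; then
  # Server is up: request a non-streaming completion.
  curl -s "$HOST/api/generate" \
    -d "{\"model\": \"$MODEL\", \"prompt\": \"hello\", \"stream\": false}"
else
  echo "cannot reach Ollama at $HOST (is 'ollama serve' running?)"
fi
```

In my case the CLI works but this API call is what fails, which is why I suspect the server side rather than the model itself.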

Turns out that if I remove `decord`, it works perfectly fine. Does anyone know why?