Tom

Results: 5 comments of Tom

Yes, I followed every step of the README. I may have a problem with CUDA, because my GPU isn't used by the LLM model either, but I don't know how to solve it. I...
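(For what it's worth, a general way to check whether the GPU is being touched at all is to run nvidia-smi in another terminal while the model is generating and watch the utilization and memory columns; this is just a generic diagnostic, not specific to this setup.)

nvidia-smi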

I did it and it still does that, and I am also unable to download llama_cpp with set CMAKE_ARGS=-DLLAMA_CUBLAS=on
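For reference, the usual recipe for rebuilding llama-cpp-python with cuBLAS on Windows (assuming cmd.exe rather than PowerShell) is roughly:

set CMAKE_ARGS=-DLLAMA_CUBLAS=on
pip install llama-cpp-python --upgrade --force-reinstall --no-cache-dir

That's the general approach from the llama-cpp-python docs; I'm not sure it resolves this particular error.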

I tried on my computer and on my laptop and got the same result.

@MikeBirdTech It doesn't work with a local model either; I should have mentioned that.

I have reinstalled Python and the interpreter. If I try to use OpenAI, it gives me the same error; if I use local, it downloads the Phi model and then opens local...