
LlamaSharp Backend Cuda not being used


In the WebStarter project, the LLamaSharp.Backend.Cuda11 dependency is included by default.

When I add the LLamaSharp.Backend.Cpu dependency instead, everything works well.

I have installed the CUDA libraries directly from NVIDIA, and my PC has an RTX 2060 that is detected properly.

However, the WebStarter project still does not seem to pick up CUDA support. I have also tried LLamaSharp.Backend.Cuda12, with no luck.
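
For context, the backend swap I am describing is just a NuGet package reference change in the WebStarter project file, roughly like the following sketch (the version number is a placeholder, not the exact version the project pins):

```xml
<ItemGroup>
  <!-- Default backend referenced by the WebStarter project -->
  <PackageReference Include="LLamaSharp.Backend.Cuda11" Version="0.x.y" />

  <!-- Swapping in the CPU backend instead makes everything work -->
  <!--
  <PackageReference Include="LLamaSharp.Backend.Cpu" Version="0.x.y" />
  -->
</ItemGroup>
```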

Is there a setting that needs to be configured?

I get the following error:

System.TypeInitializationException: The type initializer for 'LLama.Native.NativeApi' threw an exception. ---> LLama.Exceptions.RuntimeError: The native library cannot be found. It could be one of the following reasons:

  1. No LLamaSharp backend was installed. Please search LLamaSharp.Backend and install one of them.
  2. You are using a device with only CPU but installed cuda backend. Please install cpu backend instead.
  3. The backend is not compatible with your system cuda environment. Please check and fix it. If the environment is expected not to be changed, then consider build llama.cpp from source or submit an issue to LLamaSharp.
  4. One of the dependency of the native library is missed.
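
For reference, a minimal standalone check along these lines (the model path and layer count are placeholders, and this uses LLamaSharp's ModelParams / LLamaWeights API rather than anything BotSharp-specific) should exercise the same LLama.Native.NativeApi initializer, which can help narrow down whether the problem is in the backend itself or in the WebStarter configuration:

```csharp
// Minimal sketch to check whether the LLamaSharp CUDA backend resolves.
// Not BotSharp code; the model path below is a placeholder.
using System;
using LLama;
using LLama.Common;

class CudaBackendCheck
{
    static void Main()
    {
        var modelPath = @"C:\models\example-model.gguf"; // placeholder path to a local GGUF model

        // GpuLayerCount asks llama.cpp to offload layers to the GPU. With a CUDA
        // backend installed, loading the weights is what first touches
        // LLama.Native.NativeApi, so this is where the TypeInitializationException
        // above would surface.
        var parameters = new ModelParams(modelPath)
        {
            GpuLayerCount = 20
        };

        using var weights = LLamaWeights.LoadFromFile(parameters);
        Console.WriteLine("Native backend loaded and model weights read successfully.");
    }
}
```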

gkapellmann · Oct 20, 2023

Hi, this looks like an issue with LLamaSharp itself. Are you still seeing this issue when using the latest release of LLamaSharp?

SanftMonster · Mar 15, 2024

This question was from a long time ago, and there have been many updates since. I can confirm that I just tried again and everything seems to be working properly.

gkapellmann · Aug 13, 2024