LLamaSharp
A C#/.NET library to run LLMs (🦙LLaMA/LLaVA) on your local device efficiently.
This PR separates `Sequence` from `Conversation`. It introduces no change from the user's point of view and only changes the internal implementation. In the llama.cpp APIs, a sequence is only a concept. There...
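For readers unfamiliar with the distinction, a minimal sketch of the idea (not the PR's actual code; the type names below are invented for illustration): the sequence is just a small identifier handed to llama.cpp, while `Conversation` stays the object users interact with.

```cs
// Illustrative only: a sequence is a lightweight value handed to the native API,
// while Conversation remains the user-facing object that owns one internally.
public readonly record struct SequenceId(int Value);

public sealed class Conversation
{
    // Users never touch the sequence directly.
    private SequenceId _sequence;

    internal Conversation(SequenceId sequence) => _sequence = sequence;

    // Rebinding the underlying sequence (e.g. after a fork or a KV-cache shuffle)
    // changes nothing observable from the user's point of view.
    internal void Rebind(SequenceId newSequence) => _sequence = newSequence;
}
```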
Implement the proposal in #670

## Public API changes

### Add `NativeLibraryConfig.WithAutoDownload`

It is disabled by default, and the user can enable it with some settings.

### Add `NativeLibraryConfig.WithSelectingPolicy`

To allow...
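A purely illustrative sketch of the shape such an API could take; only the two method names come from the proposal, everything else (the `Instance` accessor, the enum, the defaults) is an assumption.

```cs
// Hypothetical shape of the proposed configuration API, for illustration only.
public enum LibrarySelectingPolicy { PreferCpu, PreferCuda }

public sealed class NativeLibraryConfig
{
    public static NativeLibraryConfig Instance { get; } = new();

    // Auto-download is disabled by default and must be opted into.
    public bool AutoDownload { get; private set; }
    public LibrarySelectingPolicy Policy { get; private set; } = LibrarySelectingPolicy.PreferCpu;

    public NativeLibraryConfig WithAutoDownload(bool enable = true)
    {
        AutoDownload = enable;
        return this;
    }

    public NativeLibraryConfig WithSelectingPolicy(LibrarySelectingPolicy policy)
    {
        Policy = policy;
        return this;
    }
}

// Usage (before any native library is loaded):
// NativeLibraryConfig.Instance.WithAutoDownload().WithSelectingPolicy(LibrarySelectingPolicy.PreferCuda);
```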
I assume it is not possible for a locally running chatbot to perform an online search during a conversation to retrieve current information, e.g. weather or news?
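One way this could be approached (a sketch only, with a hypothetical search endpoint and no claim about built-in support): the host application performs the search and splices the results into the prompt, so the local model itself never goes online.

```cs
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class OnlineSearchSketch
{
    // Builds a prompt that already contains fresh search results for the question.
    public static async Task<string> BuildAugmentedPromptAsync(HttpClient http, string question)
    {
        // Hypothetical search endpoint; any real web-search API would go here.
        var results = await http.GetStringAsync(
            "https://example.com/search?q=" + Uri.EscapeDataString(question));

        // The augmented prompt is then sent to the local executor as usual.
        return $"Use the following search results to answer the question.\n{results}\n\nQuestion: {question}";
    }
}
```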
### Background & Description

After testing, the default template does not work with the Qwen1.5 model; the output comes out garbled. After adding a custom template, the results are generated correctly. The question now is how question-answer templates for common models like this one should be integrated into the base LLamaSharp project.

```cs
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);
var ex = new InteractiveExecutor(context);
ChatSession session = new...
```
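A minimal sketch, assuming Qwen1.5's ChatML format, of how such a custom template could be expressed on top of LLamaSharp's `ChatHistory`; wiring it into `ChatSession` (for example via a history transform) is left out, and the exact `ChatHistory` members may differ slightly between versions.

```cs
using System.Text;
using LLama.Common;

public static class QwenChatMLFormatter
{
    // Renders a chat history into the ChatML layout Qwen1.5 expects.
    public static string HistoryToText(ChatHistory history)
    {
        var sb = new StringBuilder();
        foreach (var message in history.Messages)
        {
            // ChatML wraps every turn in <|im_start|>role ... <|im_end|> markers.
            var role = message.AuthorRole.ToString().ToLowerInvariant();
            sb.Append("<|im_start|>").Append(role).Append('\n')
              .Append(message.Content).Append("<|im_end|>\n");
        }
        // Leave an open assistant turn so the model continues from here.
        sb.Append("<|im_start|>assistant\n");
        return sb.ToString();
    }
}
```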
This PR should only be merged after #688. It supports dynamic native library loading (see the sketch after the checklist) but needs more testing.

### Checks

- [x] Windows .NET Framework 4.8 console app
- [...
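For context, a simplified sketch of the kind of dynamic loading being exercised here, using the standard .NET `NativeLibrary` APIs (available on .NET Core 3.0+/.NET 5+, so .NET Framework 4.8 needs a different mechanism); the library file names and search order below are assumptions.

```cs
using System;
using System.Reflection;
using System.Runtime.InteropServices;

public static class NativeLoaderSketch
{
    // Hook the resolver once, before any P/Invoke into the native library happens.
    public static void Install()
    {
        NativeLibrary.SetDllImportResolver(typeof(NativeLoaderSketch).Assembly, Resolve);
    }

    private static IntPtr Resolve(string libraryName, Assembly assembly, DllImportSearchPath? searchPath)
    {
        if (libraryName != "llama")
            return IntPtr.Zero; // fall back to the default resolver for other libraries

        // Try a few platform-specific candidates in order; the first one that loads wins.
        foreach (var candidate in new[] { "llama.dll", "libllama.so", "libllama.dylib" })
        {
            if (NativeLibrary.TryLoad(candidate, assembly, searchPath, out var handle))
                return handle;
        }
        return IntPtr.Zero;
    }
}
```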
### Description

How should I provide system information to the LLM so that it can use it when a question requires that information?
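One common approach (a sketch, not necessarily the recommended way in every version): put the information into a system message at the start of the chat history, so the model can draw on it for every answer.

```cs
using LLama.Common;

var history = new ChatHistory();
// Example facts only; put whatever system information the model should know here.
history.AddMessage(AuthorRole.System,
    "You are a helpful assistant. The user's machine runs Windows 11 with 32 GB RAM.");
history.AddMessage(AuthorRole.User, "How much memory do I have?");

// The history is then passed to the ChatSession / executor as usual; depending on the
// LLamaSharp version this may be a constructor argument or part of the first prompt.
```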
As discussed in #695, here's a PR that generates the shared library for Android. I have also added the new binaries to be copied in the CPU Backend `.nuspec` file....
Hi all, thanks to the community's effort, LLamaSharp now has much richer features than it did at the beginning. Meanwhile, the distribution of the backend packages may change soon. Therefore I think it's...
### Description

Sometimes unexpected behaviors might appear when the conversation is very long. We need to add unit tests for it.
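A sketch of what such a test could look like (xUnit; the model path, context size and turn count are placeholders): run many short turns and assert that the executor keeps producing output once the conversation grows well past the context size.

```cs
using System.Text;
using System.Threading.Tasks;
using LLama;
using LLama.Common;
using Xunit;

public class LongConversationTests
{
    [Fact]
    public async Task VeryLongConversation_DoesNotBreakInference()
    {
        // Deliberately small context so the conversation overflows it quickly.
        var parameters = new ModelParams("models/test-model.gguf") { ContextSize = 512 };
        using var weights = LLamaWeights.LoadFromFile(parameters);
        using var context = weights.CreateContext(parameters);
        var executor = new InteractiveExecutor(context);

        for (var turn = 0; turn < 50; turn++)
        {
            var output = new StringBuilder();
            await foreach (var token in executor.InferAsync($"Say something about topic {turn}.",
                               new InferenceParams { MaxTokens = 32 }))
            {
                output.Append(token);
            }
            // Every turn should still produce some output, even deep into the conversation.
            Assert.False(string.IsNullOrEmpty(output.ToString()));
        }
    }
}
```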
### Description

Currently, logs can be shown for debugging if native logging is enabled. However, to make things easier, it's better to add more information in the higher-level implementations. This...
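A hedged sketch of what "more information in higher-level implementations" could mean in practice: a thin wrapper around an executor that logs each request through `Microsoft.Extensions.Logging`; the executor method shape shown here is an assumption.

```cs
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using LLama;
using LLama.Common;
using Microsoft.Extensions.Logging;

public sealed class LoggingExecutor
{
    private readonly InteractiveExecutor _inner;
    private readonly ILogger _logger;

    public LoggingExecutor(InteractiveExecutor inner, ILogger<LoggingExecutor> logger)
    {
        _inner = inner;
        _logger = logger;
    }

    // Forwards inference to the wrapped executor while logging start/finish information.
    public async IAsyncEnumerable<string> InferAsync(
        string prompt, InferenceParams? inferenceParams = null,
        [EnumeratorCancellation] CancellationToken token = default)
    {
        _logger.LogInformation("Starting inference, prompt length = {Length}", prompt.Length);
        var pieces = 0;
        await foreach (var piece in _inner.InferAsync(prompt, inferenceParams, token))
        {
            pieces++;
            yield return piece;
        }
        _logger.LogInformation("Finished inference, generated {Count} pieces", pieces);
    }
}
```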