llama-cpp-agent

The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It allows users to chat with LLM models, execute structured function calls, and get structured output.

Results: 28 llama-cpp-agent issues, sorted by recently updated

In a multi-turn conversation I see that the combination of llama-cpp-python and llama-cpp-agent is much slower on the second prompt than the Python bindings of gpt4all. See the two screenshots...

Is there a way to stop inference manually, e.g. by returning False from the streaming_callback? If the user presses the stop button in a UI, how could that...
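
(A rough sketch of the pattern this question describes: a UI stop flag checked inside the streaming callback. Whether llama-cpp-agent actually honors a False return value is exactly what is being asked here, so the behavior assumed below is hypothetical.)

```python
import threading

# Hypothetical stop flag that a UI "stop" button would set.
stop_requested = threading.Event()


def streaming_callback(chunk) -> bool:
    # Print tokens as they stream in.
    print(chunk, end="", flush=True)
    # Return False once the user asked to stop -- assuming the framework
    # checks this return value, which is what this issue asks about.
    return not stop_requested.is_set()
```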

I updated my GUI to your new 0.2.2 version. It now works as long as I do not set top_p, top_k, or repeat_penalty. These give, e.g.: `File "/home/wolfgang/.local/lib/python3.10/site-packages/llama_cpp/_internals.py", line...`
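
(One way to narrow this down is to check whether the same sampler parameters work when passed straight to llama-cpp-python; a minimal sketch, with a hypothetical model path and llama-cpp-python's own parameter names:)

```python
from llama_cpp import Llama

# Hypothetical model path; top_k, top_p, and repeat_penalty are set directly
# on the bindings to see whether the failure is in the agent layer or below it.
llm = Llama(model_path="models/model.gguf", n_ctx=2048)

out = llm.create_completion(
    "Hello",
    max_tokens=32,
    top_k=40,
    top_p=0.95,
    repeat_penalty=1.1,
)
print(out["choices"][0]["text"])
```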

Thanks for this great project, which makes format constraints easy with llama-cpp. Where is the RAGColbertReranker usage? I'd like to try a usage example of the agent ability with the...

Hello. Our VMs already run an Ollama server and I want to reuse it with this project. I tried a few things but had no luck. Any suggestions would be appreciated. model...
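
(As a stopgap while native support is unclear, an existing Ollama server can be reached directly over its HTTP API; a minimal sketch, with the host and model name as assumptions:)

```python
import requests

# Hypothetical host and model; Ollama's chat endpoint listens on port 11434 by default.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3",
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["message"]["content"])
```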

I somehow got Llama 3 into a message loop after I said 'What'; it somehow got to `user` before activating message mode, and then it started talking to itself (which...

I was wondering if it is possible to pass a callback function (similar to `send_message_to_user_callback`) to `FunctionCallingAgent` that fires whenever a tool is used. My use case is that I...
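
(A workaround sketch, not a documented llama-cpp-agent API: wrap each tool so a user-supplied callback fires whenever the agent calls it. All names below are illustrative.)

```python
from functools import wraps
from typing import Callable


def with_tool_callback(tool: Callable, on_tool_use: Callable[[str, dict], None]) -> Callable:
    """Return a wrapped tool that notifies `on_tool_use` before running."""
    @wraps(tool)
    def wrapper(*args, **kwargs):
        on_tool_use(tool.__name__, kwargs)  # e.g. update the UI or log the call
        return tool(*args, **kwargs)
    return wrapper


# Example: a hypothetical tool passed to the agent in wrapped form.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"


wrapped_tool = with_tool_callback(
    get_weather, lambda name, kwargs: print(f"tool used: {name} {kwargs}")
)
```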

This issue is related to https://github.com/ggerganov/llama.cpp/issues/5130. It looks like the current version of `llama-cpp-agent` can't handle `Literal`. In case anyone faces this issue, the easy fix is to use `Enum`...
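
(A minimal sketch of that workaround with pydantic: replace the `Literal` field with an `Enum`, so the grammar generator only has to handle enum values. The model and field names are illustrative.)

```python
from enum import Enum
from pydantic import BaseModel


class Sentiment(str, Enum):
    positive = "positive"
    negative = "negative"
    neutral = "neutral"


class Review(BaseModel):
    # Instead of: sentiment: Literal["positive", "negative", "neutral"]
    sentiment: Sentiment
    summary: str
```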