
Remove logging chats

Open aiaicode opened this issue 1 year ago • 9 comments

./main -m model.gguf -c 512 -t 4 -b 1024 -n 256 --keep 48 --repeat_penalty 1.0 -i -r "User:" -f /folder/chat-with-bob.txt

On Linux, the above command logs the chat in the home directory as main.123456XX.log

  1. Is the log really needed? Could it be kept in RAM instead?
  2. Can it be deleted automatically before or after use?
  3. Are there other directories where logging happens?

aiaicode avatar Feb 22 '24 09:02 aiaicode

./main --help
  --log-disable         Disable trace logs
  --log-file            Specify a log filename (without extension)
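
For example, a sketch reusing these flags with the command from the original post (the model and prompt paths are just placeholders):

  # write no log file at all
  ./main -m model.gguf -i -r "User:" -f /folder/chat-with-bob.txt --log-disable

  # or pick the log location yourself (".log" is appended, per the "(without extension)" note above)
  ./main -m model.gguf -i -r "User:" -f /folder/chat-with-bob.txt --log-file /tmp/llama-chat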

supportend avatar Feb 22 '24 09:02 supportend

./main --help
  --log-disable         Disable trace logs
  --log-file            Specify a log filename (without extension)

Wouldn't it be better to disable it by default?

aiaicode avatar Feb 22 '24 09:02 aiaicode

Under Linux you could create an alias: alias llama='./yourdirectory/main --log-disable', but I never tried it. A script in the llama.cpp directory could do the same. And, of course, you could change the default in the source.
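
A minimal sketch of both ideas, assuming llama.cpp was built in ~/llama.cpp (untested, as noted above):

  # alias in ~/.bashrc
  alias llama='~/llama.cpp/main --log-disable'

  # or a wrapper script, e.g. saved as ~/llama.cpp/llama and made executable
  #!/bin/sh
  exec ~/llama.cpp/main --log-disable "$@"

Either way, the usual arguments (model, prompt file, etc.) can still be passed on the command line.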

supportend avatar Feb 22 '24 09:02 supportend

Wouldn't it be better to disable it by default?

In some cases it is better, in some cases it is not. I prefer it to be enabled by default.

ggerganov avatar Feb 22 '24 09:02 ggerganov

Wouldn't it be better to disable it by default?

In some cases it is better, in some cases it is not. I prefer it to be enabled by default.

@ggerganov I would vote against keeping it enabled by default.

Reasons being:

  1. I don't think users want to look at their previous chats/logs, or wish to save them.
  2. I don't think users would even know that their local chats are being saved.
  3. Other than developers wanting to look at logs (which you probably need for development), I don't know why it's needed.
  4. Major privacy concerns: if the system gets hacked or is used by someone other than the owner, all of their previous chats can be copied. The whole idea of using a locally running LLM is privacy.
  5. If I am not wrong, such logging is generally disabled by default in production environments.

This is a major privacy concern. Please consider disabling it by default.

Also, I'm only talking about the logs, not about a "Save previous chat" feature, which can exist in applications built on top of llama.cpp.

aiaicode avatar Feb 22 '24 11:02 aiaicode

@aiaicode It depends on your viewpoint of the main program: is it a complete piece of software or a testbed?

For us, main is more of a test implementation of llama.cpp (the real library). I believe that a normal user would prefer a graphical UI based on the llama.cpp library; there are already some listed in the README. Personally, I've recently tried out jan, and it's simple enough for non-coding users.

ngxson avatar Feb 22 '24 12:02 ngxson

@aiaicode It depends on your viewpoint of the main program: is it a complete piece of software or a testbed?

For us, main is more of a test implementation of llama.cpp (the real library). I believe that a normal user would prefer a graphical UI based on the llama.cpp library; there are already some listed in the README. Personally, I've recently tried out jan, and it's simple enough for non-coding users.

I'm a user of llama.cpp and I use it directly without any UI in CLI mode as mentioned in the readme of llama.cpp.

I was under the impression that closing the terminal after running the main program clears everything and the chat is private, but it turns out that's not the case. You never really know if a system is compromised, but disabling logging would make for a safer environment; otherwise someone can always track what a user is chatting about. A hack, physical theft of the hard disk, or hard disk recovery software can bring everything back even after formatting the drive.

As soon as the main program is run, the first line prints "Log start", after which 100 lines print and then interactive mode begins. The user never sees the "Log start" because it's printed at lightning speed and the cursor is already at the chat prompt. At the very least, the program should say "The chat will be saved locally in /folder/name" as part of the "== Running in interactive mode. ==" message.

If the UI / command line does not offer a way to go through previous chats, or clearly say "The chats will be saved locally; you can disable this with the --log-disable flag", then the chats shouldn't be saved in logs.

It's not just about me: I can disable it now, but other users of llama.cpp should know that their chats are saved locally on their machine. Even better, disable it by default.

aiaicode avatar Feb 22 '24 14:02 aiaicode

For any command-line program I look at the available options and adjust them to my needs, because the defaults are often not optimal for my use cases. Sometimes I disable logging, sometimes I leave it enabled; it's mostly a conscious decision.

supportend avatar Feb 22 '24 14:02 supportend

I'm a user of llama.cpp and I use it directly without any UI in CLI mode as mentioned in the readme of llama.cpp.

You are confusing "llama.cpp is a CLI program" with "llama.cpp is a library that other programs can use".

Also, just as a reminder, the source code of the main program lives in /examples/main. This literally indicates that the "CLI" you're talking about is an example, not a complete product, so extensive logging is expected.

ngxson avatar Feb 22 '24 14:02 ngxson

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] avatar Apr 07 '24 01:04 github-actions[bot]