Remove chat logging
./main -m model.gguf -c 512 -t 4 -b 1024 -n 256 --keep 48 --repeat_penalty 1.0 -i -r "User:" -f /folder/chat-with-bob.txt
On Linux, the above command logs the chat to the home directory as main.123456XX.log.
- Is the log really needed? Can it be kept in RAM instead?
- Can it be deleted automatically after use, or before? (A manual cleanup sketch follows this list.)
- Are there other directories where logging happens?
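If the goal is just to clean up logs that have already been written, a manual removal along these lines should work, assuming the main.*.log naming in the home directory shown above (untested; the -i flag makes rm confirm each deletion):
rm -i ~/main.*.log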
./main --help
--log-disable Disable trace logs
--log-file Specify a log filename (without extension)
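Those two flags can be appended to the chat command from above, for example (untested; model and prompt paths are placeholders as before):
./main -m model.gguf -c 512 -t 4 -b 1024 -n 256 --keep 48 --repeat_penalty 1.0 -i -r "User:" -f /folder/chat-with-bob.txt --log-disable
./main -m model.gguf -i -r "User:" -f /folder/chat-with-bob.txt --log-file mychat
The first form should suppress the log file entirely; the second should redirect it to a file named mychat, with the extension added by the program (presumably mychat.log).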
Wouldn't it be better to disable it by default?
Under Linux you could create an alias:
alias llama='./yourdirectory/main --log-disable'
But I never tried it.
A script in the llama.cpp directory could do the same (see the sketch below).
And sure, you could change the default in the source.
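A minimal sketch of that wrapper-script idea (untested; adjust ./yourdirectory to wherever your main binary actually lives):
#!/bin/sh
# llama-nolog: run main with trace logging disabled, passing all other arguments through
exec ./yourdirectory/main --log-disable "$@"
Save it as e.g. llama-nolog, make it executable with chmod +x llama-nolog, and call it in place of main.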
Wouldn't it be better to disable it by default?
In some cases it is better, in some cases it is not. I prefer it to be enabled by default.
@ggerganov I would vote against keeping it enabled by default.
Reasons being:
- I don't think users want to look at their previous chats/logs or wish to save them.
- I don't think users would even know that their local chats are being saved.
- Other than developers wanting to look at logs (which you probably need for development), I don't know why it's needed.
- Major privacy concerns. If the system gets hacked or is used by someone other than the owner, all of their previous chats can be copied; the whole idea of using a locally running LLM is privacy.
- If I am not wrong, such logging is generally disabled by default in production environments.
This is a major privacy concern. I request you to please consider disabling it by default.
Also, I'm only talking about logs and not about a "Save previous chat" feature, which could exist in applications built on top of llama.cpp.
@aiaicode It depends on your viewpoint of the main program: is it a complete piece of software or a testbed?
For us, main is more like a test implementation of llama.cpp (the real library). I believe that a normal user would prefer to use a graphical UI based on the llama.cpp library. Some of them are already listed in the README. Personally, I've recently tried out jan; it's simple enough for non-coding users.
I'm a user of llama.cpp and I use it directly without any UI in CLI mode as mentioned in the readme of llama.cpp.
I was under the impression that closing the terminal after running the main program clears everything and the chat is private, but it turns out that's not the case. You never really know if a system is compromised, but disabling logging would ensure a safer environment; otherwise anyone can track what a user is chatting about. A hack, physical theft of the hard disk, or hard-disk recovery software can bring back everything, even after formatting the drive.
As soon as the main program is run, the first line prints "Log start", after which 100 lines print and then interactive mode begins. The user never sees the "Log start" because it's printed at lightning speed and the cursor is already at the chat prompt. At the very least, the program should say "The chat will be saved locally in /folder/name in logs" under the == Running in interactive mode. == banner.
If the UI / command line is not showing an option to go through previous chats, or clearly saying "The chats will be saved locally; you can disable this with the --log-disable flag", then the chats shouldn't be saved in logs.
It's not just about me: I can disable it now, but other users of llama.cpp should know that their chats are saved locally on their machines. Even better, disable it by default.
For any command-line program I look at the available options and adjust them to my needs, because the defaults are often not optimal for my use cases. Sometimes I disable logging, sometimes it's enabled; mostly it's a conscious decision.
I'm a user of llama.cpp and I use it directly without any UI in CLI mode as mentioned in the readme of llama.cpp.
You are confusing "llama.cpp is a CLI program" with "llama.cpp is a library that other programs can use".
Also, just as a reminder, the source code of the main program is inside /examples/main. This literally indicates that the "CLI" you're talking about is indeed an example, not a complete product, so extensive logging is expected.
This issue was closed because it has been inactive for 14 days since being marked as stale.