nicosouth

Results: 6 comments by nicosouth

> Should I worry about this warning? `[WARNING] [stage3.py:1939:step] 13 pytorch allocator cache flushes since last step. this happens when there is high memory pressure and is detrimental to performance....

Did you solve the problem? I have the same problem.

OK, here is my script. I just added `--preprocessing_num_workers 4`:

```shell
model_name_or_path=/home/llm/model/Qwen1.5-1.8B
dataset_path=/home/llm/data/text_test/
output_dir=/home/llm/model/output_models/finetune
conversation_template=empty
trust_remote_code=True

while [[ $# -ge 1 ]]; do
  key="$1"
  case ${key} in
    -m|--model_name_or_path) model_name_or_path="$2"...
```

> > I use the ShuSheng dataset and convert the data into the format required by lmflow.
>
> What's the type of that dataset, is it `text_only`, `text2text`, or `conversation`?...
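For reference, the three types mentioned in the quote correspond to different JSON layouts expected by LMFlow. A minimal sketch of a `text_only` dataset file, assuming the standard `{"type": ..., "instances": [...]}` layout (the file name `train.json` and the sample sentences here are made up for illustration):

```python
import json

# Hypothetical example of LMFlow's `text_only` dataset layout:
# a top-level "type" field plus a list of instances, each holding
# a single "text" string. Sample contents are placeholders.
dataset = {
    "type": "text_only",
    "instances": [
        {"text": "Example training sentence one."},
        {"text": "Example training sentence two."},
    ],
}

# Write the dataset to disk in the JSON form LMFlow would load.
with open("train.json", "w", encoding="utf-8") as f:
    json.dump(dataset, f, ensure_ascii=False, indent=2)
```

A `text2text` file would follow the same shape, with each instance carrying `input`/`output` fields instead of a single `text` field.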