3 comments of shidingz
I later found out that it was because my batch size (bs) was set to 1. If it is 1, images_aux in the dataset is a list; torch.stack is only...
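A minimal sketch of the point being made, assuming the problem is that with bs = 1 the dataset returns images_aux as a plain Python list rather than a batched tensor; the function name and guard logic below are hypothetical illustrations, not the repository's code:

```python
# Hypothetical collate guard: torch.stack only accepts a sequence of tensors
# with identical shapes, so a bs=1 list has to be handled explicitly.
import torch

def collate_images_aux(images_aux):
    """Return images_aux as a single batched (B, C, H, W) tensor."""
    if isinstance(images_aux, torch.Tensor):
        # Already batched; nothing to do.
        return images_aux
    if isinstance(images_aux, (list, tuple)):
        # All entries must be tensors of the same shape for torch.stack.
        if all(isinstance(x, torch.Tensor) and x.shape == images_aux[0].shape
               for x in images_aux):
            return torch.stack(list(images_aux), dim=0)
        raise ValueError("images_aux entries have mismatched shapes; "
                         "cannot stack into a single batch tensor")
    raise TypeError(f"unexpected images_aux type: {type(images_aux)}")
```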
I found that the llama3 template has some problems: multi-turn conversations trigger WARNING: tokenization mismatch. In def preprocess_llama_3( sources, tokenizer: transformers.PreTrainedTokenizer, has_image: bool = False ) -> Dict:, is this part of the code incorrect? # include for all rounds cur_len = 1 target[:cur_len] = IGNORE_INDEX for...
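For context, the quoted lines follow the common LLaVA-style masking pattern: each conversation round is re-tokenized on its own, cur_len is advanced by the per-round lengths, and the warning fires when those lengths do not add up to the length of the fully tokenized conversation. The sketch below is a hedged illustration of that mechanism (names and the -2 offset are illustrative of the general pattern, not the repository's exact preprocess_llama_3):

```python
# Illustrative sketch of the masking loop behind "WARNING: tokenization mismatch".
import torch

IGNORE_INDEX = -100

def mask_targets(rounds, sep, tokenizer, target):
    """Mask everything except assistant replies; warn when lengths disagree."""
    total_len = int(target.ne(tokenizer.pad_token_id).sum())

    cur_len = 1                      # skip the leading special token
    target[:cur_len] = IGNORE_INDEX
    for rou in rounds:
        if rou == "":
            break
        parts = rou.split(sep)       # parts[0]: prompt up to the assistant header
        if len(parts) != 2:
            break
        parts[0] += sep

        # Per-round lengths come from re-tokenizing the round in isolation;
        # with the llama3 template's special tokens these can drift from the
        # full-conversation tokenization in multi-turn dialogues.
        round_len = len(tokenizer(rou).input_ids)
        instruction_len = len(tokenizer(parts[0]).input_ids) - 2

        # Mask the user/instruction portion of this round.
        target[cur_len:cur_len + instruction_len] = IGNORE_INDEX
        cur_len += round_len

    target[cur_len:] = IGNORE_INDEX
    if cur_len != total_len:
        # This is the condition that produces the warning in the issue above.
        print(f"WARNING: tokenization mismatch: {cur_len} vs. {total_len}")
        target[:] = IGNORE_INDEX
    return target
```

If the per-round token counts differ from the single-pass tokenization of the whole conversation (for example because the template's header tokens are counted once per round), cur_len overshoots or undershoots total_len and the whole sample is masked out.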