```
from itertools import chain

def group_texts(examples):
    # Concatenate all texts.
    # print(type(examples))
    concatenated_examples = {k: list(chain(*examples[k])) for k in examples.keys()}
    total_length = len(concatenated_examples[list(examples.keys())[0]])
    # We drop the small remainder, we could add padding...
```
`concatenated_examples['input_ids']` is a one-dimensional list.
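For reference, a minimal standalone sketch (toy token lists, not real tokenizer output) of why the value comes out flat: `chain(*examples[k])` unpacks each example's list in the batch and joins them end to end, so the per-example nesting disappears.

```
from itertools import chain

# Toy batch: two already-tokenized examples (values are made up).
examples = {
    "input_ids": [[101, 7592, 102], [101, 2088, 999, 102]],
    "attention_mask": [[1, 1, 1], [1, 1, 1, 1]],
}

# Same dict comprehension as in group_texts().
concatenated_examples = {k: list(chain(*examples[k])) for k in examples.keys()}

print(concatenated_examples["input_ids"])       # [101, 7592, 102, 101, 2088, 999, 102]
print(len(concatenated_examples["input_ids"]))  # 7 -> one flat list per key
```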
[BUG] Qwen3 MoE with FSDP2 meets `torch.utils.checkpoint.CheckpointError` when `offload_policy=True`
I hit the same problem when running SFT on qwen3-30B-moe with FSDP2.
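For context, a minimal sketch of the kind of setup that seems to trigger this: FSDP2's `fully_shard` with CPU offload combined with per-layer activation checkpointing. This is only an assumption about what `offload_policy=True` maps to (`CPUOffloadPolicy()` in the composable FSDP API), and `model.layers` is a placeholder for the decoder-block list, not the reporter's exact SFT configuration.

```
# Hypothetical reproduction sketch, NOT the reporter's exact setup.
# Assumes offload_policy=True corresponds to CPUOffloadPolicy() in FSDP2.
# Import paths are for recent PyTorch (2.6+); older releases expose these
# under torch.distributed._composable.fsdp instead.
import torch
from torch.distributed.fsdp import fully_shard, CPUOffloadPolicy
from torch.distributed.algorithms._checkpoint.checkpoint_wrapper import checkpoint_wrapper


def shard_with_offload_and_ac(model: torch.nn.Module) -> torch.nn.Module:
    offload = CPUOffloadPolicy()
    # model.layers is a placeholder ModuleList of decoder blocks.
    for layer_id, block in enumerate(list(model.layers)):
        # Activation checkpointing inside, FSDP2 sharding (with CPU offload) outside.
        wrapped = checkpoint_wrapper(block)
        model.layers[layer_id] = wrapped
        fully_shard(wrapped, offload_policy=offload)
    # Shard the remaining parameters (embeddings, lm head) at the root.
    fully_shard(model, offload_policy=offload)
    return model
```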