limbo92
Facing the same issue! deepspeed 0.9.5, transformers 4.29.2, peft 0.4.0.dev0, CUDA Version: 11.4
> I got this error when I downgraded the transformers version from 4.30.2 to 4.29.2 to fix another issue (Found `optimizer` configured in the DeepSpeed config, but no `scheduler`). When I revert...
@kanslor I added this to the DeepSpeed config file: `"scheduler": { "type": "WarmupLR", "params": { "warmup_min_lr": "auto", "warmup_max_lr": "auto", "warmup_num_steps": "auto" } }` I also ran into an OOM error, so I gave up on the official fine-tuning method. I...
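For reference, here is a minimal sketch of how that `scheduler` block sits inside a DeepSpeed config JSON when used with the HuggingFace Trainer integration (the `"auto"` values are resolved from the Trainer's own arguments at launch time; any other top-level keys in your config stay as they are):

```json
{
  "scheduler": {
    "type": "WarmupLR",
    "params": {
      "warmup_min_lr": "auto",
      "warmup_max_lr": "auto",
      "warmup_num_steps": "auto"
    }
  }
}
```

This addresses the "Found `optimizer` configured in the DeepSpeed config, but no `scheduler`" error by making the scheduler explicit; it does not affect the separate OOM issue.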