Shurui

4 comments by Shurui

> ### Feature request / 功能建议
>
> I currently plan to fine-tune on 4x V100 16G GPUs, but it throws an error; inference works fine. According to the tutorial at https://github.com/THUDM/CogVLM2/blob/main/finetune_demo/README_zh.md, each GPU needs more than 57G of memory. Could multi-GPU fine-tuning be supported later?
>
> ### Motivation / 动机
>
> Multi-GPU fine-tuning
>
> ### Your contribution / 您的贡献
>
> None

Hello, I'm currently deploying CogVLM2 on a V100, but the output is consistently: Floating...

> > try this: `pip install transformers==4.40`
>
> I tried using transformers 4.40.0; whether I use quant8 or quant4, the following problem is thrown: ...

torch 2.0.0+cu117, transformers 4.40.0, bitsandbytes 0.43.1

I found that the V100 doesn't support bf16, while the demo uses fp16. I hope this helps.
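As a minimal sketch of how to guard against this, PyTorch exposes `torch.cuda.is_bf16_supported()`, so the load dtype can be chosen at runtime instead of hard-coding bf16 (which a V100, compute capability 7.0, cannot run natively):

```python
import torch

# V100 (compute capability 7.0) has no native bf16 support;
# fall back to fp16 there to avoid garbage ("Floating...") output.
if torch.cuda.is_available() and torch.cuda.is_bf16_supported():
    dtype = torch.bfloat16
else:
    dtype = torch.float16

print(dtype)
```

The resulting `dtype` can then be passed as `torch_dtype=dtype` when loading the model with `from_pretrained`.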