Haixia

Results: 3 issues by Haixia

### 🐛 Describe the bug In stage 1 and stage 3, the saved training model is very large (about 13 GB) even when training with PEFT. I have set...

bug
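A checkpoint of that size usually means the full (merged) model was written out rather than only the adapter weights. Below is a minimal sketch of the difference, assuming LoRA via the `peft` library; the model name, target modules, and output paths are hypothetical, not taken from the issue.

```python
# Sketch only: adapter-only vs. full-model saving with peft/LoRA.
# "base-model", target_modules, and the output paths are placeholders.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("base-model")
lora_cfg = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"])
model = get_peft_model(base, lora_cfg)

# ... training loop ...

# Saves only the adapter weights (typically a small fraction of the base model).
model.save_pretrained("outputs/adapter-only")

# Merging the adapter into the base weights and saving yields a checkpoint
# roughly the size of the original base model.
merged = model.merge_and_unload()
merged.save_pretrained("outputs/full-model")
```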

### 🐛 Describe the bug When I use the "colossalai_zero" strategy to train the RM model, it spends a lot of time loading the optimizer. I am very...

bug

Hello, I have been using your CodeGeeX VS Code plugin and its performance is truly amazing! Thank you for sharing such great work!! However, I still have a question: when I download codegeex2-6b and try to run inference, the model does not follow instructions to complete tasks. **Could you tell me how the prompt should be set when running inference with the model weights?** Example: asking the model to add code comments

```python
import os
os.environ['CUDA_VISIBLE_DEVICES'] = '1'

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("/localFile/codegeex2-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("/localFile/codegeex2-6b", device='cuda', trust_remote_code=True)  # for CPU inference, use device='cpu'
model = model.eval()
prompt = ...
```
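For reference, a minimal sketch of passing a completion-style prompt to the loaded `tokenizer` and `model` from the snippet above, using standard transformers generation calls; the comment-style prompt (including the `# language: Python` tag) and the generation parameters are assumptions here, since the exact recommended format is what the question asks about.

```python
# Sketch only: comment-style prompt for a code-completion model.
# The "# language: ..." tag and generation settings are assumptions.
prompt = (
    "# language: Python\n"
    "# Add comments to the following function:\n"
    "def add(a, b):\n"
    "    return a + b\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```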