CogVLM
Can it be fine-tuned with QLoRA or quant 4/8?
I only have an A40 (48 GB); how can I finetune the model? Even with LoRA I still run into "CUDA out of memory", so I want to ask whether it can be fine-tuned with QLoRA or quant 4/8.
A follow-up question: when I try to quantize CogAgent with quant 4, it reports a dimension-mismatch error:

```
File "/root/miniconda3/envs/CogVLM/lib/python3.10/site-packages/sat/model/finetune/lora2.py", line 97, in __init__
    self.original.weight.data.copy_(original_obj.weight.data.detach().clone())
RuntimeError: The size of tensor a (4096) must match the size of tensor b (2048) at non-singleton dimension 1
```
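For context, the 4096-vs-2048 mismatch is consistent with how 4-bit quantization typically stores weights: two 4-bit values are packed into each int8 byte, so the packed weight's last dimension is half the original, and the LoRA mixin's direct weight copy no longer lines up. The sketch below only illustrates that packing idea; `pack_int4` is hypothetical and not the actual sat/cpm kernel.

```python
import torch

def pack_int4(weight: torch.Tensor) -> torch.Tensor:
    """Illustrative 4-bit packing: two 4-bit values share one int8 byte,
    halving the last dimension (hypothetical, not sat's real implementation)."""
    # Per-row symmetric quantization to integers in [-8, 7].
    scale = weight.abs().amax(dim=1, keepdim=True).clamp(min=1e-6) / 7.0
    q = torch.clamp(torch.round(weight / scale), -8, 7).to(torch.int8)
    # Keep only the low nibble of each value, then pack pairs into one byte.
    nibbles = (q & 0x0F).to(torch.uint8)
    packed = nibbles[:, 0::2] | (nibbles[:, 1::2] << 4)
    return packed

w = torch.randn(4096, 4096)   # shape of the original full-precision weight
print(pack_int4(w).shape)     # torch.Size([4096, 2048]) -- the "2048" in the error
```

Whatever the exact scheme sat uses, a `copy_` between a full-width weight (4096) and a half-width packed weight (2048) would raise exactly this kind of size mismatch.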
There is no way to use QLoRA.