5 comments by Shkklt

File "/home/xxxx/.cache/huggingface/modules/transformers_modules/chatglm2/modeling_chatglm.py", line 438, in forward
    context_layer = self.core_attention(query_layer, key_layer, value_layer, attention_mask)
File "/home/xxxx/anaconda3/envs/linglong0.1/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
File "/home/xxxx/.cache/huggingface/modules/transformers_modules/chatglm2/modeling_chatglm.py", line 228, in forward
    context_layer = torch.nn.functional.scaled_dot_product_attention(query_layer,...
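The traceback ends inside `torch.nn.functional.scaled_dot_product_attention`, i.e. the attention math itself. As a debugging aid, here is a pure-Python reference of what that call computes, softmax(QK^T / sqrt(d)) V; this is a minimal sketch of the math only, not the torch kernel, and the list-of-lists representation and boolean-mask convention are my own simplifications for checking shapes and mask semantics by hand.

```python
import math

def scaled_dot_product_attention(q, k, v, mask=None):
    """Pure-Python reference for softmax(Q K^T / sqrt(d)) V.

    q, k, v: lists of vectors (seq_len x head_dim).
    mask: optional seq_len x seq_len booleans, True = position may attend
    (a simplification of torch's additive/boolean mask conventions).
    """
    d = len(q[0])
    out = []
    for i, qi in enumerate(q):
        # Attention scores for query i against every key, scaled by sqrt(d).
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        if mask is not None:
            # Masked positions get -inf so softmax gives them zero weight.
            scores = [s if mask[i][j] else float("-inf")
                      for j, s in enumerate(scores)]
        # Numerically stable softmax over the scores.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Weighted sum of the value vectors.
        out.append([sum(w * vj[t] for w, vj in zip(weights, v))
                    for t in range(len(v[0]))])
    return out
```

With one-hot value vectors, each output row is a convex combination of the values, so its entries sum to 1 and lean toward the key most similar to the query.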

> [ptonlix](/ptonlix) changed the title from ~New version ChatZhipuAI~ to "community:ChatZhipuAI: ADAPTS to new version zhipuai library" on [Jan 24, 2024](#event-11572610109)

Where is the code, boss?

In ./anaconda3/lib/python3.8/site-packages/deep_training/nlp/models/lora/configuration.py the default is CONFIG_NAME = "adapter_config.json", not config.json. I changed it, but the compiled file didn't take effect...
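When an edit to a file under site-packages seems to have no effect, a common cause is that the running interpreter is importing a different copy of the package (another conda env, a stale cached `.pyc`, or an editable install). A quick, generic check, sketched here with the stdlib `json` module standing in for the package in question, is to ask Python which file it actually resolves:

```python
import importlib.util

# Resolve which file Python would import for a given module name.
# "json" is used here only as a stand-in; substitute the module you edited
# (e.g. "deep_training.nlp.models.lora.configuration") in a real check.
spec = importlib.util.find_spec("json")
print(spec.origin)  # absolute path of the source file that will be imported
```

If the printed path is not the file you edited, the edit can never take effect in that environment; clearing `__pycache__` directories next to the module is another easy thing to rule out.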

Has this issue been resolved? transformers 4.44.0 also raises this error.