ChatGLM-6B
[BUG/Help] <lm_head.weight missing at load time>
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
Why is the weight `lm_head.weight` missing when the model is loaded?
@mojianhao @AdamBear @jsl9208 @cjld
It appears in neither `model.named_parameters()` nor `model.named_buffers()`. Does a weight like this, created via `skip_init`, stay randomly initialized and receive no gradient updates?
Expected Behavior
`lm_head.weight` should be loaded with the model (or its absence explained), and it should be clear whether a weight that appears in neither `model.named_parameters()` nor `model.named_buffers()` still receives gradient updates after `skip_init`.
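To make the question concrete, here is a minimal sketch (not ChatGLM's actual code; `Toy` is a hypothetical module) showing the three cases: a registered parameter, a registered buffer, and a tensor stored as a plain attribute. Only the first two are visible to `named_parameters()`/`named_buffers()`; a plain attribute gets no gradients and is not saved in the state dict. It also shows that `torch.nn.utils.skip_init` only skips *value* initialization, the weight is still registered as a parameter:

```python
import torch
from torch import nn

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 4)                     # registered parameter: trained and saved
        self.register_buffer("scale", torch.ones(4))  # registered buffer: saved, no gradients
        self.plain = torch.zeros(4)                   # plain attribute: no gradients, not saved

m = Toy()
param_names = {n for n, _ in m.named_parameters()}
buffer_names = {n for n, _ in m.named_buffers()}
print("fc.weight" in param_names)                          # True
print("scale" in buffer_names)                             # True
print("plain" in param_names or "plain" in buffer_names)   # False

# skip_init still registers weight/bias as parameters; only their
# values are left uninitialized until loaded or trained.
lazy = nn.utils.skip_init(nn.Linear, 4, 4)
print("weight" in dict(lazy.named_parameters()))           # True
```

So if `lm_head.weight` shows up in neither collection, it would indeed receive no gradient updates; that is a separate issue from `skip_init`, which leaves the parameter registered but uninitialized.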
Steps To Reproduce
- Load the model
Environment
- OS:linux
- Python:3.8
- Transformers:4.27
- PyTorch:1.10
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :
Anything else?
No response