ChatGLM-6B
[BUG/Help] How much GPU memory does full-parameter finetuning require?
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
With batch size 1, training does not fit in the 80 GB of an A100.
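A rough estimate of why even batch size 1 can overflow 80 GB: full-parameter training with a typical mixed-precision Adam setup keeps several copies of every parameter. The byte counts below are assumptions about a common recipe, not the repo's actual training configuration:

```python
# Back-of-the-envelope memory estimate for full-parameter finetuning
# of a ~6B-parameter model. All numbers are assumptions about a
# typical mixed-precision Adam recipe, not ChatGLM-6B's exact setup.

PARAMS = 6e9        # approximate parameter count of ChatGLM-6B
GB = 1024 ** 3

# Bytes per parameter under a common mixed-precision Adam recipe:
#   fp16 weights (2) + fp16 gradients (2) + fp32 master weights (4)
#   + fp32 momentum (4) + fp32 variance (4)
bytes_per_param = 2 + 2 + 4 + 4 + 4   # = 16 bytes/param

total_gb = PARAMS * bytes_per_param / GB
print(f"~{total_gb:.0f} GB before activations")
```

Under these assumptions the weights, gradients, and optimizer state alone already approach or exceed a single A100's 80 GB before any activation memory, which is why full finetuning of a model this size usually relies on multiple GPUs or optimizer-state sharding.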
Expected Behavior
No response
Steps To Reproduce
NONE
Environment
- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :
Anything else?
No response