[Bug] Need to import torch
Describe the bug
It is a small problem. When I follow the README section "Import from Transformers" to initialize the model, an error occurs (NameError: name 'torch' is not defined), so torch needs to be imported first.
Also, I would like to know whether internlm-chat-7b can be loaded across multiple GPUs, since my machine has 8 × 11 GB GPUs.
>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> tokenizer = AutoTokenizer.from_pretrained("internlm/internlm-chat-7b", trust_remote_code=True)
>>> model = AutoModelForCausalLM.from_pretrained("internlm/internlm-chat-7b", torch_dtype=torch.float16, trust_remote_code=True).cuda()
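For reference, a minimal sketch of a possible workaround (not an official fix): add the missing import torch, and, assuming the accelerate package is installed, pass device_map="auto" instead of calling .cuda() so the fp16 weights (~14 GB) can be sharded across the eight 11 GB GPUs rather than placed on a single device.

# Hypothetical sketch, assuming `accelerate` is installed
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained(
    "internlm/internlm-chat-7b", trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    "internlm/internlm-chat-7b",
    torch_dtype=torch.float16,
    trust_remote_code=True,
    device_map="auto",  # shard layers across all visible GPUs instead of .cuda()
)
model = model.eval()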
Environment
torch 1.13.1+cu117
torch-scatter 2.1.1+pt113cu117
torchaudio 0.13.1+cu117
torchvision 0.14.1+cu117
Other information
No response