I trained a BLOOM-7B model, but inference gives me an error.
This is the code:
batch = tokenizer("Two things are infinite: ", return_tensors="pt")
with torch.cuda.amp.autocast():
    output_tokens = model.generate(**batch, max_new_tokens=50)
print("\n\n", tokenizer.decode(output_tokens[0], skip_special_tokens=True))
The error is:
AttributeError: 'NoneType' object has no attribute 'device'
Please see https://github.com/huggingface/peft/issues/115, it's probably the same thing.
In my case (using an alpaca model) I had to add a device_map parameter:
model = PeftModel.from_pretrained(
    model,
    alpaca_path,
    torch_dtype=torch.float16,
    device_map={"": 0},
)
Please see #115, it's probably the same thing. In my case (using an alpaca model) I had to add a device_map parameter:

model = LlamaForCausalLM.from_pretrained(
    model_path,
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(
    model,
    alpaca_path,
    torch_dtype=torch.float16,
    device_map={"": 0},
)
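For context on why `device_map={"": 0}` places the whole model on GPU 0: accelerate resolves each module's device by matching its dotted name against the keys of the map, and the empty-string key matches every module. Below is a simplified, self-contained sketch of that lookup logic (an illustration only, not accelerate's actual implementation; the function name `resolve_device` is made up for this example):

```python
def resolve_device(module_name, device_map):
    """Return the device for a module, using longest-prefix match.

    The empty-string key "" is a prefix of every module name, so
    device_map={"": 0} sends every module to device 0. Simplified
    sketch; the real dispatch logic lives in the accelerate library.
    """
    best_key = None
    for key in device_map:
        if key == "" or module_name == key or module_name.startswith(key + "."):
            if best_key is None or len(key) > len(best_key):
                best_key = key
    if best_key is None:
        raise KeyError(f"no device found for {module_name!r}")
    return device_map[best_key]

# With {"": 0}, every module resolves to GPU 0:
print(resolve_device("model.layers.0.self_attn", {"": 0}))  # 0
# A finer-grained map can pin a submodule elsewhere:
print(resolve_device("model.layers.0", {"": 0, "model.layers": "cpu"}))  # cpu
```

This is also why omitting `device_map` in `PeftModel.from_pretrained` can leave adapter weights without a placement and trigger the `'NoneType' object has no attribute 'device'` error above.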
Thank you.
Sorry, I'm confused about the second one. You load two models; which one should I follow?
Sorry, it should be the second one; I'll edit the original comment.