
How do I make predictions after saving the model to my local machine?

Open rv-ltran opened this issue 3 years ago • 0 comments

I tried to save the model to my local machine and make predictions from it. However, the text generated by the saved model is not as expected. Do you know the correct way to do this?

Code:

import os
import torch
from transformers import AutoTokenizer, AutoModel
from trl.core import respond_to_batch

# Save and reload from the same directory
# (I originally mixed 'gpt2-imdb-ctrl' and '/gpt2-imdb-ctrl').
os.makedirs('gpt2-imdb-ctrl', exist_ok=True)
gpt2_model.save_pretrained('gpt2-imdb-ctrl')
gpt2_tokenizer.save_pretrained('gpt2-imdb-ctrl')

model = AutoModel.from_pretrained('gpt2-imdb-ctrl')
tokenizer = AutoTokenizer.from_pretrained('gpt2-imdb-ctrl')

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
_ = model.to(device)

input_string = "[negative] And it's one of the rare old films that you still fall asleep in a while"
input_tokens = tokenizer.encode(input_string, return_tensors="pt").to(device)

response_tensors = respond_to_batch(model, input_tokens, txt_len=30)
response_string = tokenizer.decode(response_tensors[0, :])
response_string

Results:

'rararararararararararararararararararararararararararararara'
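One possible cause (a guess, since this issue has no replies): `AutoModel.from_pretrained` loads GPT-2 as a bare transformer without its language-modeling head, so anything generated from it is effectively noise, which can look like a single repeated token. A minimal sketch of reloading the checkpoint with `AutoModelForCausalLM` (which keeps the LM head) and sampling with the standard `generate` API; the directory path, prompt handling, and sampling settings here are illustrative, not the confirmed fix:

```python
# Sketch: reload a locally saved GPT-2 checkpoint with its LM head intact
# and generate a continuation. Assumes the checkpoint directory was written
# by save_pretrained as in the question.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def load_and_generate(model_dir: str, prompt: str, max_new_tokens: int = 30) -> str:
    """Load a fine-tuned GPT-2 checkpoint from model_dir and continue prompt."""
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    # AutoModelForCausalLM keeps the language-modeling head that AutoModel drops.
    model = AutoModelForCausalLM.from_pretrained(model_dir).to(device)
    model.eval()

    input_ids = tokenizer.encode(prompt, return_tensors="pt").to(device)
    with torch.no_grad():
        output_ids = model.generate(
            input_ids,
            max_new_tokens=max_new_tokens,
            do_sample=True,
            pad_token_id=tokenizer.eos_token_id,
        )
    # Return only the newly generated continuation, not the prompt.
    return tokenizer.decode(output_ids[0, input_ids.shape[1]:])
```

Usage would mirror the snippet in the question, e.g. `load_and_generate('gpt2-imdb-ctrl', "[negative] And it's one of the rare old films...")`.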

rv-ltran — Apr 13 '22 20:04