A stack size error with HH/main.py
When I run HH/main.py with the following setting:
parser.add_argument('--modelname', type=str, default="./models/vicuna-7b-v1.5")
I get the following error:

Exception has occurred: RuntimeError
stack expects each tensor to be equal size, but got [18] at entry 0 and [22] at entry 1
File "/Buddha/cjs/CTG/RAIN/RAIN-main/HH/main.py", line 492, in
    for epoch_test, batch_test in tqdm(enumerate(testloader)):
I searched this error online and learned that I need to set the tokenizer's max_length and padding. Could you please provide the max_length and padding settings you used for the tokenizer, so that I can reproduce the results reported in your paper?
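For reference, this is the kind of configuration I am asking about. It is only a sketch: the max_length of 2048, the left padding side, and the use of padding="max_length" are my own guesses, not the settings from the paper.

```python
# Sketch only: 2048 and left padding are placeholders, not the paper's settings.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./models/vicuna-7b-v1.5")
tokenizer.pad_token = tokenizer.eos_token   # LLaMA-based tokenizers usually define no pad token
tokenizer.padding_side = "left"             # guess; unsure which side you used

prompts = ["example prompt one", "a second, somewhat longer example prompt"]

encoded = tokenizer(
    prompts,
    padding="max_length",   # pad every sequence to the same length
    truncation=True,
    max_length=2048,        # placeholder value
    return_tensors="pt",    # equal-size tensors, so the default collate can stack them
)
print(encoded["input_ids"].shape)  # (2, 2048)
```

With padding="max_length" (or padding=True together with a per-batch collate_fn) every item in a batch has the same length, so the torch.stack call inside the DataLoader's default collation should no longer fail.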