PPLM
Unequal dimensions between new_accumulated_hidden and the MLP matrix in the classifier
When I run PPLM with both bow and discrim enabled ('technology' and 'sentiment', respectively), new_accumulated_hidden.shape[1] is 768 but the emb_size of the classifier's MLP is 1024, so the dimensions are inconsistent in the matmul inside pplm_classification_head and I get
RuntimeError: size mismatch, m1: [1 x 768], m2: [1024 x 5] when computing the loss for the perturbed text.
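The mismatch can be reproduced in isolation with a minimal sketch (the layer sizes below are taken from the error message; the variable names are hypothetical, not the actual PPLM code): a linear head built for a 1024-dimensional embedding cannot consume a 768-dimensional hidden state, which is exactly the GPT-2 small vs. GPT-2 medium hidden-size difference.

```python
import torch

# Hypothetical reconstruction of the shape mismatch:
# a classification head trained against GPT-2 medium (hidden size 1024)
# being fed hidden states from GPT-2 small (hidden size 768).
classifier_head = torch.nn.Linear(1024, 5)  # emb_size=1024, 5 sentiment classes
new_accumulated_hidden = torch.randn(1, 768)  # GPT-2 small hidden state

try:
    logits = classifier_head(new_accumulated_hidden)
except RuntimeError as e:
    # Fails: the matmul needs the input's last dim to equal 1024.
    print("shape mismatch:", e)

# With a matching hidden size the head works as expected.
logits = classifier_head(torch.randn(1, 1024))
print(logits.shape)  # (1, 5)
```

So the fix is to make the two hidden sizes agree: either load the GPT-2 variant the discriminator was trained on, or retrain the discriminator head with emb_size matching the model in use.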
Please correct me if I'm missing something. Thank you very much for your help.
I suspect this might have to do with using a GPT-2 model of the wrong size.
So I may need to train a sentiment classifier with a shape compatible with the GPT-2 model I'm using. Thanks for your reply!