
'PrefixTuningTemplate' object has no attribute 'n_head'

Open · SuperChanS opened this issue on Feb 28, 2023 · 2 comments

Here is my code for trying to use PrefixTuningTemplate:

import torch
from openprompt.data_utils.conditional_generation_dataset import WebNLGProcessor
from openprompt.plms import load_plm
from openprompt.prompts.prefix_tuning_template import PrefixTuningTemplate

plm, tokenizer, model_config, WrapperClass = load_plm('opt', "facebook/opt-125m")
mytemplate = PrefixTuningTemplate(model=plm,  tokenizer=tokenizer, text=' {"placeholder":"text_a"} {"special": "<eos>"} {"mask"} ', using_decoder_past_key_values=True)

An error is encountered:

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ in <module>                                                                                      │
│                                                                                                  │
│   1 from openprompt.prompts.prefix_tuning_template import PrefixTuningTemplate                   │
│ ❱ 2 mytemplate = PrefixTuningTemplate(model=plm,  tokenizer=tokenizer, text=' {"placeholder"     │
│   3                                                                                              │
│                                                                                                  │
│ /root/gpt_exp/OpenPrompt/openprompt/prompts/prefix_tuning_template.py:77 in __init__             │
│                                                                                                  │
│    74 │   │   │   self.n_head = self.config.n_head                                               │
│    75 │   │   │   self.match_n_decoder_layer = self.n_decoder_layer                              │
│    76 │   │   self.mid_dim = mid_dim                                                             │
│ ❱  77 │   │   self.match_n_head = self.n_head                                                    │
│    78 │   │   self.match_n_embd = self.n_embd // self.n_head                                     │
│    79 │   │   self.prefix_dropout = prefix_dropout                                               │
│    80 │   │   self.dropout = nn.Dropout(self.prefix_dropout)                                     │
│                                                                                                  │
│ /home/kg/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py:1208 in __getattr__    │
│                                                                                                  │
│   1205 │   │   │   if name in modules:                                                           │
│   1206 │   │   │   │   return modules[name]                                                      │
│   1207 │   │   raise AttributeError("'{}' object has no attribute '{}'".format(                  │
│ ❱ 1208 │   │   │   type(self).__name__, name))                                                   │
│   1209 │                                                                                         │
│   1210 │   def __setattr__(self, name: str, value: Union[Tensor, 'Module']) -> None:             │
│   1211 │   │   def remove_from(*dicts_or_sets):                                                  │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯

SuperChanS · Feb 28, 2023

@SuperChanS were you able to fix it?

tresiwald · Mar 15, 2023

Hi, currently the __init__ function in PrefixTuningTemplate cannot handle OPT; you may need to modify it yourself. Pull requests are welcome.
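
For context: facebook/opt-125m ships with an OPTConfig, which apparently matches none of the model-specific config branches in __init__, so self.n_head is never assigned before line 77 reads it. OPT exposes the relevant sizes under the standard Hugging Face attribute names rather than the GPT-2 style ones, which a quick check (independent of OpenPrompt) shows:

from transformers import AutoConfig

config = AutoConfig.from_pretrained("facebook/opt-125m")
print(type(config).__name__)       # OPTConfig
print(config.num_attention_heads)  # 12 for opt-125m
print(config.hidden_size)          # 768 for opt-125m
print(config.num_hidden_layers)    # 12 for opt-125m
# The GPT-2 style aliases the template reads are not defined on OPTConfig:
print(hasattr(config, "n_head"), hasattr(config, "n_embd"))  # False False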

https://github.com/thunlp/OpenPrompt/blob/8756a8cf5af4e7e01db3d612d74cb4cd1d927aba/openprompt/prompts/prefix_tuning_template.py#L64
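
For anyone who wants to attempt that modification, one possible starting point is to add an OPTConfig branch next to the existing ones at the linked line. This is an untested sketch based only on the attribute names visible in the traceback above and the standard OPTConfig fields, not an official fix:

# openprompt/prompts/prefix_tuning_template.py, inside PrefixTuningTemplate.__init__
# (sketch only; requires `from transformers import OPTConfig` at the top of the file
#  and keeps the existing config branches as they are)
elif isinstance(self.config, OPTConfig):
    # Map OPT's standard config attribute names onto the names this template expects.
    self.n_decoder_layer = self.config.num_hidden_layers
    self.n_embd = self.config.hidden_size
    self.n_head = self.config.num_attention_heads
    self.match_n_decoder_layer = self.n_decoder_layer

Depending on how the generated prefix past_key_values are consumed later in the template and in the model wrapper, further adjustments beyond __init__ may still be needed, since OPT's decoder layout is not identical to GPT-2's.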

yulinchen99 · Mar 30, 2023