bigprince97

Results 3 comments of bigprince97

If the input is English, it displays correctly, but it can't display Chinese text at all.

```python
class MyBertMLM(lit_model.Model):
  MASK_TOKEN = "[MASK]"

  @property
  def num_layers(self):
    return self.model.config.num_hidden_layers

  @property
  def max_seq_length(self):
    return self.model.config.max_position_embeddings

  def __init__(self, model_name="bert-base-chinese", top_k=10):
    super().__init__()
    self.tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)
    self.model = transformers.BertForMaskedLM.from_pretrained(
        model_name, output_hidden_states=True, output_attentions=True)
    ...
```

I also attempted to change attention_module.ts, but it didn't work.
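One way to narrow down a bug like this is to check whether Chinese characters survive each stage of the pipeline, to separate "the model can't handle Chinese" from "the frontend can't render it". The helper below is a minimal, self-contained sketch (not part of the original report, and `contains_cjk` is a hypothetical name) that flags characters in the main CJK Unified Ideographs block, which you could use to log what actually reaches the UI:

```python
def contains_cjk(text: str) -> bool:
    """Return True if any character falls in the main CJK Unified
    Ideographs block (U+4E00..U+9FFF)."""
    return any("\u4e00" <= ch <= "\u9fff" for ch in text)

print(contains_cjk("hello"))        # prints False: English only
print(contains_cjk("今天天气很好"))   # prints True: Chinese sentence
```

If the tokens coming out of the model wrapper still contain CJK characters but the attention view shows nothing, the problem is likely in the frontend rendering rather than in the tokenizer or model.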