ghozn

3 comments by ghozn

pytorch-lightning 1.4.9, torch 1.7.0, transformers 4.3.3 — these are the package versions I installed, and I get the same error. Could you please provide a detailed requirements list? Thanks!

```python
dyn_axis_general = {0: "batch", 1: "sequence"}
dyn_axis = {
    "input_ids": dyn_axis_general,
    "encoder_attention_mask": dyn_axis_general,
    "encoder_hidden_states": dyn_axis_general,
    "logits": dyn_axis_general,
    "output_past_key_values": dyn_axis_general,
}
```

The input_ids, encoder_attention_mask, encoder_hidden_states in the init-decoder should not have...
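For context, here is a minimal sketch of how a dynamic-axes dict like the one above is typically passed to `torch.onnx.export`. The `DummyDecoder` module, dummy tensor shapes, and output file name are placeholders, not the project's actual export code; `output_past_key_values` is omitted because the stub only returns logits.

```python
import torch
import torch.nn as nn

class DummyDecoder(nn.Module):
    """Stand-in for the init-decoder; the real model comes from the project."""
    def __init__(self, hidden=768, vocab=32000):
        super().__init__()
        self.proj = nn.Linear(hidden, vocab)

    def forward(self, input_ids, encoder_attention_mask, encoder_hidden_states):
        # This stub ignores input_ids/mask and just projects the hidden states to logits.
        return self.proj(encoder_hidden_states)

dyn_axis_general = {0: "batch", 1: "sequence"}
dyn_axis = {
    "input_ids": dyn_axis_general,
    "encoder_attention_mask": dyn_axis_general,
    "encoder_hidden_states": dyn_axis_general,
    "logits": dyn_axis_general,
}

model = DummyDecoder()
dummy_input_ids = torch.ones(1, 8, dtype=torch.long)
dummy_mask = torch.ones(1, 8, dtype=torch.long)
dummy_hidden = torch.randn(1, 8, 768)  # illustrative shapes only

torch.onnx.export(
    model,
    (dummy_input_ids, dummy_mask, dummy_hidden),
    "init_decoder.onnx",
    input_names=["input_ids", "encoder_attention_mask", "encoder_hidden_states"],
    output_names=["logits"],
    dynamic_axes=dyn_axis,  # batch and sequence dims stay dynamic in the exported graph
    opset_version=12,
)
```

Note that each key in `dynamic_axes` has to match an entry in `input_names` or `output_names` for the dynamic shape to be applied to that tensor.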

```
Running 6 test cases...
unknown location:0: fatal error: in "TrieAll": lm::FormatLoadException: A 2-gram has context 0 so this context must appear in the model as a 1-gram but it does...
```