RyanHuangNLP
I want to load the pretrained checkpoint to continue training on my own corpus. I use the `run_pretrain.py` code and set init_checkpoint to the pretrain dir, but when I run...
How do I use the pretrained RoBERTa checkpoint? I'm not sure whether to reuse RoBERTa's pretrained position embeddings.
Does lightseq support Triton as the backend for inference?
I use the run_pretrain.py code and point model_dir to my trained model, but it throws an exception: Key bert/embedding/Layernorm/beta/lamb_m not found. ``` ERROR:tensorflow:Error recorded from training_loop: Restoring from checkpoint failed. This is most likely due to a Variable name or other graph key that is missing from...
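A "Key .../lamb_m not found" error usually means the restore is also looking for the LAMB optimizer's slot variables (its momentum terms), which a weights-only pretrained checkpoint does not contain. A minimal sketch of one common workaround, filtering the variable list to restore only model weights — the function name and suffix list here are illustrative, not UER-py's or TensorFlow's actual API:

```python
def restorable_variables(var_names):
    """Keep model weight names, dropping LAMB optimizer slot variables
    (lamb_m / lamb_v) and the training step counter, which are absent
    from a weights-only pretrained checkpoint."""
    skip_suffixes = ("/lamb_m", "/lamb_v", "global_step")
    return [n for n in var_names if not n.endswith(skip_suffixes)]
```

Restoring only the filtered names (e.g. via an assignment map or a scaffold init function) lets the optimizer slots be freshly initialized instead of loaded.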
```
Starting 4 workers for building datasets ...
Traceback (most recent call last):
  File "preprocess.py", line 133, in <module>
    main()
  File "preprocess.py", line 128, in main
    dataset.build_and_save(args.processes_num)
  File "UER-py-master/uer/utils/data.py", line 221, in...
```
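For context, `build_and_save(workers_num)` splits the corpus across worker processes before preprocessing. A minimal sketch of that pattern, loosely modelled on the call in the traceback — `build_shard` and the whitespace tokenization are hypothetical stand-ins, not UER-py's actual implementation:

```python
from multiprocessing import Pool

def build_shard(args):
    """Process one slice of the corpus (here: trivially whitespace-split)."""
    shard_id, lines = args
    return shard_id, [line.split() for line in lines]

def build_and_save(lines, workers_num):
    """Divide the corpus into workers_num shards and process them in parallel,
    then reassemble the results in shard order."""
    shard_size = (len(lines) + workers_num - 1) // workers_num
    shards = [(i, lines[i * shard_size:(i + 1) * shard_size])
              for i in range(workers_num)]
    with Pool(workers_num) as pool:
        results = dict(pool.map(build_shard, shards))
    return [example for i in sorted(results) for example in results[i]]
```

Errors raised inside a worker surface in the parent's traceback like the one above, so the failing line in `uer/utils/data.py` is the place to look first.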
[ernie-3.0](https://github.com/PaddlePaddle/PaddleNLP/tree/develop/model_zoo/ernie-3.0)
The WNLI format is known to be hard for models to learn, so RoBERTa applies the WNLI trick, which uses the WSC data from SuperGLUE and extracts candidates from the schemas, on...