

FantasyBert

English | 中文

Introduction

An easy-to-use framework for BERT models, with trainers, various NLP tasks, and detailed annotations.

You can implement various NLP tasks conveniently with built-in training features such as adversarial training, fp16, gradient clipping, R-Drop, and early stopping.
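As a rough illustration of two of these techniques (not FantasyBert's actual API), here is how gradient clipping and early stopping look in a plain PyTorch training loop; all names below are standard PyTorch, not this library:

```python
# Minimal sketch of gradient clipping and early stopping in plain PyTorch.
# This is NOT FantasyBert's API; it only illustrates the underlying techniques.
import torch
import torch.nn as nn

model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x, y = torch.randn(16, 4), torch.randn(16, 1)

best_loss = float("inf")
patience, bad_epochs = 3, 0  # stop after 3 epochs without improvement
for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # Gradient clipping: cap the global gradient norm at 1.0
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()

    # Early stopping: break once the loss stops improving
    if loss.item() < best_loss - 1e-4:
        best_loss, bad_epochs = loss.item(), 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break
```

Frameworks like FantasyBert wrap logic of this kind inside the trainer so you only toggle options instead of writing the loop yourself.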

Installation

pip install fantasybert

The latest version is 0.1.3.

Tutorials

tutorial

Task in examples

semantic text similarity

Datasets and Results

The datasets are downloaded from CLUE.

| Model | TNEWS | IFLYTEK | AFQMC | CMNLI | CMRC | CLUENER |
| --- | --- | --- | --- | --- | --- | --- |
| BERT_pub | 56.84 | 59.43 | 74.07 | 80.42 | 73.95 | 78.82 |
| BERT_our | 56.63 | .. | 72.75 | .. | .. | 79.669 |

For detailed results, please see here.

BERT_pub denotes the public results reported by CLUE; BERT_our denotes the results obtained with FantasyBert. Both use the chinese-bert-wwm-ext pretrained model, and predictions on the test sets are evaluated on CLUE; more of CLUE's public results are here.

Others

Some code is adapted from transformers (tokenization) and fastNLP (trainer); I simplified the code and added some new functions.

The models part directly uses the pretrained models from transformers. I tried writing the BERT models myself, as in bert4pytorch, but due to time constraints and my own limitations, I could not match the quality and efficiency of transformers.

This project is for private use only and is not intended for commercial use.