charlesfufu
Do you know what could be causing these problems?
Why does the RoBERTa code use BertConfig, BertForSequenceClassification, and BertTokenizer throughout instead of RobertaConfig, RobertaForSequenceClassification, and RobertaTokenizer? Is there any difference between them?
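A minimal sketch of why this happens, assuming the checkpoint in question is a Chinese RoBERTa released in BERT format (the model name "hfl/chinese-roberta-wwm-ext" below is only an illustration, not necessarily the one used here): such checkpoints keep BERT's WordPiece vocabulary and weight layout, so they are loaded with the Bert* classes, while the Roberta* classes expect a byte-level BPE tokenizer and a different embedding scheme.

```python
# Sketch only: assumes a BERT-format Chinese RoBERTa checkpoint such as
# "hfl/chinese-roberta-wwm-ext" (chosen here purely for illustration).
from transformers import BertConfig, BertTokenizer, BertForSequenceClassification

model_name = "hfl/chinese-roberta-wwm-ext"

# These checkpoints reuse BERT's WordPiece vocab and weight names,
# so the Bert* classes load them correctly.
config = BertConfig.from_pretrained(model_name, num_labels=2)
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForSequenceClassification.from_pretrained(model_name, config=config)

# RobertaTokenizer would look for byte-level BPE files (vocab.json / merges.txt)
# that this kind of checkpoint does not ship, which is why the Roberta* classes fail.
```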
Hello, you said that the model pre-training step means training on top of the Chinese pre-trained model provided by Google, adding the data on hand, and then training the...
Why does LoggingTensorHook not produce any output?
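A common reason LoggingTensorHook prints nothing is that TensorFlow's logging verbosity is still at the default WARN level, or that the tensor names passed to `tensors` do not exist in the graph. A minimal sketch under those assumptions (the tensor name "loss:0" is a placeholder, substitute the actual name from your graph):

```python
import tensorflow as tf

# LoggingTensorHook writes via tf.logging at INFO level; if verbosity is left
# at the default WARN, nothing appears even though the hook runs.
tf.compat.v1.logging.set_verbosity(tf.compat.v1.logging.INFO)

# The keys are display labels; the values must be names of tensors that actually
# exist in the graph ("loss:0" here is hypothetical -- verify the real name).
logging_hook = tf.estimator.LoggingTensorHook(
    tensors={"loss": "loss:0"},
    every_n_iter=100,
)

# The hook only fires if it is actually passed to training, e.g.:
# estimator.train(input_fn=train_input_fn, hooks=[logging_hook])
```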
The accuracy is too low.


Is the pre-trained model this depends on the same as the one officially provided by BERT? If I work with other languages, can I directly download the corresponding multilingual model from the official BERT repository?
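If the project simply consumes a standard BERT checkpoint, then swapping in Google's official multilingual release should work. A minimal sketch, assuming the Hugging Face mirror "bert-base-multilingual-cased" of Google's multilingual model (the original TF checkpoints from the BERT repo can be used in the same way):

```python
# Sketch only: assumes the project loads a standard BERT checkpoint, so the
# official multilingual release can be dropped in as a replacement.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")

# Quick sanity check that the multilingual vocab covers the target language:
print(tokenizer.tokenize("Bonjour, comment allez-vous ?"))
```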
After running the model, I found the results are not very promising and fall short of the state of the art. The model's accuracy is only 50%. Do you know what's main...
The kg_demo_movie_mapping.ttl file is missing the definition of the Comedian class from the ontology, as well as the object property "hasActor".
For multi-label classification, how do you evaluate taking the top-k highest-probability labels as the predicted classes? For example, you take the 5 labels with the highest probabilities here. How is this 5 defined or justified? Why not take the top 3 instead?
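The k in top-k is a hyperparameter; a common choice is the average number of labels per example in the training data, or whatever value maximizes a validation metric such as micro/macro F1. An alternative is a per-label probability threshold, so the number of predicted labels varies per example. A minimal sketch comparing the two (the values k=3 and threshold 0.5 below are illustrative, not values from this project):

```python
import numpy as np

def top_k_labels(probs, k=5):
    """Take the k highest-probability labels as the prediction (k is a tunable choice)."""
    return np.argsort(probs)[::-1][:k]

def threshold_labels(probs, threshold=0.5):
    """Alternative: keep every label whose probability clears a threshold,
    so the number of predicted labels can differ across examples."""
    return np.where(probs >= threshold)[0]

# Illustrative scores for 8 labels (made-up numbers):
probs = np.array([0.91, 0.05, 0.62, 0.40, 0.77, 0.12, 0.55, 0.30])
print(top_k_labels(probs, k=3))        # -> [0 4 2]
print(threshold_labels(probs, 0.5))    # -> [0 2 4 6]

# In practice k (or the threshold) is chosen by sweeping candidate values on a
# validation set and keeping whichever maximizes the chosen F1 metric.
```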