george

Results: 4 issues of george

Hello, a question about loading BERT with TensorFlow: is the workflow to first convert the BERT weights to `.npz` format and then load them with `from_npz`? As I understand it, `from_torch` and `from_pretrained` both load PyTorch checkpoints; is that correct? Also, is ONNX still the only way to accelerate GPT-2?
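I'm not sure of the library's exact `from_npz` signature, but the `.npz` round trip itself can be sketched with plain numpy; the key names and shapes below are made up for illustration, not real BERT checkpoint keys:

```python
import io
import numpy as np

# Minimal sketch of the npz round trip the question describes: dump
# named weight arrays to .npz, then read them back for TF assignment.
# Key names and shapes are illustrative, not real BERT keys.
weights = {
    "embeddings.word_embeddings": np.random.rand(100, 8).astype(np.float32),
    "encoder.layer.0.query.weight": np.random.rand(8, 8).astype(np.float32),
}

buf = io.BytesIO()          # stands in for a real bert_weights.npz file
np.savez(buf, **weights)
buf.seek(0)

loaded = np.load(buf)       # dict-like, keyed by the same names
assert set(loaded.files) == set(weights)
assert np.array_equal(loaded["encoder.layer.0.query.weight"],
                      weights["encoder.layer.0.query.weight"])
```

A TF model would then iterate over `loaded.files` and assign each array to the matching variable.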

I adapted this from interact.py; generation uses the simplest probability sampling. My single-turn chat TF code is:

```python
import tensorflow as tf
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer4 = AutoTokenizer.from_pretrained("vocab/", sep_token="[SEP]",
                                           pad_token="[PAD]", cls_token="[CLS]")
model4 = AutoModelForCausalLM.from_pretrained("model/", from_tf=True)  # model/ holds the .bin model file
sentence = "你好啊"
text_ids = tokenizer4.encode(sentence, add_special_tokens=False)
input_ids = [tokenizer4.cls_token_id]
input_ids.extend(text_ids)
input_ids.append(tokenizer4.sep_token_id)
input_ids = tf.constant(input_ids)              # (None,)
input_ids = tf.expand_dims(input_ids, axis=0)   # (1, None)
max_len = 50
response...
```
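The "simplest probability sampling" step the snippet alludes to can be sketched in numpy (the function name and the use of numpy instead of TF ops are my own; the real loop would operate on the model's logits tensor):

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=None):
    """Plain multinomial sampling over the vocabulary distribution.

    `logits` is a 1-D array of unnormalized scores; this is a numpy
    stand-in for the TF ops the chat loop would call each step.
    """
    rng = rng or np.random.default_rng()
    z = np.asarray(logits, dtype=np.float64) / temperature
    z -= z.max()                        # stabilize the softmax
    probs = np.exp(z) / np.exp(z).sum()
    return int(rng.choice(len(probs), p=probs))

# With one dominant logit, sampling almost always picks that token.
token = sample_next_token([0.0, 0.0, 50.0], rng=np.random.default_rng(0))
```

Each generation step would append the sampled id to `input_ids` and stop at `max_len` or on the SEP token.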

May I use a BERT-like model to load the parameters of pre-trained SDCUP, then add a head on top for the table QA task? When I look into pre-trained SDCUP, can I...
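The "add a head on top" idea can be sketched as a linear layer over the encoder's pooled [CLS] vector; this is a hypothetical numpy illustration, and `hidden_size` / `num_labels` are placeholder values, since the real dimensions would come from the SDCUP checkpoint's config:

```python
import numpy as np

class TableQAHead:
    """Hypothetical task head stacked on a frozen encoder's [CLS] vector.

    hidden_size and num_labels are illustrative defaults; the real
    values would come from the pre-trained SDCUP config.
    """
    def __init__(self, hidden_size=768, num_labels=2, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((hidden_size, num_labels)) * 0.02
        self.b = np.zeros(num_labels)

    def __call__(self, cls_vec):
        # cls_vec: (batch, hidden_size) pooled encoder output
        return cls_vec @ self.w + self.b

head = TableQAHead(hidden_size=16, num_labels=3)
logits = head(np.zeros((4, 16)))   # shape (4, 3)
```

Fine-tuning would then train the head (and optionally the encoder) on the table QA labels.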

Hello, if the data does not start from zero, it doesn't work.