Yifeng Zhang
We have multiple models and need to run generic inference, so we load several models into one process. Loading any single model on its own works fine, but after loading one model, loading a second one fails whenever either of them uses a lod_tensor.

The loading code is as follows:

    place = fluid.CPUPlace()
    exe = fluid.Executor(place)
    [inference_program, _, fetch_targets] = (
        fluid.io.load_inference_model(dirname=model_path[0],
                                      executor=exe,
                                      model_filename=model_path[1],
                                      params_filename=params_path[1]))

The error is:

    Error Occurs, info:enforce version == 0U failed, 1015534361 != 0...
The file channel tracks transferred events and uses a transactional mechanism to make transfers recoverable. However, it increases CPU cost due to frequent system calls such as write and read. The CPU cost...
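To make the trade-off concrete, here is a minimal sketch of the idea behind a file-backed channel: every put and take is appended to a log and fsynced, so pending events can be recovered by replaying the log after a crash. This is an illustrative toy, not Flume's actual implementation; the class and file names are invented for the example. It shows why durability costs CPU: each event incurs write and fsync system calls.

```python
import os
import tempfile

class FileChannel:
    """Toy file-backed channel: each operation is logged and fsynced
    so that pending events survive a crash. The per-event write/fsync
    syscalls are the CPU cost the text above refers to."""

    def __init__(self, path):
        self.path = path
        self.log = open(path, "a+b")
        self.queue = []

    def put(self, event: bytes):
        # One write + one fsync per event: durable but syscall-heavy.
        self.log.write(b"PUT " + event + b"\n")
        self.log.flush()
        os.fsync(self.log.fileno())
        self.queue.append(event)

    def take(self) -> bytes:
        event = self.queue.pop(0)
        # Record the take so replay does not re-deliver this event.
        self.log.write(b"TAKE " + event + b"\n")
        self.log.flush()
        os.fsync(self.log.fileno())
        return event

    @staticmethod
    def replay(path):
        # Rebuild the pending queue from the log after a crash.
        pending = []
        with open(path, "rb") as f:
            for line in f:
                op, _, event = line.rstrip(b"\n").partition(b" ")
                if op == b"PUT":
                    pending.append(event)
                elif op == b"TAKE" and event in pending:
                    pending.remove(event)
        return pending

path = os.path.join(tempfile.mkdtemp(), "channel.log")
ch = FileChannel(path)
ch.put(b"e1")
ch.put(b"e2")
ch.take()
print(FileChannel.replay(path))  # only the untaken event remains pending
```

A memory channel skips the log entirely, which is faster but loses events on failure; that is the recoverability-versus-CPU trade-off described above.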