ONNX export bug
When using this snippet to export the yolov3 asff model:
# Initiate model
if args.asff:
    if backbone == 'mobile':
        from models.yolov3_mobilev2 import YOLOv3
        print("For mobilenet, we currently don't support dropblock, rfb and FeatureAdaption")
    else:
        from models.yolov3_asff import YOLOv3
        print('Training YOLOv3 with ASFF!')
    model = YOLOv3(num_classes=num_class, rfb=args.rfb, asff=args.asff)
else:
    if backbone == 'mobile':
        from models.yolov3_mobilev2 import YOLOv3
    else:
        from models.yolov3_baseline import YOLOv3
        print('Training YOLOv3 strong baseline!')
    model = YOLOv3(num_classes=num_class, rfb=args.rfb)

if args.checkpoint:
    print("loading pytorch ckpt...", args.checkpoint)
    cpu_device = torch.device("cpu")
    ckpt = torch.load(args.checkpoint, map_location=cpu_device)
    # model.load_state_dict(ckpt, strict=False)
    model.load_state_dict(ckpt)

if cuda:
    print("using cuda")
    torch.backends.cudnn.benchmark = True
    device = torch.device("cuda")
    model = model.to(device)

if args.half:
    model = model.half()

model = model.eval()
print('model is ready..')

# dummy input of shape (1, 3, 800, 800)
dummy_input = torch.zeros((3, 800, 800)).to(device).unsqueeze(0)
onnx_f = 'weights/yolov3_asff.onnx'
torch.onnx.export(model, dummy_input, onnx_f, verbose=True)
print('onnx exported successfully.')
The ONNX model exports successfully, but the resulting graph contains only 2 constants. Does anyone know why? (You can also try exporting yolov3 asff to ONNX with this script; if you hit the same issue, please leave a comment.)
I also ran into trouble when exporting to ONNX; it looks like the same problem as yours.
@xuezhongcailian Did you find out why?
Maybe ONNX doesn't support the apex operators and the DCN module. It seems the DCN module only has a CUDA implementation.
In https://github.com/ruinmessi/ASFF/blob/master/models/yolov3_head.py (line 134), change "return refined_pred.data" to "return refined_pred" when you convert the .pth to ONNX.
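The fix above works because `.data` detaches a tensor from the autograd/trace graph: the exporter traces the model, and once the output is detached it can no longer follow the computation that produced it, so the traced graph can collapse into bare constants. A minimal sketch of the detaching behavior (the tensors here are just illustrative):

```python
import torch

# A tensor that participates in the autograd graph.
x = torch.ones(2, 2, requires_grad=True)
y = x * 2

print(y.requires_grad)           # True: y is connected to the graph
print(y.data.requires_grad)      # False: .data detaches the tensor
print(y.detach().requires_grad)  # False: detach() does the same, explicitly
```

Returning the still-attached tensor (`refined_pred` instead of `refined_pred.data`) keeps the computation visible to the tracer, which is what `torch.onnx.export` needs.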