
UNETR number of FLOPs does not match ptflops

Open · Amshaker opened this issue · 0 comments

Hello,

The number of FLOPs for an input size of 96x96x96 is reported in Table 5 of the UNETR paper as 41.19 GFLOPs. However, I measured it with ptflops and got 82.68 GMacs, as follows:

from ptflops import get_model_complexity_info

# ptflops takes the input resolution without the batch dimension: (C, D, H, W)
macs, params = get_model_complexity_info(model, (1, 96, 96, 96), as_strings=True,
                                         print_per_layer_stat=True, verbose=True)

Based on ptflops, UNETR comes to 82.68 GMacs. In the paper, it seems that you calculated the number of FLOPs as 0.5 × 82.68 ≈ 41.34, close to the reported 41.19 GFLOPs.
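For reference, the arithmetic behind that guess (the 1 MAC = 2 FLOPs convention below is a common assumption, not something the paper states):

    # ptflops reports multiply-accumulate operations (MACs), not FLOPs.
    ptflops_gmacs = 82.68
    print(ptflops_gmacs * 2)    # 165.36 GFLOPs under the usual 1 MAC = 2 FLOPs convention
    print(ptflops_gmacs * 0.5)  # 41.34, which is close to the paper's 41.19 GFLOPs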

When I calculate the FLOPs using fvcore as follows:

    import torch
    from fvcore.nn import FlopCountAnalysis

    # Dummy input with the same dtype/device as the model parameters;
    # input_res is (C, D, H, W) and a batch dimension of 1 is prepended.
    input_res = (1, 96, 96, 96)
    dummy_input = torch.ones(()).new_empty((1, *input_res),
                                           dtype=next(model.parameters()).dtype,
                                           device=next(model.parameters()).device)
    flops = FlopCountAnalysis(model, dummy_input)
    model_flops = flops.total()
    print(f"MAdds: {round(model_flops * 1e-9, 2)} G")

It gives 73.68 GMAdds (FLOPs).
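To see what fvcore actually counted, its per-operator breakdown can help (a sketch using fvcore's reporting helpers; fvcore counts one fused multiply-add as one FLOP and skips operators it has no handler for, which may account for part of the 82.68 vs 73.68 gap):

    from fvcore.nn import flop_count_table

    # Per-module table and per-operator totals of what fvcore counted;
    # operators without a registered handler are skipped (with a warning).
    print(flop_count_table(flops, max_depth=2))
    print(flops.by_operator())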

I am using the default hyperparameters from the README file:

from monai.networks.nets import UNETR  # import path assumed (MONAI's UNETR)

model = UNETR(
    in_channels=1,
    out_channels=14,
    img_size=(96, 96, 96),
    feature_size=16,
    hidden_size=768,
    mlp_dim=3072,
    num_heads=12,
    pos_embed='perceptron',
    norm_name='instance',
    conv_block=True,
    res_block=True,
    dropout_rate=0.0)
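As a quick sanity check that the model is instantiated as above (a minimal sketch, assuming the MONAI UNETR implementation):

    # Count trainable parameters of the instantiated model.
    n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"Trainable params: {n_params / 1e6:.2f} M")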

Could you please clarify this, or release the code you used to calculate the FLOPs?

Thank you.

— Amshaker, Jul 13, 2022