shlrao

2 comments by shlrao

> I have two questions about FLOPs.
>
> (1) [here](https://github.com/microsoft/Swin-Transformer/blob/main/models/swin_transformer.py#L583) `flops += self.num_features * self.patches_resolution[0] * self.patches_resolution[1] // (2 ** self.num_layers)` I think this is the FLOPs for the norm....
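To make the quoted line concrete, here is a minimal sketch that evaluates it under assumed Swin-T defaults (these values are assumptions, not stated in the comment): `patches_resolution` of 56×56 (image size 224, patch size 4), `num_layers = 4`, and `num_features = 96 * 2**(num_layers - 1) = 768`. It also counts the elements a norm over the final 7×7 feature map would actually touch, for comparison.

```python
# Hedged sketch: evaluating the quoted FLOPs line under assumed
# Swin-T defaults. These configuration values are assumptions.
num_layers = 4
num_features = 96 * 2 ** (num_layers - 1)   # 768
patches_resolution = (56, 56)               # 224 / patch_size 4

# The line quoted in the comment:
flops = (num_features * patches_resolution[0] * patches_resolution[1]
         // (2 ** num_layers))
print(flops)  # 768 * 56 * 56 // 16 = 150528

# For comparison: the final stage's feature map is downsampled by
# 2 ** (num_layers - 1) per spatial dimension, i.e. 7 x 7 here,
# so a norm over it touches 7 * 7 * 768 elements.
final_res = patches_resolution[0] // 2 ** (num_layers - 1)  # 7
print(final_res * final_res * num_features)  # 37632
```

The sketch only reproduces the arithmetic the question refers to; whether the expression is the intended count for the norm is exactly what the comment asks.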

> Currently, only num_bits=8 is supported. Is there a plan to support other bit widths? Some networks occasionally require different bit widths.
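For context on what "supporting different bit widths" would mean, here is a generic sketch of uniform fake-quantization parameterized by `num_bits`. This is not taken from any specific library; the function name and min/max range calibration are illustrative assumptions.

```python
# Hedged sketch (illustrative, not any library's API): uniform
# fake-quantization where num_bits is a free parameter.
import numpy as np

def fake_quantize(x, num_bits=8):
    """Quantize x to 2**num_bits levels over its own range, then dequantize."""
    qmax = 2 ** num_bits - 1
    lo, hi = float(x.min()), float(x.max())
    scale = (hi - lo) / qmax if hi > lo else 1.0
    q = np.clip(np.round((x - lo) / scale), 0, qmax)
    return q * scale + lo

x = np.linspace(-1.0, 1.0, 5)
print(fake_quantize(x, num_bits=8))  # nearly identical to x
print(fake_quantize(x, num_bits=2))  # collapsed onto 4 representable levels
```

Lowering `num_bits` shrinks the set of representable values, which is why networks that tolerate aggressive quantization in some layers may still need wider bit widths in others.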