Zhihao Chen
Actually, in our implementation we use conv3x3, BatchNorm2d, ReLU, Dropout(0.1), and conv1x1. We simplified the description in the paper. Sorry for the confusion.
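For reference, a minimal PyTorch sketch of that sequence (the `make_head` name and channel sizes here are placeholders, not the values used in the released code):

```python
import torch.nn as nn

def make_head(in_channels, mid_channels, out_channels):
    # conv3x3 -> BatchNorm2d -> ReLU -> Dropout(0.1) -> conv1x1, as described above
    return nn.Sequential(
        nn.Conv2d(in_channels, mid_channels, kernel_size=3, padding=1),  # conv3x3
        nn.BatchNorm2d(mid_channels),
        nn.ReLU(inplace=True),
        nn.Dropout(0.1),
        nn.Conv2d(mid_channels, out_channels, kernel_size=1),            # conv1x1
    )
```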
Fine, I will upload it within one week.
We use the function `utils.util.cal_subitizing` to generate the count labels, rather than labeling each image manually.
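As a rough illustration of the idea (this is my own sketch, not the code in `utils.util.cal_subitizing`; the binarization threshold and minimum-area ratio are assumptions, so please check the repo function for the exact rule):

```python
import numpy as np
from scipy import ndimage

def count_regions(mask, min_area_ratio=0.005):
    """Derive a count label from a ground-truth mask by counting
    connected foreground regions, ignoring very small components."""
    binary = (np.asarray(mask) > 127).astype(np.uint8)
    labeled, num = ndimage.label(binary)
    if num == 0:
        return 0
    # Sum of pixels per component, used to drop tiny (noise) regions.
    areas = ndimage.sum(binary, labeled, index=range(1, num + 1))
    min_area = min_area_ratio * binary.size
    return int((np.asarray(areas) >= min_area).sum())
```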
ok, I will
I have uploaded the unlabeled data collected from the internet, and I have also uploaded the USR dataset for your convenience.
Yes, ResNeXt101_fork is not necessary; it is a leftover from earlier experiments. I have already removed it, thank you.
Regarding the usefulness of unlabeled data, you can read the paper "A brief introduction to weakly supervised learning"; Figure 3 in that paper may give you some insight.
One type of semi-supervised learning method uses consistency losses between multiple models, so in that sense you can view it as a regularization method. For some evidence, you can read "Mean teachers...
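As a rough sketch of the idea (not our exact training code; the EMA rate, input noise, and use of MSE here are just illustrative choices in the spirit of the Mean Teacher paper):

```python
import torch
import torch.nn.functional as F

def update_teacher(student, teacher, alpha=0.99):
    # Teacher weights are an exponential moving average of the student weights.
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.data.mul_(alpha).add_(s_param.data, alpha=1 - alpha)

def consistency_loss(student, teacher, unlabeled_images, noise_std=0.1):
    # Penalize disagreement between the two models on unlabeled images;
    # no ground-truth labels are needed for this term.
    student_pred = student(unlabeled_images + noise_std * torch.randn_like(unlabeled_images))
    with torch.no_grad():
        teacher_pred = teacher(unlabeled_images + noise_std * torch.randn_like(unlabeled_images))
    return F.mse_loss(torch.sigmoid(student_pred), torch.sigmoid(teacher_pred))
```

The consistency term acts on unlabeled data only, which is why it can be read as a regularizer added on top of the supervised loss.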
OK, we will update it. For now, you can delete those lines as a quick workaround.
I have updated some files. The old ones were the ablation version, which I uploaded by mistake; the current files are the final version.