Top-1 Loc, Top-5 Loc, and backbones (DenseNet161 and EfficientNet-B7)
Hello! First of all, congratulations on the great work! I have three questions about running the code.
- First, in the case of WSOL, I didn't find a test file (test.py) in the code. Are the results on the CUB-200-2011 test set and the ImageNet-1k validation set reported in the paper computed by the test function (def test) in train_ccam_cub.py?

- Second, I could not find code that computes the Top-1 Loc and Top-5 Loc metrics mentioned in the paper, so my results on the CUB-200-2011 dataset do not include them. How can I solve this?

- Third, C2AM (Ours) in Table 1 uses DenseNet161 and EfficientNet-B7 as backbones. I did not find code that downloads and swaps in these two backbone networks, so I could not reproduce the C2AM (Ours) results in Table 1. How can I solve this?

Hi,
- This repository only provides the training code for CCAM (train_CCAM_CUB.py / train_CCAM_ILSVRC.py; the test code is included in them).
- After training CCAM, you can extract class-agnostic bounding boxes from CCAM and re-train a regressor using these pseudo bboxes. Please refer to Sec. 3.4 in our paper.
- You can adopt DenseNet as the backbone for localization and EfficientNet for classification (just following other works' settings, e.g. PSOL). Code is provided here. I will try my best to open my re-training code in this repository.
- Btw, the simplest way to get Top-1 Loc and Top-5 Loc is to directly combine the class-agnostic bboxes estimated by CCAM with the classification results from a classifier. For the Top-1/Top-5 Loc code, you can refer to my other repository: https://github.com/Sierkinhane/ORNet/blob/59e8ef5e461a5b00ca8c94d9fa0f5e58df193774/train_2nd_step.py#L225
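For illustration, the bbox-extraction step described above (turning a class-agnostic activation map into a box) can be sketched as follows. This is not the repository's code: the function name `bbox_from_map` and the fixed 0.5 threshold are assumptions, and a real pipeline may keep only the largest connected component rather than all activated pixels.

```python
import numpy as np

def bbox_from_map(activation, threshold=0.5):
    """Binarize a class-agnostic activation map and return the tightest
    box (x1, y1, x2, y2) around the above-threshold pixels."""
    lo, hi = activation.min(), activation.max()
    amap = (activation - lo) / (hi - lo + 1e-8)  # normalize to [0, 1]
    ys, xs = np.nonzero(amap >= threshold)
    if xs.size == 0:                 # nothing activated:
        h, w = amap.shape            # fall back to the whole image
        return 0, 0, w - 1, h - 1
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Toy map with a bright 3x3 blob spanning rows 2..4, cols 3..5.
toy = np.zeros((8, 8))
toy[2:5, 3:6] = 1.0
print(bbox_from_map(toy))  # (3, 2, 5, 4)
```

In practice the map would first be resized to the input image resolution before extracting the box.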
By the way, feel free to star our project!
Thanks for your reply! I would like to confirm the following with you. For the second question, I need to manually add the Top-1 and Top-5 Loc code to the source code, right? And for the third question, I need to follow PSOL to modify the source code and re-train in order to reproduce the results, right?

I really appreciate your work and hope you can open your re-training code in this repository. I have starred your project!
Exactly.
For convenience, I will try to clean up and upload my code in the coming weeks!
Ok, thank you very much for your reply!
Hello, I just cleaned up and uploaded the code to train a regressor using the extracted bboxes. Please check it out and feel free to ask any questions!
Ok, I'll download the code and take a look. Thank you for your patient and careful reply! Your work is really great! If I have new questions in the future, I hope to communicate with you through this platform. Thank you!
> Hello, I just cleaned up and uploaded the code to train a regressor using the extracted bboxes. Please check it out and feel free to ask any questions!
Hello, I would like to know the configuration of your WSOL and WSSS experiments, as well as the number of training epochs for WSOL on the CUB-200-2011 dataset and WSSS on VOC2012.
Hello, the default configurations can be found in the xxx.yaml files or in the arguments in the code.
Sorry, after running the three .py steps for WSSS according to the README, the background cue I get looks like this: I can't see the outline of the target at all; it is almost completely black. I followed the README and only changed the dataset path and the model name, so I don't know what went wrong. 😭
Please help me!😭

Have you loaded the pretrained MoCo or DetCo weights?
When running the experiments, you can observe the generated activation maps in experiments/images/.
├── experiments/
|   ├── checkpoints
|   └── images
> Hello, the default configurations can be found in the xxx.yaml files or in the arguments in the code.
Hello, I see that the loss is still decreasing after ten epochs. Should I increase the number of training epochs? As for the background cue being all black, I may need some more time to experiment.
I just ran the code and there is no problem in my experiments. What are your configurations? Are they the same as in this repository?
Please ensure the batch size is larger than 32.
> Btw, the simplest way to get Top-1 Loc and Top-5 Loc is to directly combine the class-agnostic bboxes estimated by CCAM with the classification results from a classifier. For the Top-1/Top-5 Loc code, you can refer to my other repository: https://github.com/Sierkinhane/ORNet/blob/59e8ef5e461a5b00ca8c94d9fa0f5e58df193774/train_2nd_step.py#L225
Hello, I'm confused about how to calculate Top-1 Loc and Top-5 Loc on top of class-agnostic maps, since the class label is unknown. Could you explain the classifier in more detail? For example, is the classifier built directly on the class-agnostic maps? And does the classifier need to be supervised?
Hi, just combine the bbox with the classification result. We provide details in the method section.
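To make the combination concrete, here is a minimal sketch of how Top-k Loc is conventionally scored: a sample counts as correct iff the ground-truth label is among the classifier's top-k predictions and the (class-agnostic) predicted box overlaps the ground-truth box with IoU ≥ 0.5. This is not the repository's or ORNet's code; `topk_loc`, `box_iou`, and the toy data are illustrative assumptions.

```python
import numpy as np

def box_iou(a, b):
    """IoU of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def topk_loc(pred_boxes, gt_boxes, logits, labels, k=1, iou_thr=0.5):
    """Top-k Loc: correct iff the ground-truth class is in the classifier's
    top-k predictions AND IoU(pred_box, gt_box) >= iou_thr."""
    hits = 0
    for box, gt, logit, label in zip(pred_boxes, gt_boxes, logits, labels):
        topk = np.argsort(logit)[::-1][:k]   # indices of the k largest logits
        if label in topk and box_iou(box, gt) >= iou_thr:
            hits += 1
    return hits / len(labels)

# Toy example with 2 samples: sample 0 is classified and localized correctly;
# sample 1 is classified correctly but its box misses the ground truth.
preds  = [(0, 0, 10, 10), (0, 0, 10, 10)]
gts    = [(0, 0, 10, 10), (20, 20, 30, 30)]
logits = [np.array([0.1, 0.9]), np.array([0.8, 0.2])]
labels = [1, 0]
print(topk_loc(preds, gts, logits, labels, k=1))  # 0.5
```

The classifier is trained separately with ordinary supervised image labels; the class-agnostic box from CCAM supplies only the localization half of the metric.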