TorchSSL

Bug in the code?

Open hzhz2020 opened this issue 2 years ago • 0 comments

Hi, thank you for this amazing repo. In the FlexMatch implementation, you update `selected_label` on this line: `selected_label[x_ulb_idx[select == 1]] = pseudo_lb[select == 1]` https://github.com/TorchSSL/TorchSSL/blob/f26e1d42967cec7f7c8a00c2e7ff9219d8ab7c92/models/flexmatch/flexmatch.py#L181

where the indicator variable `select` is returned from this line: `select = max_probs.ge(p_cutoff).long()` https://github.com/TorchSSL/TorchSSL/blob/f26e1d42967cec7f7c8a00c2e7ff9219d8ab7c92/models/flexmatch/flexmatch_utils.py#L46C9-L46C47

But this is not the flexible threshold from the paper; it is the fixed FixMatch threshold. Why do you use the flexible threshold only to calculate the loss, but not to select instances for updating `selected_label`?
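To make the distinction concrete, here is a minimal sketch (not the repo's actual code) contrasting the two masks. The fixed mask mirrors the quoted line; the flexible variant assumes per-class scaling factors `beta` in the spirit of FlexMatch's class-wise thresholds (the `beta` values below are made up for illustration):

```python
import torch

# Toy unlabeled-batch probabilities over 2 classes.
probs = torch.tensor([[0.96, 0.04],
                      [0.80, 0.20],
                      [0.10, 0.90]])
max_probs, pseudo_lb = probs.max(dim=-1)

p_cutoff = 0.95

# Fixed FixMatch-style mask: one cutoff for every class
# (this is what the quoted line computes).
select_fixed = max_probs.ge(p_cutoff).long()

# FlexMatch-style flexible mask: the cutoff is scaled per class by a
# learning-status factor beta(c) (hypothetical values here), so classes
# that are learned more slowly get a lower effective threshold.
beta = torch.tensor([0.8, 1.0])
select_flex = max_probs.ge(p_cutoff * beta[pseudo_lb]).long()

print(select_fixed.tolist())  # [1, 0, 0]
print(select_flex.tolist())   # [1, 1, 0]
```

With the fixed mask, only the first sample passes; with the flexible per-class cutoff, the second sample (class 0, threshold 0.95 * 0.8 = 0.76) is also selected, which is exactly the behavior the issue asks about for the `selected_label` update.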

hzhz2020 · Sep 01 '23 19:09