wang21jun

Results: 52 comments by wang21jun

Facing the same issue. Could you guys please fix this?

Please refer to "FaceX-Zoo/blob/main/training_mode/conventional_training/train_amp.py" for mixed precision training.
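In case that file moves, here is a minimal sketch of PyTorch automatic mixed precision, the technique train_amp.py is built around; the model, optimizer, and data below are toy stand-ins, not the FaceX-Zoo configuration, and it assumes a CUDA device is available:

```python
import torch
from torch.cuda.amp import autocast, GradScaler

# Toy stand-ins; in FaceX-Zoo the backbone, head, and loader come from the configs.
model = torch.nn.Linear(512, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = torch.nn.CrossEntropyLoss()
scaler = GradScaler()

for _ in range(10):
    features = torch.randn(32, 512, device='cuda')
    labels = torch.randint(0, 10, (32,), device='cuda')
    optimizer.zero_grad()
    with autocast():                      # forward pass runs in mixed precision
        loss = criterion(model(features), labels)
    scaler.scale(loss).backward()         # scale the loss to avoid fp16 underflow
    scaler.step(optimizer)                # unscale gradients, then take the step
    scaler.update()                       # adapt the loss scale for the next iter
```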

Yes, it's on the schedule, but it won't be ready soon. You're very welcome to work with us to finish this feature together.

In order to stay consistent with the original setting of SwinTransformer, we just resize the input image from 112*112 to 224*224.
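A minimal sketch of that resize step, assuming a torchvision transform pipeline (the exact pipeline in the repo may differ):

```python
from torchvision import transforms

# Upsample the 112x112 aligned face crop to the 224x224 input SwinTransformer expects.
swin_transform = transforms.Compose([
    transforms.Resize((224, 224)),   # bilinear interpolation by default
    transforms.ToTensor(),
])
# usage: tensor = swin_transform(pil_face_112)
```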

If your dataset has only 8 classes, SST is not a good choice; just use conventional training.

Please refer to ArcFace (https://arxiv.org/pdf/1801.07698v1.pdf), Sec. 3.2.2.
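For context, a minimal sketch of the additive angular margin that the ArcFace paper proposes (s and m follow the paper's notation); this is illustrative only and not the FaceX-Zoo head implementation, and whether Sec. 3.2.2 is the relevant part depends on the original question:

```python
import torch
import torch.nn.functional as F

def arcface_logits(embeddings, weight, labels, s=64.0, m=0.5):
    """Additive angular margin: cos(theta + m) on the target class, then scale by s."""
    cosine = F.linear(F.normalize(embeddings), F.normalize(weight))   # cos(theta)
    theta = torch.acos(cosine.clamp(-1 + 1e-7, 1 - 1e-7))
    target_logit = torch.cos(theta + m)                               # margin on target class only
    one_hot = F.one_hot(labels, num_classes=weight.size(0)).bool()
    logits = torch.where(one_hot, target_logit, cosine)
    return s * logits                                                 # feed into cross-entropy
```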

Yes, you should align all faces to 112*112. Please align faces with the 'norm_crop' function from 'https://github.com/deepinsight/insightface/blob/master/recognition/common/face_align.py'
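A rough usage sketch, assuming norm_crop keeps its current signature (image, 5-point landmarks, output size); the image path and landmark values below are placeholders, and the landmarks would normally come from a face detector such as RetinaFace:

```python
import cv2
import numpy as np
# face_align.py copied from the insightface repo linked above
from face_align import norm_crop

img = cv2.imread('face.jpg')   # original, unaligned image (placeholder path)
# 5-point landmarks: left eye, right eye, nose, left/right mouth corner (placeholder values)
landmark = np.array([[38.3, 51.7], [73.5, 51.5], [56.0, 71.7],
                     [41.5, 92.4], [70.7, 92.2]], dtype=np.float32)
aligned = norm_crop(img, landmark, image_size=112)   # 112x112 aligned crop
cv2.imwrite('face_aligned_112.jpg', aligned)
```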

Hi, the queue is neither "the bigger the better" nor "the smaller the better"; 16384 is a reasonable compromise. If the queue is too large (e.g., covering all IDs), it ends up holding a lot of features that are too old, because each iteration only updates one batch's worth of features in the queue; if the queue is too small, there are too few negative examples.
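To illustrate why stale entries accumulate, here is a minimal sketch of a MoCo/SST-style FIFO feature queue in which each iteration overwrites only one batch worth of slots; the sizes and names are illustrative, not the FaceX-Zoo code:

```python
import torch
import torch.nn.functional as F

class FeatureQueue:
    """FIFO queue of L2-normalized features used as negatives."""

    def __init__(self, dim=512, size=16384):
        self.queue = F.normalize(torch.randn(size, dim), dim=1)  # random init
        self.ptr = 0

    @torch.no_grad()
    def enqueue(self, feats):
        """Overwrite the oldest entries with the current batch.

        Only feats.size(0) slots change per iteration, so a very large queue
        mostly holds features computed by a much older encoder (stale), while
        a very small queue provides too few negatives.
        """
        n = feats.size(0)
        idx = torch.arange(self.ptr, self.ptr + n) % self.queue.size(0)
        self.queue[idx] = F.normalize(feats, dim=1)
        self.ptr = (self.ptr + n) % self.queue.size(0)
```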

msceleb_deepglint_train_file.txt and MS-Celeb-1M-v1c-r-shallow_train_list.txt are the same file, just renamed. For better accuracy, you can tune the hyperparameters further.

Yes, you can, but a model trained this way should be treated as a pretrained model; it is best to finetune it on a training set with supervision (labels) before using it.