self-knowledge-distillation topic

Repositories tagged with self-knowledge-distillation:

cls_KD
203 stars · 16 forks

'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024)

Self-KD-Lib
102 stars · 11 forks

[ECCV 2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition, plus PyTorch implementations of several self-knowledge distillation and data augmentation methods
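MixSKD builds on mixup-style augmentation. As a hedged illustration of the underlying ingredient (the standard mixup recipe, not the MixSKD method itself), a single mixup step in plain Python might look like:

```python
import random

def mixup_pair(x1, x2, y1, y2, alpha=0.2):
    """Generic mixup: convex-combine two inputs and their one-hot labels.

    alpha parameterizes a Beta(alpha, alpha) distribution. This is the
    standard mixup recipe (Zhang et al.), not the exact MixSKD formulation.
    """
    # Sample the mixing coefficient lambda ~ Beta(alpha, alpha).
    lam = random.betavariate(alpha, alpha)
    # Interpolate inputs and labels with the same coefficient.
    mixed_x = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    mixed_y = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]
    return mixed_x, mixed_y, lam
```

MixSKD's self-distillation then enforces consistency between the network's predictions on the mixed image and the mixture of its predictions on the originals; the library wraps this around the augmentation step above.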

WKD
48 stars · 3 forks · 48 watchers


The official implementation of [NeurIPS 2024] Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation (https://arxiv.org/abs/2412.08139)
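For context, the KL-divergence baseline that WKD positions itself against is the classic Hinton-style distillation loss. A minimal plain-Python sketch of that baseline (not the paper's Wasserstein formulation; the temperature T and its value here are the usual assumed hyperparameters):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_kl_loss(student_logits, teacher_logits, T=4.0):
    """Hinton-style distillation loss: T^2 * KL(teacher || student)
    over temperature-softened class probabilities."""
    p = softmax(teacher_logits, T)  # teacher (target) distribution
    q = softmax(student_logits, T)  # student distribution
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return T * T * kl
```

The WKD paper's point is that replacing this KL term with a Wasserstein distance between the teacher and student distributions can perform as well or better; see the repository for that formulation.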