Allies Teach Better than Enemies: Inverse Adversaries for Robust Knowledge Distillation
Adversarially robust knowledge distillation aims to compress a large-scale robust teacher model into a lightweight student counterpart while preserving adversarial robustness and natural performance. Previous methods have primarily focused on aligning knowledge (e.... ...
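The knowledge-alignment objective mentioned above is typically instantiated as a KL divergence between temperature-softened teacher and student outputs. The sketch below is a minimal, hedged illustration of that standard distillation loss (not this paper's specific method); the function names and the temperature value are assumptions for the example.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on softened logits, scaled by T^2
    as in standard knowledge distillation."""
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student soft predictions
    eps = 1e-12  # guard against log(0)
    return float(T * T * np.sum(p * (np.log(p + eps) - np.log(q + eps))))

# Identical logits yield zero divergence; mismatched logits a positive one.
print(round(kd_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]), 6))  # → 0.0
print(kd_loss([0.1, 0.5, -1.0], [2.0, 0.5, -1.0]) > 0.0)      # → True
```

In the robust-distillation setting, the same loss is evaluated on adversarially perturbed inputs so that the student inherits the teacher's robust predictions, not just its clean-data behavior.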