Soft Label Pruning and Quantization for Large-Scale Dataset Distillation
Large-scale dataset distillation requires storing auxiliary soft labels that can be 30-40× (ImageNet-1K) or 200× (ImageNet-21K) larger than the condensed images, undermining the goal of dataset compression. We identify two fundamental issues necessitating su... ...