Hierarchical Features Matter: A Deep Exploration of Progressive Parameterization Method for Dataset Distillation
Abstract
Dataset distillation is an emerging dataset reduction method that condenses large-scale datasets while maintaining task accuracy. Current parameterization methods achieve enhanced performance under extremely high compression ratios by optimizing the synthetic dataset in an informative feature domain. However, they limit themselves to a fixed optimization space for distillation, neglecting the diverse guidance offered by different informative latent spaces. To overcome this limitation, we propose a novel parameterization method dubbed Hierarchical Parameterization Distillation (H-PD), which systematically explores hierarchical features within a given feature space (e.g., the layers of a pre-trained generative adversarial network). We verify the correctness of our insights by applying the hierarchical optimization strategy to a GAN-based parameterization method. In addition, we introduce a novel class-relevant feature distance metric that alleviates the computational burden of synthetic dataset evaluation while bridging the gap between synthetic and original datasets. Experimental results demonstrate that the proposed H-PD achieves significant performance improvements under various settings with equivalent time consumption, and even surpasses current diffusion-based generative distillation under the extreme compression ratios IPC=1 and IPC=10.
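The abstract names two ingredients: progressive optimization across a GAN's hierarchical latent spaces, and a class-relevant feature distance used as the matching objective. The following is a minimal PyTorch sketch of how such a loop could look; this page does not provide the paper's implementation, so the names (generator.synthesize_from, feature_extractor, latents_per_layer) and all hyperparameters are illustrative assumptions, not the authors' method.

import torch
import torch.nn.functional as F

def class_relevant_distance(syn_feats, real_feats, syn_labels, real_labels, num_classes):
    # Hypothetical class-relevant feature distance: match the per-class
    # mean features of the synthetic set to those of the original set.
    loss = 0.0
    for c in range(num_classes):
        mu_syn = syn_feats[syn_labels == c].mean(dim=0)
        mu_real = real_feats[real_labels == c].mean(dim=0)
        loss = loss + F.mse_loss(mu_syn, mu_real)
    return loss / num_classes

def hierarchical_distill(generator, feature_extractor, real_feats, real_labels,
                         latents_per_layer, syn_labels, num_classes,
                         steps_per_layer=500, lr=0.01):
    # Progressively optimize latent codes one GAN layer at a time,
    # moving from the deepest (most abstract) feature space to shallower ones.
    for depth, z in enumerate(latents_per_layer):
        z = z.detach().requires_grad_(True)
        opt = torch.optim.Adam([z], lr=lr)
        for _ in range(steps_per_layer):
            # synthesize_from is a hypothetical API that decodes latents
            # injected at the given layer depth of a pre-trained generator.
            syn_images = generator.synthesize_from(depth, z)
            syn_feats = feature_extractor(syn_images)
            loss = class_relevant_distance(syn_feats, real_feats,
                                           syn_labels, real_labels, num_classes)
            opt.zero_grad()
            loss.backward()
            opt.step()
        latents_per_layer[depth] = z.detach()
    return latents_per_layer

The point of the sketch is only the control flow implied by the abstract: each layer's latent space supplies different guidance, so the codes are refined stage by stage rather than within a single fixed optimization space.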
Related Material

[pdf] [supp] [arXiv] [bibtex]

@InProceedings{Zhong_2025_CVPR,
    author    = {Zhong, Xinhao and Fang, Hao and Chen, Bin and Gu, Xulin and Qiu, Meikang and Qi, Shuhan and Xia, Shu-Tao},
    title     = {Hierarchical Features Matter: A Deep Exploration of Progressive Parameterization Method for Dataset Distillation},
    booktitle = {Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR)},
    month     = {June},
    year      = {2025},
    pages     = {30462-30471}
}