Dataset Distillation using Parameter Pruning

09/29/2022
by   Guang Li, et al.

The acquisition of advanced models relies on large datasets in many fields, which makes storing datasets and training models expensive. As a solution, dataset distillation synthesizes a small dataset such that models trained on it achieve performance on par with models trained on the original large dataset. The recently proposed dataset distillation method based on matching network parameters has proved effective for several datasets. However, a few parameters in the distillation process are difficult to match, which harms distillation performance. Based on this observation, this paper proposes a new method that solves the problem using parameter pruning. By pruning difficult-to-match parameters during the distillation process, the proposed method synthesizes more robust distilled datasets and improves distillation performance. Experimental results on three datasets show that the proposed method outperforms other state-of-the-art (SOTA) dataset distillation methods.
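The core idea described above can be sketched as follows. This is an illustrative NumPy toy, not the paper's actual algorithm: the function name, the pruning rule (drop the fixed fraction of parameters with the largest student–teacher mismatch), and the normalized squared-distance loss are all assumptions made for the sake of the example.

```python
import numpy as np

def pruned_matching_loss(student_params, teacher_params, prune_ratio=0.1):
    """Hypothetical sketch of parameter matching with pruning.

    Parameters that are hardest to match (largest absolute difference
    between student and teacher) are excluded before computing the
    matching loss, so they cannot dominate the distillation signal.
    """
    diffs = np.abs(student_params - teacher_params)
    k = int(len(diffs) * prune_ratio)
    mask = np.ones(len(diffs), dtype=bool)
    if k > 0:
        # Indices of the k parameters with the largest mismatch: prune them.
        hardest = np.argsort(diffs)[-k:]
        mask[hardest] = False
    # Normalized squared distance over the remaining (matchable) parameters.
    return np.sum(diffs[mask] ** 2) / np.sum(teacher_params[mask] ** 2)
```

In a real parameter-matching setup, the synthetic images would then be updated by gradient descent on this loss; pruning the outlier parameters keeps a handful of hard-to-match weights from dominating that gradient.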
