Dataset Condensation with Contrastive Signals

02/07/2022
by Saehyung Lee, et al.

Recent studies have demonstrated that gradient-matching-based dataset synthesis, or dataset condensation (DC), methods can achieve state-of-the-art performance when applied to data-efficient learning tasks. However, in this study, we prove that the existing DC methods can perform worse than random selection when task-irrelevant information forms a significant part of the training dataset. We attribute this to the absence of contrastive signals between classes, a consequence of the class-wise gradient matching strategy. To address this problem, we propose Dataset Condensation with Contrastive signals (DCC), which modifies the loss function so that DC methods can effectively capture the differences between classes. In addition, we analyze the new loss function in terms of training dynamics by tracking the kernel velocity. Furthermore, we introduce a bi-level warm-up strategy to stabilize the optimization. Our experimental results indicate that, while the existing methods are ineffective for fine-grained image classification tasks, the proposed method successfully generates informative synthetic datasets for the same tasks. Moreover, the proposed method outperforms the baselines even on benchmark datasets such as SVHN, CIFAR-10, and CIFAR-100. Finally, we demonstrate the high applicability of the proposed method by applying it to continual learning tasks.
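To make the class-wise versus contrast-aware distinction concrete, below is a minimal PyTorch sketch. It is not the authors' released code: the summed-gradient formulation of dcc_loss is one plausible reading of "capturing the differences between classes" based on the abstract, and the cosine gradient distance and helper names (flat_grad, class_grad, real_by_class, syn_by_class) are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

def flat_grad(loss, params, create_graph):
    # Flattened gradient of `loss` w.r.t. `params`. create_graph=True keeps
    # the graph so the matching loss can backpropagate into synthetic images.
    grads = torch.autograd.grad(loss, params, create_graph=create_graph)
    return torch.cat([g.reshape(-1) for g in grads])

def class_grad(model, params, x, c, create_graph):
    # Network gradient induced by examples of a single class c.
    y = torch.full((x.size(0),), c, dtype=torch.long)
    return flat_grad(F.cross_entropy(model(x), y), params, create_graph)

def grad_distance(g_syn, g_real):
    # Cosine distance between gradient vectors (one common DC choice).
    return 1.0 - F.cosine_similarity(g_syn, g_real, dim=0)

def dc_loss(model, real_by_class, syn_by_class):
    # Standard DC: each class is matched in isolation, so the objective
    # never sees how gradients of different classes relate to one another.
    params = [p for p in model.parameters() if p.requires_grad]
    return sum(
        grad_distance(class_grad(model, params, syn_by_class[c], c, True),
                      class_grad(model, params, real_by_class[c], c, False))
        for c in range(len(real_by_class)))

def dcc_loss(model, real_by_class, syn_by_class):
    # Contrast-aware variant (an assumption, in the spirit of DCC):
    # class-wise gradients are summed before a single matching step, so
    # the interplay between per-class gradients enters the objective.
    params = [p for p in model.parameters() if p.requires_grad]
    g_real = sum(class_grad(model, params, real_by_class[c], c, False)
                 for c in range(len(real_by_class)))
    g_syn = sum(class_grad(model, params, syn_by_class[c], c, True)
                for c in range(len(syn_by_class)))
    return grad_distance(g_syn, g_real)

A toy usage, with a deliberately simple model and random stand-in data:

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
real = [torch.randn(64, 3, 32, 32) for _ in range(10)]                     # real images per class
syn = [torch.randn(1, 3, 32, 32, requires_grad=True) for _ in range(10)]  # learnable synthetic set
dcc_loss(model, real, syn).backward()  # gradients now flow into the synthetic images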


Related research:

06/04/2023
Towards Robust Feature Learning with t-vFM Similarity for Continual Learning
Continual learning has been developed using standard supervised contrast...

08/14/2023
Contrastive Bi-Projector for Unsupervised Domain Adaption
This paper proposes a novel unsupervised domain adaption (UDA) method ba...

12/03/2021
Contrastive Continual Learning with Feature Propagation
Classical machine learners are designed only to tackle one task without ...

11/10/2022
Mitigating Forgetting in Online Continual Learning via Contrasting Semantically Distinct Augmentations
Online continual learning (OCL) aims to enable model learning from a non...

06/04/2021
Manifold-Aware Deep Clustering: Maximizing Angles between Embedding Vectors Based on Regular Simplex
This paper presents a new deep clustering (DC) method called manifold-aw...

03/20/2023
Constructing Bayesian Pseudo-Coresets using Contrastive Divergence
Bayesian Pseudo-Coreset (BPC) and Dataset Condensation are two parallel ...

07/29/2023
Continual Learning in Predictive Autoscaling
Predictive Autoscaling is used to forecast the workloads of servers and ...
