Condensing Graphs via One-Step Gradient Matching

06/15/2022
by   Wei Jin, et al.

Training deep learning models on large datasets takes considerable time and resources, so it is desirable to construct a small synthetic dataset on which models can still be trained effectively. Recent work has explored condensing image datasets through complex bi-level optimization. For instance, dataset condensation (DC) matches network gradients w.r.t. large-real data and small-synthetic data, where the network weights are optimized for multiple steps at each outer iteration. However, existing approaches have inherent limitations: (1) they are not directly applicable to graphs, where the data is discrete; and (2) the condensation process is computationally expensive due to the nested optimization. To bridge this gap, we investigate efficient dataset condensation tailored to graph datasets, where we model the discrete graph structure as a probabilistic model. We further propose a one-step gradient matching scheme, which performs gradient matching for only a single step without training the network weights. Our theoretical analysis shows that this strategy can generate synthetic graphs that lead to lower classification loss on real graphs. Extensive experiments on various graph datasets demonstrate the effectiveness and efficiency of the proposed method. In particular, we are able to reduce the dataset size by 90%, and our method is significantly faster than multi-step gradient matching (e.g., 15x faster on CIFAR10 for synthesizing 500 graphs).
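To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of one-step gradient matching with a probabilistic graph structure. It uses a one-layer linear "GNN" Z = A·X·W with an MSE loss so that the weight gradient is analytic, parameterizes the synthetic adjacency as a sigmoid over learnable edge logits (a relaxed Bernoulli model), and updates the synthetic features and logits to minimize the squared distance between real and synthetic gradients at a single, fixed weight initialization — the network weights are never trained. All names (`match_loss`, `grad_W`, sizes, and the finite-difference updates) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy "real" graph data (hypothetical sizes, for illustration only) ---
n, d, c = 20, 5, 2                        # real nodes, feature dim, classes
A_real = (rng.random((n, n)) < 0.2).astype(float)
A_real = np.maximum(A_real, A_real.T)     # symmetric adjacency
X_real = rng.normal(size=(n, d))
Y_real = np.eye(c)[rng.integers(0, c, size=n)]   # one-hot labels

# --- Learnable synthetic graph: features + Bernoulli edge logits ---
m = 4                                     # condensed graph size
X_syn = rng.normal(size=(m, d))
Omega = rng.normal(size=(m, m))           # edge logits; A_syn = sigmoid(Omega)
Y_syn = np.eye(c)[rng.integers(0, c, size=m)]

W = rng.normal(size=(d, c)) * 0.1         # fixed random weights (never trained)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def grad_W(A, X, Y):
    """Analytic dL/dW for the linear model Z = A @ X @ W with MSE loss."""
    H = A @ X
    return H.T @ (H @ W - Y) / len(Y)

def match_loss():
    """Squared distance between real and synthetic gradients (one step)."""
    A_syn = sigmoid((Omega + Omega.T) / 2)   # symmetric probabilistic edges
    return np.sum((grad_W(A_real, X_real, Y_real)
                   - grad_W(A_syn, X_syn, Y_syn)) ** 2)

def num_grad(f, P, eps=1e-5):
    """Finite-difference gradient of scalar f w.r.t. array P (sketch only;
    a real implementation would backpropagate analytically)."""
    G = np.zeros_like(P)
    for idx in np.ndindex(P.shape):
        old = P[idx]
        P[idx] = old + eps; hi = f()
        P[idx] = old - eps; lo = f()
        P[idx] = old
        G[idx] = (hi - lo) / (2 * eps)
    return G

loss_before = match_loss()
lr = 0.1
for _ in range(25):                        # update ONLY the synthetic graph
    X_syn -= lr * num_grad(match_loss, X_syn)
    Omega -= lr * num_grad(match_loss, Omega)
loss_after = match_loss()
print(loss_before, loss_after)             # matching loss should decrease
```

After condensation, a discrete graph could be recovered by thresholding or sampling edges from `sigmoid(Omega)`; the one-step design avoids the nested inner training loop that makes multi-step gradient matching expensive.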


