Dataset Distillation with Infinitely Wide Convolutional Networks

07/27/2021
by Timothy Nguyen, et al.

The effectiveness of machine learning algorithms arises from being able to extract useful features from large amounts of data. As model and dataset sizes increase, dataset distillation methods that compress large datasets into significantly smaller yet highly performant ones will become valuable in terms of training efficiency and useful feature extraction. To that end, we apply a novel distributed kernel-based meta-learning framework to achieve state-of-the-art results for dataset distillation using infinitely wide convolutional neural networks. For instance, using only 10 datapoints (0.02% of the original dataset), we obtain over 64% test accuracy on the CIFAR-10 classification task, a dramatic improvement over the previous best test accuracy of 40%. Our state-of-the-art results extend across many other settings for MNIST, Fashion-MNIST, CIFAR-10, CIFAR-100, and SVHN. Furthermore, we perform some preliminary analyses of our distilled datasets to shed light on how they differ from naturally occurring data.
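
The core mechanism described above, meta-learning a small support set by backpropagating through a kernel ridge regression fit, can be sketched in a few lines of JAX. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: it substitutes a plain RBF kernel for the paper's infinite-width convolutional neural tangent kernel, and the names (rbf_kernel, kip_loss, distill_step) are hypothetical.

```python
# Minimal kernel-based dataset distillation sketch (KIP-style).
# Assumption: an RBF kernel stands in for the infinite-width CNN kernel.
import jax
import jax.numpy as jnp

def rbf_kernel(x, y, gamma=0.1):
    # Pairwise RBF kernel between two sets of flattened examples.
    sq = jnp.sum(x**2, axis=1)[:, None] + jnp.sum(y**2, axis=1)[None, :] - 2.0 * x @ y.T
    return jnp.exp(-gamma * sq)

def kip_loss(support_x, support_y, target_x, target_y, reg=1e-6):
    # Fit kernel ridge regression on the learned support (distilled) set,
    # then measure its prediction error on real target data.
    k_ss = rbf_kernel(support_x, support_x)
    k_ts = rbf_kernel(target_x, support_x)
    alpha = jnp.linalg.solve(k_ss + reg * jnp.eye(k_ss.shape[0]), support_y)
    preds = k_ts @ alpha
    return jnp.mean((preds - target_y) ** 2)

@jax.jit
def distill_step(support_x, support_y, target_x, target_y, lr=0.01):
    # One gradient step on the distilled images and labels themselves.
    loss, (gx, gy) = jax.value_and_grad(kip_loss, argnums=(0, 1))(
        support_x, support_y, target_x, target_y)
    return support_x - lr * gx, support_y - lr * gy, loss

# Toy usage: 10 distilled points, random data standing in for a real batch.
key = jax.random.PRNGKey(0)
kx, ky = jax.random.split(key)
support_x = jax.random.normal(kx, (10, 32 * 32 * 3))
support_y = jnp.eye(10)  # one learnable label vector per class
target_x = jax.random.normal(ky, (256, 32 * 32 * 3))
target_y = jnp.eye(10)[jax.random.randint(key, (256,), 0, 10)]

for _ in range(100):
    support_x, support_y, loss = distill_step(support_x, support_y, target_x, target_y)
```

To approximate the paper's setting, the RBF kernel would be replaced by the NTK of an infinitely wide convolutional architecture, which can be computed with the neural_tangents library; the outer optimization over the support set stays the same.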


