
Adaptive Second Order Coresets for Data-efficient Machine Learning

by Omead Pooladzandi, et al.

Training machine learning models on massive datasets incurs substantial computational costs. To alleviate such costs, there has been a sustained effort to develop data-efficient training methods that carefully select subsets of the training examples that generalize on par with the full training data. However, existing methods are limited in providing theoretical guarantees for the quality of the models trained on the extracted subsets, and may perform poorly in practice. We propose AdaCore, a method that leverages the geometry of the data to extract subsets of the training examples for efficient machine learning. The key idea behind our method is to dynamically approximate the curvature of the loss function via an exponentially-averaged estimate of the Hessian, and to select weighted subsets (coresets) that provide a close approximation of the full gradient preconditioned with the Hessian. We prove rigorous convergence guarantees for various first- and second-order methods applied to the subsets chosen by AdaCore. Our extensive experiments show that AdaCore extracts higher-quality coresets than baselines and speeds up training of convex and non-convex machine learning models, such as logistic regression and neural networks, by over 2.9x compared to training on the full data and by 4.5x compared to random subsets.
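The core selection step described above can be illustrated with a small sketch. This is not the authors' implementation: AdaCore formulates selection as a submodular cover problem, whereas the snippet below uses a simpler greedy matching-pursuit heuristic, a diagonal Hessian stand-in for the exponentially-averaged curvature estimate, and hypothetical names (`adacore_like_coreset`, `hess_diag_ema`) chosen for illustration. It only shows the general idea of picking a weighted subset whose Hessian-preconditioned gradients approximate the full preconditioned gradient.

```python
import numpy as np

def adacore_like_coreset(grads, hess_diag_ema, k):
    """Greedy sketch of curvature-aware coreset selection.

    grads:         (n, d) per-example gradients
    hess_diag_ema: (d,)   exponentially-averaged diagonal Hessian estimate
    k:             number of coreset elements to select

    Returns indices and weights such that the weighted sum of selected
    preconditioned gradients approximates the full preconditioned gradient.
    Illustrative matching-pursuit heuristic, not the paper's algorithm.
    """
    # Precondition each per-example gradient with the curvature estimate.
    pre = grads / (hess_diag_ema + 1e-8)          # shape (n, d)
    target = pre.sum(axis=0)                      # full preconditioned gradient
    residual = target.copy()
    selected, weights = [], []
    norms = np.einsum('ij,ij->i', pre, pre) + 1e-12
    for _ in range(k):
        # Pick the example whose preconditioned gradient best explains
        # the remaining residual (normalized correlation).
        scores = np.abs(pre @ residual) / np.sqrt(norms)
        idx = int(np.argmax(scores))
        # Least-squares weight for the chosen example.
        w = float(pre[idx] @ residual / norms[idx])
        selected.append(idx)
        weights.append(w)
        residual = residual - w * pre[idx]
    return selected, weights

# Toy usage: random gradients and a positive diagonal curvature estimate.
rng = np.random.default_rng(0)
grads = rng.normal(size=(50, 10))
hess_diag_ema = np.abs(rng.normal(size=10)) + 0.5
idx, w = adacore_like_coreset(grads, hess_diag_ema, k=5)
```

Each greedy step strictly reduces the residual norm (matching-pursuit property), so even a small weighted subset can track the full preconditioned gradient far better than an unweighted random sample of the same size.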


Data Sketching for Faster Training of Machine Learning Models

Many machine learning problems reduce to the problem of minimizing an ex...

Coresets for Robust Training of Neural Networks against Noisy Labels

Modern neural networks have the capacity to overfit noisy labels frequen...

Data Summarization via Bilevel Optimization

The increasing availability of massive data sets poses a series of chall...

SP2: A Second Order Stochastic Polyak Method

Recently the "SP" (Stochastic Polyak step size) method has emerged as a ...

Hessian Eigenspectra of More Realistic Nonlinear Models

Given an optimization problem, the Hessian matrix and its eigenspectrum ...

Metadata Archaeology: Unearthing Data Subsets by Leveraging Training Dynamics

Modern machine learning research relies on relatively few carefully cura...

Finding High-Value Training Data Subset through Differentiable Convex Programming

Finding valuable training data points for deep neural networks has been ...