Continual learning enables the incremental training of machine learning ...
Hyperparameter optimization (HPO) and neural architecture search (NAS) a...
The goal of continual learning (CL) is to efficiently update a machine learning ...
We devise a coreset selection method based on the idea of gradient match...
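The gradient-matching idea mentioned above can be illustrated with a toy greedy selection loop: pick, one point at a time, the example whose gradient best covers the part of the full-data gradient not yet matched by the coreset. The sketch below is only an illustration of that general idea on an assumed least-squares model, not the method described in the snippet; the loss, the coreset budget, and all names are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data and model: linear regression with squared loss (illustrative assumption).
    n, d = 200, 5
    X = rng.normal(size=(n, d))
    y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)
    w = rng.normal(size=d)                          # current model parameters

    # Per-example gradients of 0.5 * (x.w - y)^2 with respect to w.
    per_example_grads = (X @ w - y)[:, None] * X    # shape (n, d)
    full_grad = per_example_grads.mean(axis=0)

    # Greedy gradient matching: repeatedly add the example whose gradient points
    # most strongly along the part of the full gradient the coreset still misses.
    coreset = []
    target = full_grad.copy()
    for _ in range(20):                             # coreset budget (assumed)
        scores = per_example_grads @ target
        scores[coreset] = -np.inf                   # never pick the same point twice
        coreset.append(int(np.argmax(scores)))
        target = full_grad - per_example_grads[coreset].mean(axis=0)

    approx_grad = per_example_grads[coreset].mean(axis=0)
    print("gradient matching error:", np.linalg.norm(full_grad - approx_grad))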
Standard first-order stochastic optimization algorithms base their updat...
Natural gradient descent, which preconditions a gradient descent update ...
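As a rough illustration of that preconditioning idea, the sketch below runs natural-gradient-style updates on an assumed logistic-regression problem, using the empirical Fisher (mean outer product of per-example gradients) plus a small damping term; the model, damping, and step size are illustrative assumptions, not details taken from the snippet above.

    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed logistic-regression problem.
    n, d = 500, 4
    X = rng.normal(size=(n, d))
    logits = X @ rng.normal(size=d)
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

    w = np.zeros(d)
    lr, damping = 0.5, 1e-3                      # assumed step size and damping

    for step in range(100):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))       # predicted probabilities
        per_example_grads = (p - y)[:, None] * X # per-example log-loss gradients
        grad = per_example_grads.mean(axis=0)

        # Empirical Fisher: mean outer product of per-example gradients,
        # damped for numerical stability.
        fisher = per_example_grads.T @ per_example_grads / n + damping * np.eye(d)

        # Natural-gradient step: precondition the gradient with the inverse Fisher.
        w -= lr * np.linalg.solve(fisher, grad)

    loss = np.mean(np.log1p(np.exp(-(2 * y - 1) * (X @ w))))
    print("final log loss:", loss)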
Because the choice and tuning of the optimizer affects the speed, and ul...
Stochastic noise on gradients is now a common feature in machine learnin...
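One way to see this gradient noise concretely is to compare mini-batch gradient estimates with the full-batch gradient on a toy problem: the smaller the batch, the noisier the estimate. Everything in the sketch below (the least-squares objective, batch sizes, and sample counts) is an illustrative assumption.

    import numpy as np

    rng = np.random.default_rng(2)

    # Assumed least-squares problem.
    n, d = 10_000, 10
    X = rng.normal(size=(n, d))
    y = X @ rng.normal(size=d) + rng.normal(size=n)
    w = np.zeros(d)

    def mean_grad(idx):
        # Mean gradient of 0.5 * (x.w - y)^2 over the rows indexed by idx.
        return X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)

    full = mean_grad(np.arange(n))
    for batch_size in (10, 100, 1000):
        # Spread of mini-batch gradient estimates around the full-batch gradient.
        errors = [np.linalg.norm(mean_grad(rng.choice(n, batch_size, replace=False)) - full)
                  for _ in range(200)]
        print(batch_size, float(np.mean(errors)))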
Early stopping is a widely used technique to prevent poor generalization...
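In its most common form, early stopping monitors a validation metric and halts training once it has not improved for a fixed number of epochs (the "patience"), then restores the best parameters seen so far. The sketch below shows that pattern on an assumed overparameterized regression task; the task, patience value, and learning rate are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(3)

    # Assumed overparameterized regression task: 50 features, 40 training points.
    X = rng.normal(size=(140, 50))
    y = X[:, :3].sum(axis=1) + rng.normal(size=140)
    Xtr, ytr, Xva, yva = X[:40], y[:40], X[40:], y[40:]

    w = np.zeros(50)
    lr, patience = 0.05, 10                      # assumed step size and patience
    best_val, best_w, bad_epochs = np.inf, w.copy(), 0

    for epoch in range(2000):
        # One full-batch gradient step on the training mean squared error.
        w -= lr * Xtr.T @ (Xtr @ w - ytr) / len(ytr)

        # Early stopping: keep the best validation loss seen so far and stop
        # once it has not improved for `patience` consecutive epochs.
        val = np.mean((Xva @ w - yva) ** 2)
        if val < best_val:
            best_val, best_w, bad_epochs = val, w.copy(), 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break

    w = best_w                                   # restore the best parameters
    print("stopped after epoch", epoch, "with validation MSE", best_val)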
Mini-batch stochastic gradient descent and variants thereof have become ...
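For reference, a bare-bones mini-batch SGD loop looks like the following; the toy objective, batch size, and learning rate are illustrative assumptions rather than anything specified in the snippet above.

    import numpy as np

    rng = np.random.default_rng(4)

    # Assumed linear-regression objective.
    n, d = 5_000, 8
    X = rng.normal(size=(n, d))
    y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

    w = np.zeros(d)
    lr, batch_size = 0.1, 64                     # assumed step size and batch size

    for epoch in range(5):
        perm = rng.permutation(n)                # reshuffle the data each epoch
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            # Gradient of the mean squared error on the current mini-batch only.
            g = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
            w -= lr * g                          # plain SGD update
        print(f"epoch {epoch}: full-data MSE {np.mean((X @ w - y) ** 2):.4f}")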