Choosing the Sample with Lowest Loss makes SGD Robust

01/10/2020
by Vatsal Shah, et al.

The presence of outliers can significantly skew the parameters of machine learning models trained via stochastic gradient descent (SGD). In this paper we propose a simple variant of vanilla SGD: at each step, first draw a set of k samples, then from these choose the one with the smallest current loss, and perform an SGD-like update with that chosen sample. Vanilla SGD corresponds to k = 1, i.e. no choice; k >= 2 yields a new algorithm, which, however, effectively minimizes a non-convex surrogate loss. Our main contribution is a theoretical analysis of the robustness properties of this idea for ML problems that are sums of convex losses; these results are backed up with linear regression and small-scale neural network experiments.
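To make the update concrete, here is a minimal sketch of the min-loss selection step for least-squares linear regression in NumPy. This is an illustrative reconstruction from the abstract, not the authors' code; the function name min_k_sgd, the learning rate, the epoch count, and the choice of squared loss are all assumptions.

```python
import numpy as np

def min_k_sgd(X, y, k=2, lr=0.01, epochs=50, seed=0):
    """Sketch of min-k loss SGD for least-squares regression.

    At each step, draw k candidate samples uniformly at random,
    keep the one with the smallest loss under the current iterate,
    and take a vanilla SGD step on that sample alone.
    k = 1 recovers plain SGD. (Hypothetical implementation.)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs * n):
        # Draw k candidate indices without replacement.
        idx = rng.choice(n, size=k, replace=False)
        # Per-sample squared losses under the current iterate w.
        losses = 0.5 * (X[idx] @ w - y[idx]) ** 2
        # Keep the candidate with the smallest current loss.
        i = idx[np.argmin(losses)]
        # SGD-like update using only the chosen sample.
        grad = (X[i] @ w - y[i]) * X[i]
        w -= lr * grad
    return w
```

With k >= 2, samples that currently incur large loss (typically the outliers) are rarely selected, which is the intuition behind the robustness claim; the price is that the selection step makes the algorithm effectively minimize a non-convex surrogate of the original sum of convex losses.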

Related research

06/13/2022 · On the Convergence to a Global Solution of Shuffling-Type Gradient Algorithms
Stochastic gradient descent (SGD) algorithm is the method of choice in m...

06/20/2019 · Data Cleansing for Models Trained with SGD
Data cleansing is a typical approach used to improve the accuracy of mac...

10/13/2022 · From Gradient Flow on Population Loss to Learning with Stochastic Gradient Descent
Stochastic Gradient Descent (SGD) has been the method of choice for lear...

11/20/2020 · Convergence Analysis of Homotopy-SGD for non-convex optimization
First-order stochastic methods for solving large-scale non-convex optimi...

07/01/2020 · Online Robust Regression via SGD on the l1 loss
We consider the robust linear regression problem in the online setting w...

02/21/2020 · Stein Self-Repulsive Dynamics: Benefits From Past Samples
We propose a new Stein self-repulsive dynamics for obtaining diversified...

07/16/2018 · Evolving Differentiable Gene Regulatory Networks
Over the past twenty years, artificial Gene Regulatory Networks (GRNs) h...
