Cost-Sensitive Approach to Batch Size Adaptation for Gradient Descent

12/09/2017
by Matteo Pirotta et al.

In this paper, we propose a novel approach to automatically determining the batch size in stochastic gradient descent methods. The choice of the batch size induces a trade-off between the accuracy of the gradient estimate and the per-update cost in terms of samples. We propose to determine the batch size by optimizing the ratio between a lower bound on a linear or quadratic Taylor approximation of the expected improvement and the number of samples used to estimate the gradient. The performance of the proposed approach is empirically compared with that of related methods on popular classification tasks. This work was presented at the NIPS Workshop on Optimizing the Optimizers, Barcelona, Spain, 2016.
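
The sketch below illustrates the improvement-per-sample criterion described in the abstract. It is a minimal illustration, not the paper's derivation: it assumes a linear Taylor model of the expected improvement and a Hoeffding-style concentration bound eps(N) = c / sqrt(N) on the gradient-estimation error, and all function names and constants are hypothetical.

    import numpy as np

    def improvement_lower_bound(grad_norm_est, n, step_size, c):
        # Linear Taylor model: a step of size `step_size` along the estimated
        # gradient improves the objective by about step_size * ||grad||^2.
        # Assuming the estimation error is at most eps(n) = c / sqrt(n)
        # (a Hoeffding-style bound, an assumption for illustration), the true
        # gradient norm is at least ||grad_est|| - eps(n), giving this
        # pessimistic bound on the expected improvement.
        eps = c / np.sqrt(n)
        return step_size * max(grad_norm_est - eps, 0.0) ** 2

    def best_batch_size(grad_norm_est, step_size, c, n_max=10_000):
        # Cost-sensitive criterion: maximize the ratio between the improvement
        # lower bound and the number of samples spent estimating the gradient.
        candidates = np.arange(1, n_max + 1)
        ratios = [improvement_lower_bound(grad_norm_est, n, step_size, c) / n
                  for n in candidates]
        return int(candidates[int(np.argmax(ratios))])

    # Example: with c = 2 and an estimated gradient norm of 0.5, the grid
    # search recovers the closed-form maximizer of (g - c/sqrt(N))^2 / N,
    # namely N* = (2c / g)^2 = 64.
    print(best_batch_size(grad_norm_est=0.5, step_size=0.01, c=2.0))

Under this toy model the selected batch size shrinks when the estimated gradient is large (early in training, cheap noisy updates suffice) and grows as the gradient vanishes near convergence, which matches the trade-off described in the abstract.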

Related research

07/03/2020
Variance reduction for Riemannian non-convex optimization with batch size adaptation
Variance reduction techniques are popular in accelerating gradient desce...

12/02/2019
Risk Bounds for Low Cost Bipartite Ranking
Bipartite ranking is an important supervised learning problem; however, ...

10/18/2019
Improving the convergence of SGD through adaptive batch sizes
Mini-batch stochastic gradient descent (SGD) approximates the gradient o...

02/05/2016
Reducing Runtime by Recycling Samples
Contrary to the situation with stochastic gradient descent, we argue tha...

12/05/2022
Distributed Stochastic Gradient Descent with Cost-Sensitive and Strategic Agents
This study considers a federated learning setup where cost-sensitive and...

06/25/2018
Stochastic natural gradient descent draws posterior samples in function space
Natural gradient descent (NGD) minimises the cost function on a Riemanni...

10/13/2021
Seismic Tomography with Random Batch Gradient Reconstruction
Seismic tomography solves high-dimensional optimization problems to imag...
