Generalized Simultaneous Perturbation Stochastic Approximation with Reduced Estimator Bias

12/20/2022
by Shalabh Bhatnagar et al.

We present in this paper a family of generalized simultaneous perturbation stochastic approximation (G-SPSA) estimators that estimate the gradient of the objective from noisy function measurements, where both the number of function measurements and the form of the gradient estimator are guided by the desired estimator bias. In particular, estimators that use more function measurements are seen to achieve lower bias. We provide a convergence analysis of the generalized SPSA algorithm and point to possible future directions.
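For context, here is a minimal sketch of the classical two-measurement SPSA gradient estimator on which the generalized family builds, not the paper's specific G-SPSA construction. It assumes NumPy; the function name spsa_gradient and the quadratic test objective are illustrative choices. A single random simultaneous perturbation of all coordinates yields a gradient estimate from just two noisy function evaluations.

import numpy as np

def spsa_gradient(f, theta, c=0.1, rng=None):
    """Estimate grad f(theta) from two noisy function measurements.

    Illustrative sketch of classical two-measurement SPSA (not the
    paper's G-SPSA estimator).

    f     : objective returning a (possibly noisy) scalar
    theta : current parameter vector
    c     : perturbation size
    """
    rng = np.random.default_rng() if rng is None else rng
    # Rademacher (+/-1) simultaneous perturbation of every coordinate
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    y_plus = f(theta + c * delta)
    y_minus = f(theta - c * delta)
    # One finite difference, divided elementwise by the perturbation
    return (y_plus - y_minus) / (2.0 * c * delta)

# Example usage on a noisy quadratic; the true gradient at theta is 2*theta.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda x: np.sum(x ** 2) + 0.01 * rng.standard_normal()
    theta = np.array([1.0, -2.0, 0.5])
    print(spsa_gradient(f, theta, c=0.05, rng=rng))

Estimators in the paper's family that use more function measurements than the two above are the ones reported to attain lower bias.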



Related research

11/27/2015 - Gradient Estimation with Simultaneous Perturbation and Compressive Sensing
This paper aims at achieving a "good" estimator for the gradient of a fu...

05/29/2022 - Stochastic Zeroth Order Gradient and Hessian Estimators: Variance Reduction and Refined Bias Bounds
We study stochastic zeroth order gradient and Hessian estimators for rea...

07/24/2021 - Theoretical Study and Comparison of SPSA and RDSA Algorithms
Stochastic approximation (SA) algorithms are widely used in system optim...

05/14/2013 - Estimating or Propagating Gradients Through Stochastic Neurons
Stochastic neurons can be useful for a number of reasons in deep learnin...

06/17/2021 - Stochastic Bias-Reduced Gradient Methods
We develop a new primitive for stochastic optimization: a low-bias, low-...

08/18/2021 - Geometry-informed irreversible perturbations for accelerated convergence of Langevin dynamics
We introduce a novel geometry-informed irreversible perturbation that ac...

07/21/2023 - Improving Accuracy in Cell-Perturbation Experiments by Leveraging Auxiliary Information
Modern cell-perturbation experiments expose cells to panels of hundreds ...
