Minimax Efficient Finite-Difference Stochastic Gradient Estimators Using Black-Box Function Evaluations

07/08/2020
by   Henry Lam, et al.

We consider stochastic gradient estimation using noisy black-box function evaluations. A standard approach is the finite-difference method or one of its variants. While natural, to our knowledge it remains open whether its statistical accuracy is the best possible. This paper argues that it is, by showing that the central finite-difference scheme is a nearly minimax optimal zeroth-order gradient estimator, among both the class of linear estimators and the much larger class of all (nonlinear) estimators.
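As a minimal illustration of the central finite-difference scheme discussed in the abstract, the sketch below estimates each gradient coordinate by perturbing the input by ±h and averaging noisy evaluations. The names (f_noisy, estimate_gradient_cfd, h, n_reps) are illustrative assumptions, not taken from the paper; in practice the perturbation size h trades discretization bias against noise-driven variance.

```python
# Hedged sketch: central finite-difference stochastic gradient estimator
# from a noisy black-box oracle. Names and defaults are illustrative only.
import numpy as np

def estimate_gradient_cfd(f_noisy, x, h=1e-2, n_reps=10):
    """Estimate the gradient of E[f_noisy(x)] at x via central finite differences.

    Coordinate i is estimated by averaging n_reps replications of
    (f_noisy(x + h*e_i) - f_noisy(x - h*e_i)) / (2*h).
    """
    x = np.asarray(x, dtype=float)
    d = x.size
    grad = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        diffs = [(f_noisy(x + e) - f_noisy(x - e)) / (2.0 * h)
                 for _ in range(n_reps)]
        grad[i] = np.mean(diffs)
    return grad

# Example: noisy quadratic f(x) = ||x||^2 + noise, whose true gradient is 2x.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f_noisy = lambda x: float(np.sum(x**2) + 0.01 * rng.standard_normal())
    x0 = np.array([1.0, -2.0])
    print(estimate_gradient_cfd(f_noisy, x0, h=0.05, n_reps=50))
```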


Related research

- 05/19/2021, "Distributionally Constrained Black-Box Stochastic Gradient Estimation and Optimization": We consider stochastic gradient estimation using only black-box function...
- 05/15/2018, "On the Application of Danskin's Theorem to Derivative-Free Minimax Optimization": Motivated by Danskin's theorem, gradient-based methods have been applied...
- 05/06/2023, "The Fundamental Limits of Structure-Agnostic Functional Estimation": Many recent developments in causal inference, and functional estimation ...
- 07/15/2021, "FastSHAP: Real-Time Shapley Value Estimation": Shapley values are widely used to explain black-box models, but they are...
- 10/14/2022, "A Scalable Finite Difference Method for Deep Reinforcement Learning": Several low-bandwidth distributable black-box optimization algorithms ha...
- 07/26/2019, "Sequential Learning of Active Subspaces": In recent years, active subspace methods (ASMs) have become a popular me...
- 04/13/2023, "Sample Average Approximation for Black-Box VI": We present a novel approach for black-box VI that bypasses the difficult...
