Zeroth-Order Hard-Thresholding: Gradient Error vs. Expansivity

10/11/2022
by William de Vazelhes, et al.

ℓ_0-constrained optimization is prevalent in machine learning, particularly for high-dimensional problems, because it is a fundamental approach to achieving sparse learning. Hard-thresholding gradient descent is a dominant technique for solving this problem. However, first-order gradients of the objective function may be unavailable or expensive to compute in many real-world problems, where zeroth-order (ZO) gradients can serve as a good surrogate. Unfortunately, whether ZO gradients can work with the hard-thresholding operator remains an open problem. To solve this puzzle, in this paper we focus on ℓ_0-constrained black-box stochastic optimization problems and propose a new stochastic zeroth-order gradient hard-thresholding (SZOHT) algorithm with a general ZO gradient estimator powered by a novel random support sampling scheme. We provide a convergence analysis of SZOHT under standard assumptions. Importantly, we reveal a conflict between the error of ZO estimators and the expansivity of the hard-thresholding operator, and we derive a theoretical minimum for the number of random directions used in the ZO gradient estimate. In addition, we find that the query complexity of SZOHT is independent of, or only weakly dependent on, the dimensionality under different settings. Finally, we illustrate the utility of our method on a portfolio optimization problem as well as on black-box adversarial attacks.
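To make the interplay concrete, here is a minimal Python sketch of an SZOHT-style iteration: a two-point zeroth-order gradient estimate built from q random directions, each drawn on a random support of size s, followed by the hard-thresholding operator that keeps the k largest-magnitude coordinates. The function and parameter names (zo_gradient, szoht, s, q, mu, lr) and the estimator's d/q scaling are illustrative assumptions, not the paper's exact formulation; the paper's analysis pins down the precise constants and the minimal admissible number of random directions.

```python
import numpy as np


def hard_threshold(x, k):
    """Hard-thresholding operator: keep the k largest-magnitude entries of x."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out


def zo_gradient(f, x, s, q, mu):
    """Two-point ZO gradient estimate from q directions, each supported on
    s randomly chosen coordinates (random support sampling)."""
    d = x.size
    g = np.zeros(d)
    for _ in range(q):
        support = np.random.choice(d, size=s, replace=False)
        u = np.zeros(d)
        u[support] = np.random.randn(s)
        u /= np.linalg.norm(u)
        g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
    # The d/q scaling is an illustrative choice (unbiased for the
    # linearized objective); the paper derives its exact constants.
    return (d / q) * g


def szoht(f, x0, k, s=20, q=10, mu=1e-4, lr=0.1, iters=200):
    """SZOHT-style loop: ZO gradient step followed by hard thresholding."""
    x = hard_threshold(x0, k)
    for _ in range(iters):
        x = hard_threshold(x - lr * zo_gradient(f, x, s, q, mu), k)
    return x


# Toy usage: recover a k-sparse minimizer of a black-box quadratic.
rng = np.random.default_rng(0)
d, k = 100, 5
x_star = np.zeros(d)
x_star[rng.choice(d, k, replace=False)] = rng.standard_normal(k)
f = lambda x: 0.5 * np.sum((x - x_star) ** 2)
x_hat = szoht(f, np.zeros(d), k)
```

In this sketch each direction costs two function evaluations, so one iteration costs 2q queries; the paper's tension between estimator error and thresholding expansivity shows up here as the need for q to be large enough that the noise in the step is not amplified by the expansive hard-thresholding projection.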
