Distributed Black-Box Optimization via Error Correcting Codes

07/13/2019
by Burak Bartan, et al.

We introduce a novel distributed derivative-free optimization framework that is resilient to stragglers. The proposed method employs coded search directions at which the objective function is evaluated, followed by a decoding step that produces the next iterate. Our framework can be seen as an extension of evolution strategies and structured exploration methods, which utilize structured search directions. As an application, we consider black-box adversarial attacks on deep convolutional neural networks. Our numerical experiments demonstrate a significant improvement in computation time.
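The idea can be illustrated with a minimal sketch (not the paper's exact scheme): raw search directions are encoded with a redundant matrix, workers evaluate the black-box objective along the coded directions, and a least-squares decoding step recovers the directional derivatives from whichever subset of workers returns in time. The objective `f`, the matrix dimensions, and the step size below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Toy black-box objective (quadratic), assumed for illustration.
    return float(np.sum(x ** 2))

d, k, n = 5, 8, 12     # dimension, # raw directions, # workers (n > k gives redundancy)
sigma = 1e-6           # finite-difference smoothing radius

x = rng.standard_normal(d)         # current iterate
U = rng.standard_normal((k, d))    # raw search directions (rows)
A = rng.standard_normal((n, k))    # encoding matrix; coded directions V = A @ U
V = A @ U

# Each worker i evaluates the objective along its coded direction v_i.
y = np.array([(f(x + sigma * v) - f(x)) / sigma for v in V])

# Simulate stragglers: only a subset S of the n workers return in time.
S = rng.choice(n, size=k + 2, replace=False)

# Decoding: least-squares recovery of the k directional derivatives U @ grad f(x)
# from the surviving coded evaluations, then an evolution-strategies-style
# gradient estimate (E[U.T @ U] = k * I for standard normal rows).
g_dirs, *_ = np.linalg.lstsq(A[S], y[S], rcond=None)
grad_est = U.T @ g_dirs / k

x_next = x - 0.1 * grad_est  # gradient step on the decoded estimate
```

Because the decoding is a least-squares solve over any large-enough subset of rows of `A`, the iterate update does not need to wait for the slowest workers, which is the source of the straggler resilience.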

