Follow the Signs for Robust Stochastic Optimization
Stochastic noise on gradients is now a common feature in machine learning. It complicates the design of optimization algorithms, and its effect can be unintuitive: we show that in some settings, particularly those with a low signal-to-noise ratio, it can be helpful to discard all but the signs of stochastic gradient elements. In fact, we argue that three popular existing methods already approximate this very paradigm. We devise novel stochastic optimization algorithms that explicitly follow stochastic sign estimates while appropriately accounting for their uncertainty. These methods compare favorably to the state of the art on a number of benchmark problems.
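The abstract does not spell out the algorithms themselves, but the core idea of following only the signs of stochastic gradient elements can be illustrated with a minimal sign-SGD sketch. The test function, step size, and noise scale below are illustrative assumptions, not the paper's actual method, which additionally accounts for the uncertainty of the sign estimates.

```python
# Minimal sketch (not the paper's exact algorithm) of sign-based stochastic
# gradient descent on a noisy quadratic: each coordinate moves a fixed step
# in the direction of the stochastic gradient's sign, discarding magnitudes.
import numpy as np

rng = np.random.default_rng(0)

def noisy_gradient(x, noise_scale=5.0):
    """Stochastic gradient of f(x) = 0.5 * ||x||^2 with additive Gaussian noise
    (a low signal-to-noise regime when noise_scale dominates |x|)."""
    return x + noise_scale * rng.standard_normal(x.shape)

def sign_sgd(x0, lr=0.01, steps=2000):
    """Update each coordinate by a fixed step against the gradient's sign."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * np.sign(noisy_gradient(x))
    return x

x_final = sign_sgd(np.full(10, 10.0))
print("final distance to optimum:", np.linalg.norm(x_final))
```

Because the update magnitude is fixed, a single heavily corrupted gradient coordinate cannot produce an outsized step, which is one intuition for why sign-based updates can be robust at low signal-to-noise ratios.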