Stochastic Non-convex Ordinal Embedding with Stabilized Barzilai-Borwein Step Size

11/17/2017
by Ke Ma, et al.

Learning a representation from relative similarity comparisons, often called ordinal embedding, has gained increasing attention in recent years. Most existing methods are batch methods built on convex optimization, e.g., the projected gradient descent method. They are generally time-consuming, however, because each update typically requires a singular value decomposition (SVD), which becomes expensive when the data size is large. To overcome this challenge, we propose a stochastic algorithm called SVRG-SBB with the following features: (a) it is SVD-free by dropping convexity and scales well thanks to the use of a stochastic algorithm, namely stochastic variance reduced gradient (SVRG); and (b) it chooses the step size adaptively via a new stabilized Barzilai-Borwein (SBB) rule, since the original BB rule, designed for convex problems, may fail on the stochastic non-convex optimization problem considered here. Moreover, we show that the proposed algorithm converges to a stationary point at a rate O(1/T) in our setting, where T is the total number of iterations. Extensive simulations and real-world experiments demonstrate the effectiveness of the proposed algorithm in comparison with state-of-the-art methods; in particular, it attains much lower computational cost with comparable prediction performance.
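Since the abstract describes SVRG-SBB only at a high level, the following is a minimal NumPy sketch of the two ingredients it names: the SVRG variance-reduced gradient update and an epoch-level Barzilai-Borwein step size stabilized by an extra term in the denominator. The function name svrg_sbb, the signature (grad_i, eps, eta0), the 1/m scaling of the step, and the toy least-squares problem are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def svrg_sbb(grad_i, x0, n, n_epochs=20, m=None, eps=1e-4, eta0=0.01, seed=0):
    """SVRG with a stabilized Barzilai-Borwein (SBB) epoch step size.

    grad_i(x, i) -- gradient of the i-th component function at x
    n            -- number of component functions (data points)
    eps          -- SBB stabilization constant (assumed parameterization)
    """
    rng = np.random.default_rng(seed)
    m = n if m is None else m                   # inner-loop length
    full_grad = lambda x: sum(grad_i(x, i) for i in range(n)) / n

    x_snap = np.array(x0, dtype=float)          # epoch snapshot
    g_snap = full_grad(x_snap)
    eta = eta0                                  # step size for the first epoch
    for _ in range(n_epochs):
        x = x_snap.copy()
        for _ in range(m):
            i = int(rng.integers(n))
            # SVRG variance-reduced gradient estimate
            v = grad_i(x, i) - grad_i(x_snap, i) + g_snap
            x = x - eta * v
        x_prev, g_prev = x_snap, g_snap
        x_snap, g_snap = x, full_grad(x)        # take a new snapshot
        # Stabilized BB step size: the absolute value and the eps-term keep
        # the denominator positive even under the negative curvature a
        # non-convex objective can produce, where the plain BB rule may fail.
        dx, dg = x_snap - x_prev, g_snap - g_prev
        num = float(np.vdot(dx, dx))
        if num > 0.0:
            eta = num / (m * (abs(float(np.vdot(dx, dg))) + eps * num))
    return x_snap

# Toy usage: least-squares components f_i(x) = 0.5 * (a_i @ x - b_i)^2
A = np.random.default_rng(1).standard_normal((50, 5))
b = A @ np.arange(5.0)
x_hat = svrg_sbb(lambda x, i: (A[i] @ x - b[i]) * A[i], np.zeros(5), n=50)
```

The absolute value and the eps-weighted term in the denominator are what distinguish the stabilized rule from the classical BB step, which can return negative or arbitrarily large step sizes when the curvature along the displacement is negative or near zero, as can happen in this non-convex setting.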


Related research

12/01/2019 · Fast Stochastic Ordinal Embedding with Variance Reduction and Adaptive Step Size
Learning representation from relative similarity comparisons, often call...

07/03/2020 · Variance reduction for Riemannian non-convex optimization with batch size adaptation
Variance reduction techniques are popular in accelerating gradient desce...

04/12/2022 · An Adaptive Time Stepping Scheme for Rate-Independent Systems with Non-Convex Energy
We investigate a local incremental stationary scheme for the numerical s...

06/20/2019 · Accelerating Mini-batch SARAH by Step Size Rules
StochAstic Recursive grAdient algoritHm (SARAH), originally proposed for...

08/20/2018 · Universal Stagewise Learning for Non-Convex Problems with Convergence on Averaged Solutions
Although stochastic gradient descent (SGD) method and its variants (e.g., s...

05/24/2022 · Weak Convergence of Approximate reflection coupling and its Application to Non-convex Optimization
In this paper, we propose a weak approximation of the reflection couplin...

10/10/2019 · One Sample Stochastic Frank-Wolfe
One of the beauties of the projected gradient descent method lies in its...
