Stochastic Gradient Descent Works Really Well for Stress Minimization

08/24/2020
by Katharina Börsig, et al.

Stress minimization is among the best-studied force-directed graph layout methods because it reliably yields high-quality layouts. It thus comes as a surprise that a novel approach based on stochastic gradient descent (Zheng, Pawar and Goodman, TVCG 2019) is claimed to improve on state-of-the-art approaches based on majorization. We present experimental evidence that the new approach does not actually yield better layouts, but that it is still to be preferred because it is simpler and robust against poor initialization.
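
The compared methods minimize the usual stress function, stress(X) = Σ_{i<j} w_ij (‖X_i − X_j‖ − d_ij)² with weights w_ij = d_ij⁻², where d_ij is the graph-theoretic distance between nodes i and j. The SGD variant does so by repeatedly picking a node pair and moving both endpoints along their connecting line toward the target distance, with a per-pair step size capped at 1 and annealed over the iterations. The Python sketch below illustrates this update; the function name, the exponential annealing schedule, and the all-pairs shuffling are illustrative assumptions rather than the authors' reference implementation.

```python
import numpy as np

def sgd_stress_layout(d, n_iter=30, eps=0.1, rng=None):
    # Minimal SGD sketch for stress minimization (hypothetical helper, not the paper's code).
    # d: (n, n) symmetric matrix of target graph-theoretic distances, d[i, j] > 0 for i != j.
    # Returns an (n, 2) array of 2D node positions.
    rng = np.random.default_rng() if rng is None else rng
    n = d.shape[0]
    X = rng.random((n, 2))                      # random initial layout

    iu = np.triu_indices(n, k=1)
    w = 1.0 / d[iu] ** 2                        # weights w_ij = d_ij^(-2)
    eta_max = 1.0 / w.min()                     # first step moves the farthest pair fully
    eta_min = eps / w.max()                     # last step barely moves the closest pair
    lam = np.log(eta_max / eta_min) / max(n_iter - 1, 1)
    pairs = np.column_stack(iu)

    for t in range(n_iter):
        eta = eta_max * np.exp(-lam * t)        # exponentially annealed step size (assumed schedule)
        rng.shuffle(pairs)                      # visit all pairs in random order
        for i, j in pairs:
            mu = min(eta / d[i, j] ** 2, 1.0)   # capped per-pair step: mu = min(w_ij * eta, 1)
            delta = X[i] - X[j]
            dist = np.linalg.norm(delta)
            if dist < 1e-12:
                continue
            r = 0.5 * (dist - d[i, j]) * delta / dist
            X[i] -= mu * r                      # move both endpoints so their distance
            X[j] += mu * r                      # approaches the target d_ij
    return X
```

As a sanity check, d can be taken as the hop-count matrix of a small graph, e.g. d = np.abs(np.arange(6)[:, None] - np.arange(6)[None, :]).astype(float) for a path on six nodes, after which sgd_stress_layout(d) should place the nodes roughly along a line.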


