
Convexification of Neural Graph

01/09/2018
by Han Xiao, et al.
Apple, Inc.

Traditionally, most complex intelligence architectures are highly non-convex and therefore cannot be handled well by convex optimization. In this paper, however, we decompose complex structures into three types of nodes: operators, algorithms, and functions. Then, by iteratively propagating from node to node along edges, we prove that "a neural graph without triangles is nearly convex in each variable when the other variables are fixed." In fact, the non-convexity stems from triangles and functions, which can be transformed into convex components with our proposed convexification inequality. In conclusion, we give a general depiction of the landscape of the neural-graph objective and propose a methodology to convexify neural graphs.
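The abstract's central claim is block-wise convexity: with all other variables held fixed, the objective is (nearly) convex in each remaining variable. The sketch below, which is not taken from the paper, illustrates this for the simplest case of a two-layer linear network with squared loss, where the objective is exactly convex in each weight matrix when the other is fixed; the paper's operator/algorithm/function decomposition and its convexification inequality are not reproduced here, and all names and shapes are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy data and a fixed second-layer weight matrix (assumed shapes).
x = rng.normal(size=(8, 32))        # 32 samples, input dimension 8
y = rng.normal(size=(4, 32))        # target dimension 4
W2_fixed = rng.normal(size=(4, 16))

def loss(W1, W2):
    """Squared loss of the two-layer linear map W2 @ W1 @ x."""
    return 0.5 * np.sum((W2 @ W1 @ x - y) ** 2)

def convex_along_segment(f, A, B, n_points=11, tol=1e-9):
    """Check f(t*A + (1-t)*B) <= t*f(A) + (1-t)*f(B) on a grid of t in [0, 1]."""
    for t in np.linspace(0.0, 1.0, n_points):
        lhs = f(t * A + (1 - t) * B)
        rhs = t * f(A) + (1 - t) * f(B)
        if lhs > rhs + tol:
            return False
    return True

# With W2 fixed, W1 -> loss(W1, W2_fixed) is an affine map composed with a
# convex function, hence convex; the segment test should never fail.
ok = all(
    convex_along_segment(lambda W1: loss(W1, W2_fixed),
                         rng.normal(size=(16, 8)),
                         rng.normal(size=(16, 8)))
    for _ in range(100)
)
print("convex in W1 with W2 fixed:", ok)   # expected: True

The same numerical check applied to the joint variable (W1, W2) would fail on some segments, which mirrors the abstract's point that the full objective is non-convex even when each block is (nearly) convex on its own.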

