Landscape of Sparse Linear Network: A Brief Investigation

09/16/2020
by Dachao Lin, et al.

Network pruning, which produces sparse networks, has a long history and practical significance in modern applications. A major concern in neural network training is that the non-convexity of the associated loss functions may produce a bad optimization landscape. We focus on analyzing sparse linear networks generated by a weight-pruning strategy. Without unrealistic assumptions, we prove the following statements about the squared loss objective of sparse linear neural networks: 1) every local minimum is a global minimum for scalar output under any sparse structure, and for a non-intersecting sparse first layer with dense subsequent layers under whitened data; 2) sparse linear networks can have sub-optimal local minima even when only the first layer is sparse, or when the target dimension is three.
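To make the setting concrete, here is a minimal sketch (not the paper's code) of the objective being analyzed: a sparse linear network can be modeled by applying fixed binary masks, representing the pruned structure, to the weight matrices of a deep linear network and evaluating the squared loss. All dimensions, names, and the two-layer depth below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
# input dim d_x, hidden dim d_h, output dim d_y, n samples.
d_x, d_h, d_y, n = 5, 4, 1, 100  # d_y = 1 corresponds to the scalar-output case

X = rng.standard_normal((d_x, n))
Y = rng.standard_normal((d_y, n))

# Fixed binary masks encode the sparse structure obtained from pruning.
M1 = rng.integers(0, 2, size=(d_h, d_x))
M2 = rng.integers(0, 2, size=(d_y, d_h))

def squared_loss(W1, W2):
    """Squared loss of a two-layer sparse linear network
    f(X) = (M2 * W2)(M1 * W1) X, where * is elementwise masking."""
    residual = (M2 * W2) @ (M1 * W1) @ X - Y
    return 0.5 * np.sum(residual ** 2)

W1 = rng.standard_normal((d_h, d_x))
W2 = rng.standard_normal((d_y, d_h))
print(squared_loss(W1, W2))
```

The landscape questions in the abstract concern the local minima of `squared_loss` over (W1, W2) for a fixed mask pair (M1, M2).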


Related research

- The Global Landscape of Neural Networks: An Overview (07/02/2020)
- Weight Initialization without Local Minima in Deep Nonlinear Neural Networks (06/13/2018)
- Adding One Neuron Can Eliminate All Bad Local Minima (05/22/2018)
- The Difficulty of Training Sparse Neural Networks (06/25/2019)
- All Local Minima are Global for Two-Layer ReLU Neural Networks: The Hidden Convex Optimization Landscape (06/10/2020)
- Towards a Better Global Loss Landscape of GANs (11/10/2020)
- Are ResNets Provably Better than Linear Predictors? (04/18/2018)
