The Role of Permutation Invariance in Linear Mode Connectivity of Neural Networks

10/12/2021
by Rahim Entezari, et al.

In this paper, we conjecture that if the permutation invariance of neural networks is taken into account, SGD solutions will likely have no barrier along the linear interpolation between them. Although it is a bold conjecture, we show how extensive empirical attempts fall short of refuting it. We further provide a preliminary theoretical result to support our conjecture. Our conjecture has implications for the lottery ticket hypothesis, distributed training, and ensemble methods.
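To make the two quantities in the abstract concrete, the following is a minimal NumPy sketch, not the authors' code: all function names and the one-hidden-layer architecture are illustrative assumptions. It computes the loss barrier along the linear path between two solutions, and shows the hidden-unit permutation that leaves a network's function unchanged.

```python
import numpy as np

# Sketch of the two objects in the conjecture (hypothetical helpers, not
# from the paper). The barrier along the linear path between solutions
# theta_a, theta_b is
#   max_t  L((1-t)*theta_a + t*theta_b) - [(1-t)*L(theta_a) + t*L(theta_b)],
# and permuting hidden units (rows of W1 with the matching columns of W2)
# changes the parameters but not the function the network computes.

def mlp_loss(params, X, y):
    """MSE loss of a one-hidden-layer ReLU network."""
    W1, b1, W2, b2 = params            # W1: (H, D), W2: (O, H)
    h = np.maximum(X @ W1.T + b1, 0.0)
    pred = h @ W2.T + b2
    return float(np.mean((pred - y) ** 2))

def permute_hidden(params, perm):
    """Apply a hidden-unit permutation; the network's function is unchanged."""
    W1, b1, W2, b2 = params
    return (W1[perm], b1[perm], W2[:, perm], b2)

def barrier(params_a, params_b, X, y, steps=25):
    """Height of the loss barrier on the linear path between two solutions."""
    loss_a = mlp_loss(params_a, X, y)
    loss_b = mlp_loss(params_b, X, y)
    gaps = []
    for t in np.linspace(0.0, 1.0, steps):
        mid = tuple((1 - t) * a + t * b
                    for a, b in zip(params_a, params_b))
        gaps.append(mlp_loss(mid, X, y) - ((1 - t) * loss_a + t * loss_b))
    return max(gaps)
```

Under the conjecture, searching over permutations, e.g. by matching hidden units and then evaluating `barrier(params_a, permute_hidden(params_b, perm), X, y)`, should drive the barrier close to zero; the matching procedure itself is not shown here.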


Related research

11/15/2022
REPAIR: REnormalizing Permuted Activations for Interpolation Repair
In this paper we look into the conjecture of Entezari et al. (2021) which...

09/08/2018
A conjecture on permutation trinomials over finite fields of characteristic two
In this paper, by analyzing the quadratic factors of an 11-th degree pol...

05/26/2023
Investigating how ReLU-networks encode symmetries
Many data symmetries can be described in terms of group equivariance and...

09/15/2022
Random initialisations performing above chance and how to find them
Neural networks trained with stochastic gradient descent (SGD) starting ...

10/13/2022
Wasserstein Barycenter-based Model Fusion and Linear Mode Connectivity of Neural Networks
Based on the concepts of Wasserstein barycenter (WB) and Gromov-Wasserst...

09/25/2022
Burstein's permutation conjecture, Hong and Li's inversion sequence conjecture, and restricted Eulerian distributions
Recently, Hong and Li launched a systematic study of length-four pattern...

05/18/2015
Fractally-organized Connectionist Networks: Conjectures and Preliminary Results
A strict interpretation of connectionism mandates complex networks of si...
