Clustering of solutions in the symmetric binary perceptron

11/15/2019
by Carlo Baldassi, et al.

The geometrical features of the (non-convex) loss landscape of neural network models are crucial in ensuring successful optimization and, most importantly, the capability to generalize well. While minimizers' flatness consistently correlates with good generalization, there has been little rigorous work exploring the conditions under which such minimizers exist, even in toy models. Here we consider a simple neural network model, the symmetric perceptron with binary weights. Phrasing the learning problem as a constraint satisfaction problem, the analog of a flat minimizer becomes a large and dense cluster of solutions, while the narrowest minimizers correspond to isolated solutions. We take the first steps toward a rigorous proof of the existence of a dense cluster in certain regimes of the parameters, by computing first- and second-moment upper bounds for the existence of pairs of arbitrarily close solutions. Moreover, we present a non-rigorous derivation of the same bounds for sets of y solutions at fixed pairwise distances.
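As background, here is a minimal sketch of the symmetric binary perceptron viewed as a constraint satisfaction problem, together with the basic first-moment computation the abstract alludes to. This assumes the standard formulation with i.i.d. standard Gaussian patterns and a symmetric margin; the notation (α, κ, p(κ), α_c) is illustrative and the paper's exact conventions, as well as the pair/second-moment computations, may differ.

\[
  w \in \{-1,+1\}^{N}, \qquad
  \left|\tfrac{1}{\sqrt{N}}\textstyle\sum_{i=1}^{N} w_i \xi^{\mu}_i\right| \le \kappa
  \quad \text{for all } \mu = 1,\dots,M=\alpha N .
\]

The number of solutions and its annealed average are
\[
  Z \;=\; \sum_{w \in \{-1,+1\}^{N}} \prod_{\mu=1}^{M}
      \mathbb{1}\!\left\{ \left|\tfrac{1}{\sqrt{N}}\, w\cdot\xi^{\mu}\right| \le \kappa \right\},
  \qquad
  \mathbb{E}[Z] \;=\; 2^{N}\, p(\kappa)^{\alpha N},
  \quad
  p(\kappa) = \Pr\bigl(|g|\le\kappa\bigr) = \operatorname{erf}\!\bigl(\kappa/\sqrt{2}\bigr),\; g\sim\mathcal{N}(0,1),
\]
since for any fixed w the rescaled pre-activation is a standard Gaussian and the M constraints are independent. By Markov's inequality, solutions cease to exist with high probability once \(\alpha > \alpha_c(\kappa) = \log 2 / \log\bigl(1/p(\kappa)\bigr)\). The bounds discussed in the abstract follow the same logic applied to pairs (and, non-rigorously, to larger sets) of solutions constrained to lie at a prescribed mutual distance.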

Related research

11/04/2021  Binary perceptron: efficient algorithms can find solutions in a rare well-connected cluster
10/26/2017  On the role of synaptic stochasticity in training low-precision neural networks
11/26/2021  Equivalence between algorithmic instability and transition to replica symmetry breaking in perceptron learning systems
12/04/2020  Non-Asymptotic Analysis of Excess Risk via Empirical Risk Landscape
11/18/2015  Local entropy as a measure for sampling solutions in Constraint Satisfaction Problems
03/29/2022  Algorithms and Barriers in the Symmetric Binary Perceptron Model
09/18/2015  Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses
