
Learning performance in inverse Ising problems with sparse teacher couplings

12/25/2019
by Alia Abbara et al.

We investigate the learning performance of the pseudolikelihood maximization method for inverse Ising problems. In the teacher-student scenario, under the assumption that the teacher's couplings are sparse and the student does not know the graphical structure, the learning curve and order parameters are assessed in the typical case using the replica and cavity methods from statistical mechanics. Our formulation also applies to a certain class of cost functions with locality; the standard likelihood does not belong to this class. The derived analytical formulas indicate that perfect inference of the presence or absence of the teacher's couplings is possible in the thermodynamic limit, in which the number of spins N tends to infinity while the dataset size M is kept proportional to N, as long as α = M/N > 2. The formulas also show that the estimated couplings corresponding to those actually present in the teacher tend to be overestimated in absolute value, revealing an estimation bias. These results are expected to be exact in the thermodynamic limit on locally tree-like networks, such as regular random or Erdős–Rényi graphs. Numerical simulations fully support the theoretical predictions. Additional biases in the estimators on loopy graphs are also discussed.
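As a concrete illustration of the teacher-student setup described above, the following is a minimal NumPy sketch, not the authors' code: it draws a sparse Erdős–Rényi teacher, samples spin configurations by Gibbs sampling, and estimates each spin's couplings by pseudolikelihood maximization with plain gradient ascent. The graph size, coupling strength, sampler settings, learning rate, and pruning threshold are all illustrative assumptions, not values taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only: N spins, M samples (alpha = M / N).
N, M = 20, 400
p_edge = 3.0 / N                      # average degree ~ 3 (assumption)
J0 = 0.5                              # teacher coupling magnitude (assumption)

# Sparse symmetric teacher couplings on an Erdos-Renyi graph.
upper = np.triu(rng.random((N, N)) < p_edge, k=1)
signs = rng.choice([-1.0, 1.0], size=(N, N))
J_teacher = J0 * signs * upper
J_teacher = J_teacher + J_teacher.T   # symmetric, zero diagonal

def gibbs_sample(J, n_samples, burn=500, thin=20):
    """Draw configurations s in {-1,+1}^N by single-site Gibbs updates."""
    s = rng.choice([-1.0, 1.0], size=J.shape[0])
    samples = []
    for t in range(burn + n_samples * thin):
        for i in range(J.shape[0]):
            h = J[i] @ s                        # local field (J[i, i] = 0)
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h))
            s[i] = 1.0 if rng.random() < p_up else -1.0
        if t >= burn and (t - burn) % thin == 0:
            samples.append(s.copy())
    return np.array(samples)

S = gibbs_sample(J_teacher, M)                  # shape (M, N)

def plm_fit(S, lr=0.2, n_iter=1000):
    """Pseudolikelihood maximization: one logistic-type fit per spin."""
    M, N = S.shape
    J = np.zeros((N, N))
    for i in range(N):
        s_i = S[:, i]
        X = S.copy()
        X[:, i] = 0.0                           # exclude the self-coupling
        w = np.zeros(N)
        for _ in range(n_iter):
            h = X @ w
            # gradient of (1/M) sum_m log sigma(2 s_i h) with respect to w
            g = (2.0 * s_i / (1.0 + np.exp(2.0 * s_i * h))) @ X / M
            w += lr * g
        J[i] = w
    return 0.5 * (J + J.T)                      # average the two estimates per pair

J_hat = plm_fit(S)

# Structure recovery: prune small estimates (threshold is an assumption).
recovered = np.abs(J_hat) > 0.1
print("structure recovered:", np.array_equal(recovered, J_teacher != 0))
print("mean |J_hat| on true edges:", np.abs(J_hat[J_teacher != 0]).mean(), "vs J0 =", J0)

With enough samples the pruned support typically matches the teacher graph, and the fitted magnitudes on true edges tend to come out slightly above J0, in line with the overestimation bias the abstract describes.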

