Transformations between deep neural networks

07/10/2020
by Tom Bertalan, et al.

We propose to test, and when possible establish, an equivalence between two different artificial neural networks by attempting to construct a data-driven transformation between them, using manifold-learning techniques. In particular, we employ diffusion maps with a Mahalanobis-like metric. If the construction succeeds, the two networks can be thought of as belonging to the same equivalence class. We first discuss transformation functions between only the outputs of the two networks; we then also consider transformations that take into account outputs (activations) of a number of internal neurons from each network. In general, Whitney's theorem dictates the number of measurements from one of the networks required to reconstruct each and every feature of the second network. The construction of the transformation function relies on a consistent, intrinsic representation of the network input space. We illustrate our algorithm by matching neural network pairs trained to learn (a) observations of scalar functions; (b) observations of two-dimensional vector fields; and (c) representations of images of a moving three-dimensional object (a rotating horse). The construction of such equivalence classes across different network instantiations clearly relates to transfer learning. We also expect that it will be valuable in establishing equivalence between different Machine Learning-based models of the same phenomenon observed through different instruments and by different research groups.
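For readers who want a concrete starting point: the core of the construction is a diffusion-map embedding computed with a Mahalanobis-like metric, which supplies the consistent, intrinsic representation of the shared input space onto which both networks' observations are matched. The sketch below is a minimal illustration only, not the authors' implementation; the k-nearest-neighbour covariance estimates, the median-distance bandwidth heuristic, and the names local_covariances and mahalanobis_diffusion_map are assumptions made for the example.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.linalg import eigh

def local_covariances(X, k=10, reg=1e-6):
    """Estimate a local covariance around each sample from its k nearest
    neighbours (a stand-in for the local cloud estimates that a
    Mahalanobis-like metric requires)."""
    D = squareform(pdist(X))
    covs = np.empty((X.shape[0], X.shape[1], X.shape[1]))
    for i, row in enumerate(D):
        nbrs = np.argsort(row)[1:k + 1]          # skip the point itself
        diffs = X[nbrs] - X[nbrs].mean(axis=0)
        covs[i] = diffs.T @ diffs / k + reg * np.eye(X.shape[1])
    return covs

def mahalanobis_diffusion_map(X, n_coords=2, eps=None, k=10):
    """Diffusion map using the symmetrised Mahalanobis-like distance
    d_ij^2 = 0.5 * (x_i - x_j)^T (C_i^{-1} + C_j^{-1}) (x_i - x_j)."""
    n = X.shape[0]
    C_inv = np.array([np.linalg.inv(C) for C in local_covariances(X, k)])

    # Pairwise squared Mahalanobis-like distances.
    D2 = np.empty((n, n))
    for i in range(n):
        diff = X - X[i]                          # (n, d)
        M = 0.5 * (C_inv[i][None] + C_inv)       # (n, d, d)
        D2[i] = np.einsum('nd,nde,ne->n', diff, M, diff)

    if eps is None:
        eps = np.median(D2[D2 > 0])              # simple bandwidth heuristic
    W = np.exp(-D2 / eps)

    # Density normalisation (alpha = 1), then the symmetric conjugate of the
    # Markov matrix, whose eigenvectors give the diffusion coordinates.
    d1 = W.sum(axis=1)
    A = W / np.outer(d1, d1)
    d2 = A.sum(axis=1)
    S = A / np.sqrt(np.outer(d2, d2))
    vals, vecs = eigh(S)
    order = np.argsort(vals)[::-1]
    vecs = vecs[:, order] / np.sqrt(d2)[:, None]

    # Drop the trivial constant eigenvector; the leading nontrivial ones serve
    # as the intrinsic coordinates shared by the two networks' inputs.
    return vecs[:, 1:n_coords + 1]
```

Given such intrinsic coordinates for a common set of inputs, one network's outputs (or selected internal activations) can then be regressed onto the other's; the Whitney-type argument mentioned in the abstract bounds how many generic measurements of one network suffice to reconstruct every feature of the second.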

Related research

03/01/2018 - Knowledge Transfer with Jacobian Matching
Classical distillation methods transfer representations from a "teacher"...

12/17/2021 - A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes
In a previous work, we proposed a geometric framework to study a deep ne...

07/26/2018 - Aggregated Learning: A Vector Quantization Approach to Learning with Neural Networks
We establish an equivalence between information bottleneck (IB) learning...

10/23/2020 - On the Number of Linear Functions Composing Deep Neural Network: Towards a Refined Definition of Neural Networks Complexity
The classical approach to measure the expressive power of deep neural ne...

05/08/2023 - Functional Equivalence and Path Connectivity of Reducible Hyperbolic Tangent Networks
Understanding the learning process of artificial neural networks require...

08/21/2016 - Neural Networks and Chaos: Construction, Evaluation of Chaotic Networks, and Prediction of Chaos with Multilayer Feedforward Networks
Many research works deal with chaotic neural networks for various fields...

03/12/2019 - Paradox in Deep Neural Networks: Similar yet Different while Different yet Similar
Machine learning is advancing towards a data-science approach, implying ...
