Polarity is all you need to learn and transfer faster

03/29/2023
by Qingyang Wang, et al.

Natural intelligences (NIs) thrive in a dynamic world: they learn quickly, sometimes from only a few samples. In contrast, artificial intelligences (AIs) typically require prohibitive amounts of training data and computational power. What difference in design principles between NIs and AIs could account for this discrepancy? Here, we propose an answer centered on weight polarity: developmental processes initialize NIs with advantageous polarity configurations, and as NIs grow and learn, synapse magnitudes are updated while polarities remain largely unchanged. We demonstrate, with simulations and image classification tasks, that if weight polarities are adequately set a priori, networks learn in less time and with less data. We also explicitly illustrate situations in which setting the weight polarities a priori is disadvantageous. Our work illustrates the value of weight polarity from the perspective of statistical and computational efficiency during learning.
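To make the mechanics concrete, here is a minimal sketch of the idea, assuming PyTorch; the class name PolarityFixedLinear and its internals are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch (assumed PyTorch; not the authors' code) of learning weight
# magnitudes while keeping each weight's polarity fixed from initialization.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PolarityFixedLinear(nn.Module):
    """Linear layer whose weight signs are frozen; only magnitudes train."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        init = torch.randn(out_features, in_features) / in_features ** 0.5
        # The polarity configuration is set once and stored as a buffer, so
        # gradient updates never change it. (Here it is random; the paper's
        # claim is that an advantageous a priori configuration speeds learning.)
        self.register_buffer("polarity", torch.sign(init))
        # Magnitudes are parameterized in log space so they stay positive;
        # they and the bias are the only trainable parameters.
        self.log_magnitude = nn.Parameter(init.abs().clamp_min(1e-6).log())
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weight = self.polarity * self.log_magnitude.exp()
        return F.linear(x, weight, self.bias)
```

Training such a layer with any standard optimizer updates only log_magnitude and bias; the sign pattern chosen at initialization persists, mirroring the abstract's description of magnitude updates under fixed polarities.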

Related research

09/30/2019 · Blessing of dimensionality at the edge
In this paper we present theory and algorithms enabling classes of Artif...

09/17/2022 · Learning to Weight Samples for Dynamic Early-exiting Networks
Early exiting is an effective paradigm for improving the inference effic...

05/18/2020 · Data Representation for Deep Learning with Priori Knowledge of Symmetric Wireless Tasks
Deep neural networks (DNNs) have been applied to address various wireles...

02/07/2022 · Towards an Analytical Definition of Sufficient Data
We show that, for each of five datasets of increasing complexity, certai...

07/04/2023 · Deconstructing Data Reconstruction: Multiclass, Weight Decay and General Losses
Memorization of training data is an active research area, yet our unders...

02/08/2017 · Exploiting Domain Knowledge via Grouped Weight Sharing with Application to Text Categorization
A fundamental advantage of neural models for NLP is their ability to lea...

10/07/2022 · Images as Weight Matrices: Sequential Image Generation Through Synaptic Learning Rules
Work on fast weight programmers has demonstrated the effectiveness of ke...
