Representation learnt by SGD and Adaptive learning rules – Conditions that Vary Sparsity and Selectivity in Neural Network

01/25/2022
by Jinhyun Park, et al.

From the perspective of the human brain, continual learning makes it possible to perform various tasks without mutual interference. An effective way to reduce mutual interference can be found in the sparsity and selectivity of neurons. According to Aljundi et al. and Hadsell et al., imposing sparsity at the representational level is advantageous for continual learning because sparse neuronal activations encourage less overlap between parameters, resulting in less interference. Similarly, highly selective neural networks are likely to induce less interference, since particular responses in neurons reduce the chance of overlap with other parameters. Considering that the human brain performs continual learning over the lifespan, finding conditions under which sparsity and selectivity naturally arise may provide insight into how the brain functions. This paper investigates various conditions that naturally increase sparsity and selectivity in a neural network. Different optimizers are tested with Hoyer's sparsity metric and the CCMAS selectivity metric on the MNIST classification task. To our knowledge, the natural occurrence of sparsity and selectivity under various conditions has not been investigated in either neuroscience or machine learning. We find that particular conditions, such as using a large learning rate and lowering the batch size, increase sparsity and selectivity. In addition to the relationship between these conditions, sparsity, and selectivity, the following are discussed based on empirical analysis: (1) the relationship between sparsity and selectivity and (2) the relationship between test accuracy, sparsity, and selectivity.
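The abstract refers to Hoyer's sparsity metric and the CCMAS selectivity metric. The snippet below is a minimal sketch of how these two quantities are commonly computed, assuming the standard definitions (Hoyer's 2004 sparseness measure; the class-conditional mean activity selectivity of Morcos et al. for CCMAS). The function names, random activations, and labels are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def hoyer_sparsity(activations):
    """Hoyer (2004) sparsity of an activation vector a of length n:
    (sqrt(n) - ||a||_1 / ||a||_2) / (sqrt(n) - 1), ranging from 0 (dense) to 1 (sparse)."""
    a = np.asarray(activations, dtype=float)
    n = a.size
    l1, l2 = np.abs(a).sum(), np.linalg.norm(a)
    if l2 == 0:
        return 1.0  # treat an all-zero vector as maximally sparse
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)

def ccmas_selectivity(unit_activations, labels):
    """CCMAS selectivity of a single unit:
    (mu_max - mu_-max) / (mu_max + mu_-max), where mu_max is the unit's mean
    activation for its preferred class and mu_-max is the mean of its
    class-conditional mean activations over the remaining classes."""
    unit_activations = np.asarray(unit_activations, dtype=float)
    labels = np.asarray(labels)
    class_means = np.array([unit_activations[labels == c].mean()
                            for c in np.unique(labels)])
    preferred = class_means.argmax()
    mu_max = class_means[preferred]
    mu_minus_max = np.delete(class_means, preferred).mean()
    denom = mu_max + mu_minus_max
    return 0.0 if denom == 0 else (mu_max - mu_minus_max) / denom

# Illustrative usage on random hidden-layer activations (hypothetical data,
# shaped like a batch of MNIST samples passing through a 128-unit layer).
rng = np.random.default_rng(0)
acts = rng.random((256, 128))           # (samples, hidden units)
labels = rng.integers(0, 10, size=256)  # class labels 0-9
print(hoyer_sparsity(acts[0]))               # sparsity of one sample's hidden layer
print(ccmas_selectivity(acts[:, 0], labels)) # selectivity of hidden unit 0
```

In practice these metrics would be averaged over samples (for sparsity) and over hidden units (for selectivity) after training under each condition, e.g. for each optimizer, learning rate, or batch size being compared.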

