Need is All You Need: Homeostatic Neural Networks Adapt to Concept Shift

05/17/2022
by Kingson Man, et al.

In living organisms, homeostasis is the natural regulation of internal states aimed at maintaining conditions compatible with life. Typical artificial systems are not equipped with comparable regulatory features. Here, we introduce an artificial neural network that incorporates homeostatic features. Its own computing substrate is placed in a needful and vulnerable relation to the very objects over which it computes. For example, artificial neurons performing classification of MNIST digits or Fashion-MNIST articles of clothing may receive excitatory or inhibitory effects, which alter their own learning rate as a direct result of perceiving and classifying the digits. In this scenario, accurate recognition is desirable to the agent itself because it guides decisions to regulate its vulnerable internal states and functionality. Counterintuitively, the addition of vulnerability to a learner does not necessarily impair its performance. On the contrary, self-regulation in response to vulnerability confers benefits under certain conditions. We show that homeostatic design confers increased adaptability under concept shift, in which the relationships between labels and data change over time, and that the greatest advantages are obtained under the highest rates of shift. This necessitates the rapid un-learning of past associations and the re-learning of new ones. We also demonstrate the superior abilities of homeostatic learners in environments with dynamically changing rates of concept shift. Our homeostatic design exposes the artificial neural network's thinking machinery to the consequences of its own "thoughts", illustrating the advantage of putting one's own "skin in the game" to improve fluid intelligence.
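The abstract describes the mechanism only in prose; the toy sketch below illustrates one way such a homeostatic learner might be set up. It is not the authors' implementation: a plain softmax classifier's learning rate is modulated by a running estimate of its own classification accuracy (standing in for the "homeostatic" signal), and concept shift is simulated by periodically permuting the mapping between classes and labels. The data stream, hyperparameters, and variable names are all illustrative assumptions.

import numpy as np

# Minimal sketch (assumed setup, not the paper's code): a softmax classifier
# whose learning rate rises when its own recent accuracy drops, trained on a
# synthetic data stream whose label-data relationship shifts over time.

rng = np.random.default_rng(0)
n_features, n_classes = 20, 5
W = np.zeros((n_features, n_classes))

# Fixed class prototypes generating a toy stream (stand-in for MNIST digits).
prototypes = rng.normal(size=(n_classes, n_features))

base_lr = 0.5
recent_acc = 1.0                  # running estimate of the agent's accuracy
label_map = np.arange(n_classes)  # current concept: class -> label mapping

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

for step in range(5000):
    # Concept shift: every 1000 steps the label-data relationship changes.
    if step > 0 and step % 1000 == 0:
        label_map = rng.permutation(n_classes)

    true_class = rng.integers(n_classes)
    x = prototypes[true_class] + rng.normal(scale=0.5, size=n_features)
    y = label_map[true_class]

    p = softmax(x @ W)
    correct = (p.argmax() == y)

    # Homeostatic regulation: misclassification (an "inhibitory" outcome)
    # pushes the internal state away from its set point and raises plasticity;
    # sustained success lets the learning rate relax back toward baseline.
    recent_acc = 0.99 * recent_acc + 0.01 * float(correct)
    homeostatic_lr = base_lr * (1.0 + 4.0 * (1.0 - recent_acc))

    # Cross-entropy gradient step with the modulated learning rate.
    grad = np.outer(x, p)
    grad[:, y] -= x
    W -= homeostatic_lr * grad

    if step % 500 == 0:
        print(f"step {step:4d}  recent_acc {recent_acc:.2f}  lr {homeostatic_lr:.2f}")

Under this setup, each label permutation briefly lowers the running accuracy, which automatically boosts the learning rate and speeds the un-learning of stale associations; a fixed-learning-rate baseline can be recovered by pinning homeostatic_lr to base_lr.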


