Bridge Networks

06/15/2021
by   Wilkie Olin-Ammentorp, et al.

Despite rapid progress, current deep learning methods face a number of critical challenges. These include high energy consumption, catastrophic forgetting, dependence on global losses, and an inability to reason symbolically. By combining concepts from information bottleneck theory and vector-symbolic architectures, we propose and implement a novel information processing architecture, the 'Bridge network.' We show this architecture provides unique advantages which can address the problems of global losses and catastrophic forgetting. Furthermore, we argue that it provides a basis for more energy-efficient execution and for symbolic reasoning.
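The abstract does not detail the paper's construction, but the vector-symbolic architectures it builds on share a small core of operations: binding, bundling, and similarity over high-dimensional vectors. As an illustrative sketch only (this is the standard bipolar/MAP-style formulation of VSA primitives, not the Bridge network's specific implementation), the operations look like this:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality; high D keeps random vectors near-orthogonal

def random_hv():
    # Random bipolar (+1/-1) hypervector
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    # Element-wise multiplication: associates two vectors; self-inverse,
    # and the result is dissimilar to both inputs
    return a * b

def bundle(*vs):
    # Majority-sign superposition: result stays similar to each input
    return np.sign(np.sum(vs, axis=0)).astype(int)

def sim(a, b):
    # Normalized dot-product similarity in [-1, 1]
    return float(a @ b) / D

color, red = random_hv(), random_hv()
record = bind(color, red)        # encode the pair "color = red"
recovered = bind(record, color)  # binding again with `color` recovers `red`
```

Here `sim(recovered, red)` is exactly 1.0 (binding is its own inverse for bipolar vectors), while `sim(record, red)` is near zero, which is what lets such architectures store and query symbolic structure in a fixed-width vector.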

