CQural: A Novel CNN based Hybrid Architecture for Quantum Continual Machine Learning

05/16/2023
by Sanyam Jain, et al.

Training machine learning models incrementally is not only important but also an efficient path toward artificial general intelligence. Humans' capacity for continuous, lifelong learning helps them retain previously learned tasks. Current neural network models, however, are prone to catastrophic forgetting in continual learning settings. Many researchers have proposed techniques to reduce forgetting in neural networks, but these techniques have been studied almost exclusively in the classical setting, with comparatively little attention to changing the machine learning model architecture itself. In this research paper, we show that a novel hybrid classical-quantum neural network can not only circumvent catastrophic forgetting in continual learning, but also explain which features are most important to learn for classification. In addition, we claim that when the model is trained with these explanations, it tends to perform better and to learn specific features that lie far from the decision boundary. Finally, we present experimental results comparing classical and classical-quantum hybrid architectures on the benchmark MNIST and CIFAR-10 datasets. After successful runs of the learning procedure, we find that the hybrid neural network outperforms the classical one in remembering the right evidence for class-specific features.
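The abstract does not fix an implementation, but a common way to realize a hybrid classical-quantum CNN of this kind is to feed features from a classical convolutional backbone into a small variational quantum circuit whose expectation values drive the final classifier. Below is a minimal sketch assuming PennyLane's TorchLayer with PyTorch; the qubit count, layer sizes, and circuit template (AngleEmbedding plus BasicEntanglerLayers) are illustrative assumptions, not the authors' exact architecture.

```python
# Hedged sketch of a hybrid classical-quantum CNN (PennyLane + PyTorch).
# All sizes and circuit choices here are illustrative assumptions.
import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def quantum_circuit(inputs, weights):
    # Encode classical features as single-qubit rotation angles.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    # Trainable entangling layers act as the "quantum" part of the head.
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class HybridCNN(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        # Classical convolutional feature extractor (MNIST-sized input: 1x28x28).
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 7 * 7, n_qubits),
        )
        # Wrap the circuit as a Torch layer; PennyLane manages its weights.
        weight_shapes = {"weights": (2, n_qubits)}
        self.quantum = qml.qnn.TorchLayer(quantum_circuit, weight_shapes)
        self.classifier = nn.Linear(n_qubits, n_classes)

    def forward(self, x):
        x = self.features(x)       # classical features -> n_qubits values
        x = self.quantum(x)        # quantum expectation values in [-1, 1]
        return self.classifier(x)  # class logits

model = HybridCNN()
logits = model(torch.randn(8, 1, 28, 28))  # e.g., a batch of MNIST digits
```

In a continual-learning run, a model like this would be trained task by task; the explanation-guided training the abstract describes is a separate component not sketched here.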

Related research:
- 08/05/2021 · Quantum Continual Learning Overcoming Catastrophic Forgetting
- 03/26/2022 · Continual learning of quantum state classification with gradient episodic memory
- 02/21/2023 · Effects of Architectures on Continual Semantic Segmentation
- 07/15/2020 · SpaceNet: Make Free Space For Continual Learning
- 01/23/2020 · Structured Compression and Sharing of Representational Space for Continual Learning
- 01/04/2021 · CLeaR: An Adaptive Continual Learning Framework for Regression Tasks
- 06/28/2022 · Hebbian Continual Representation Learning
