Growing Representation Learning

10/17/2021
by Ryan King, et al.

Machine learning continues to grow in popularity due to its ability to learn increasingly complex tasks. However, for many supervised models, a shift in the data distribution or the appearance of a new event can cause a severe drop in model performance. Retraining a model from scratch with updated data can be resource intensive or impossible, depending on the constraints placed on an organization or system. Continual learning methods attempt to adapt models to new classes instead of retraining, but many of these methods lack a detection mechanism for new classes or make assumptions about the distribution of classes. In this paper, we develop an attention-based Gaussian mixture, called GMAT, that learns interpretable representations of data with or without labels. We combine this method with existing Neural Architecture Search techniques to produce an algorithm that detects new events and finds an optimal number of representations through an iterative process of training and growing. We show that our method is capable of learning new representations of data without labels or assumptions about the distribution of labels. We additionally develop a method that allows our model to use labels to build representations more accurately. Lastly, we show that our method can avoid catastrophic forgetting by replaying samples from learned representations.
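The abstract outlines three mechanisms: attention-style soft assignment of samples to Gaussian components, growth of a new component when no existing one explains an incoming sample (new-event detection), and replay by sampling from learned components. The paper's code is not reproduced here, so the following is a minimal NumPy sketch of that general idea only; the class name, the grow_threshold log-likelihood cutoff, and the online mean-update rule are illustrative assumptions, not the authors' GMAT implementation.

import numpy as np

class GrowingGaussianMixture:
    """Toy growing diagonal-Gaussian mixture (illustrative, not GMAT)."""

    def __init__(self, dim, grow_threshold=-12.0, lr=0.05):
        self.dim = dim
        self.grow_threshold = grow_threshold  # assumed cutoff for "new event"
        self.lr = lr
        self.means = np.empty((0, dim))
        self.log_vars = np.empty((0, dim))

    def log_likelihood(self, x):
        # Per-component log N(x | mean_k, diag(var_k)).
        var = np.exp(self.log_vars)
        return -0.5 * np.sum((x - self.means) ** 2 / var
                             + self.log_vars + np.log(2 * np.pi), axis=1)

    def update(self, x):
        if self.means.shape[0] > 0:
            ll = self.log_likelihood(x)
        if self.means.shape[0] == 0 or ll.max() < self.grow_threshold:
            # No component explains x well: treat it as a new event and grow.
            self.means = np.vstack([self.means, x])
            self.log_vars = np.vstack([self.log_vars, np.zeros(self.dim)])
            return
        # Softmax responsibilities stand in for the attention weights.
        resp = np.exp(ll - ll.max())
        resp /= resp.sum()
        # Move each mean toward x in proportion to its responsibility.
        self.means += self.lr * resp[:, None] * (x - self.means)

    def replay(self, n_per_component=10):
        # Draw rehearsal samples from every learned component.
        std = np.exp(0.5 * self.log_vars)
        return np.vstack([m + std[i] * np.random.randn(n_per_component, self.dim)
                          for i, m in enumerate(self.means)])

A short usage example: streaming two well-separated distributions past the model should leave it with roughly one component per distribution, and replay() then supplies rehearsal data for the earlier distribution.

rng = np.random.default_rng(0)
gmm = GrowingGaussianMixture(dim=2)
stream = np.vstack([rng.normal(size=(200, 2)),         # initial distribution
                    rng.normal(size=(200, 2)) + 8.0])  # shifted "new event"
for x in stream:
    gmm.update(x)
print("components grown:", gmm.means.shape[0])  # expect ~2
rehearsal = gmm.replay(n_per_component=5)       # samples to mitigate forgetting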
