Efficient Model Adaptation for Continual Learning at the Edge

08/03/2023
by Zachary A. Daniels, et al.

Most machine learning (ML) systems assume stationary, matching data distributions during training and deployment. This assumption is often false. When ML models are deployed on real devices, data distributions often shift over time due to changes in environmental factors, sensor characteristics, and the task of interest. While it is possible to have a human in the loop to monitor for distribution shifts and engineer new architectures in response to them, such a setup is not cost-effective. Instead, non-stationary automated ML (AutoML) models are needed. This paper presents the Encoder-Adaptor-Reconfigurator (EAR) framework for efficient continual learning under domain shifts. The EAR framework uses a fixed deep neural network (DNN) feature encoder and trains shallow networks on top of the encoder to handle novel data. The EAR framework is capable of 1) detecting when new data is out-of-distribution (OOD) by combining DNNs with hyperdimensional computing (HDC), 2) identifying low-parameter neural adaptors to adapt the model to the OOD data using zero-shot neural architecture search (ZS-NAS), and 3) minimizing catastrophic forgetting on previous tasks by progressively growing the neural architecture as needed and dynamically routing data through the appropriate adaptors and reconfigurators to handle domain-incremental and class-incremental continual learning. We systematically evaluate our approach on several benchmark datasets for domain adaptation and demonstrate strong performance compared to state-of-the-art algorithms for OOD detection and few-/zero-shot NAS.
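To make the HDC-based OOD detection idea concrete, the following is a minimal illustrative sketch (not the authors' implementation; all names, dimensions, and thresholds here are assumptions): features from a frozen encoder are projected into high-dimensional bipolar hypervectors, one prototype hypervector is bundled per in-distribution class, and a query is flagged as OOD when its best cosine similarity to any prototype falls below a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
D_FEAT, D_HV = 16, 2048  # encoder feature dim and hypervector dim (assumed values)

# A fixed random projection stands in for the HDC encoding step.
PROJ = rng.standard_normal((D_FEAT, D_HV))

def encode(x):
    """Map encoder features to bipolar hypervectors via random projection + sign."""
    return np.sign(x @ PROJ)

def build_prototypes(features, labels):
    """Bundle (sum) hypervectors per class, then binarize to a bipolar prototype."""
    return {c: np.sign(encode(features[labels == c]).sum(axis=0))
            for c in np.unique(labels)}

def ood_score(x, protos):
    """Max cosine similarity to any class prototype; low values suggest OOD."""
    hv = encode(x)
    return max(hv @ p / (np.linalg.norm(hv) * np.linalg.norm(p))
               for p in protos.values())

# Toy in-distribution data: two well-separated Gaussian clusters.
X = np.vstack([rng.standard_normal((50, D_FEAT)) + 5.0,
               rng.standard_normal((50, D_FEAT)) - 5.0])
y = np.array([0] * 50 + [1] * 50)
protos = build_prototypes(X, y)

in_dist = ood_score(rng.standard_normal(D_FEAT) + 5.0, protos)   # near cluster 0
far_ood = ood_score(rng.standard_normal(D_FEAT) * 30.0, protos)  # unlike either cluster
print(in_dist > far_ood)  # the in-distribution query scores higher
```

In a full pipeline along the lines the abstract describes, a low `ood_score` would trigger the ZS-NAS step to attach a new shallow adaptor on top of the frozen encoder, so only the small adaptor is trained for the shifted domain while previous prototypes and adaptors are left untouched.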
