Sequential Changepoint Detection in Neural Networks with Checkpoints

10/06/2020
by Michalis K. Titsias, et al.

We introduce a framework for online changepoint detection and simultaneous model learning that is applicable to highly parametrized models, such as deep neural networks. It detects changepoints across time by sequentially performing generalized likelihood ratio tests that require only evaluations of simple prediction score functions. The procedure makes use of checkpoints, consisting of early versions of the actual model parameters, which allow distributional changes to be detected by making predictions on future data. We define an algorithm that bounds the Type I error of the sequential testing procedure. We demonstrate the efficiency of our method in challenging continual learning applications with unknown task changepoints, and show improved performance compared to online Bayesian changepoint detection.
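The abstract only outlines the procedure, so the following is a minimal illustrative sketch of a checkpoint-based detection loop, assuming a PyTorch model trained online: a lagged copy of the parameters serves as the checkpoint, a user-supplied score function (e.g., per-example log-likelihood) is evaluated under both the current model and the checkpoint, and a GLR-style cumulative gap between the two score sequences is compared to a threshold. All names and default values here (`score_fn`, `window`, `checkpoint_lag`, `threshold`) are hypothetical stand-ins, not taken from the paper.

```python
import copy
import numpy as np
import torch

def detect_changepoints(model, data_stream, score_fn, window=50,
                        checkpoint_lag=100, threshold=5.0):
    """Online loop that keeps a lagged checkpoint of the model parameters
    and tests whether recent prediction scores under the current model
    diverge from those under the checkpoint (a GLR-style statistic).
    Yields the time index of each detected changepoint."""
    checkpoint = copy.deepcopy(model)          # early version of the parameters
    current_scores, checkpoint_scores = [], []
    for t, (x, y) in enumerate(data_stream):
        with torch.no_grad():
            current_scores.append(float(score_fn(model, x, y)))
            checkpoint_scores.append(float(score_fn(checkpoint, x, y)))
        # keep only a sliding window of recent scores
        current_scores = current_scores[-window:]
        checkpoint_scores = checkpoint_scores[-window:]
        # cumulative gap between current-model and checkpoint scores
        stat = float(np.sum(np.array(current_scores)
                            - np.array(checkpoint_scores)))
        if len(current_scores) == window and stat > threshold:
            yield t                               # changepoint detected
            checkpoint = copy.deepcopy(model)     # restart from a fresh checkpoint
            current_scores, checkpoint_scores = [], []
        elif t > 0 and t % checkpoint_lag == 0:
            checkpoint = copy.deepcopy(model)     # refresh the lagging checkpoint
        # (one online training step of `model` on (x, y) would go here)
```

In the paper the test threshold is calibrated so that the Type I error of the sequential testing procedure is bounded; the fixed constant above is only a placeholder for that calibration.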


Related research

02/18/2019 · A Unifying Bayesian View of Continual Learning
Some machine learning applications require continual learning - where da...

02/10/2021 · Sequential change-point detection for mutually exciting point processes over networks
We present a new CUSUM procedure for sequentially detecting change-point...

08/02/2019 · Toward Understanding Catastrophic Forgetting in Continual Learning
We study the relationship between catastrophic forgetting and properties...

10/22/2020 · Drift Detection in Episodic Data: Detect When Your Agent Starts Faltering
Detection of deterioration of agent performance in dynamic environments ...

11/24/2020 · Generalized Variational Continual Learning
Continual learning deals with training models on new tasks and datasets ...

10/09/2020 · Linear Mode Connectivity in Multitask and Continual Learning
Continual (sequential) training and multitask (simultaneous) training ar...
