Meta-Learning with Less Forgetting on Large-Scale Non-Stationary Task Distributions

09/03/2022
by   Zhenyi Wang, et al.
The paradigm of machine intelligence is moving from purely supervised learning to a more practical scenario in which abundant loosely related unlabeled data are available and labeled data are scarce. Most existing algorithms assume that the underlying task distribution is stationary. Here we consider a more realistic and challenging setting in which task distributions evolve over time. We name this problem Semi-supervised meta-learning with Evolving Task diStributions, abbreviated as SETS. Two key challenges arise in this more realistic setting: (i) how to use unlabeled data in the presence of a large amount of unlabeled out-of-distribution (OOD) data; and (ii) how to prevent catastrophic forgetting of previously learned task distributions due to task distribution shift. We propose an OOD Robust and knowleDge presErved semi-supeRvised meta-learning approach (ORDER) to tackle these two major challenges. Specifically, ORDER introduces a novel mutual information regularization to robustify the model against unlabeled OOD data and adopts an optimal transport regularization to retain previously learned knowledge in feature space. In addition, we evaluate our method on a very challenging benchmark: SETS on large-scale non-stationary semi-supervised task distributions consisting of (at least) 72K tasks. Through extensive experiments, we demonstrate that the proposed ORDER alleviates forgetting on evolving task distributions and is more robust to OOD data than strong related baselines.
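The abstract mentions an optimal transport regularizer that keeps features of new tasks close to previously learned ones. The paper's exact formulation is not reproduced here; as a hedged illustration only, a minimal sketch of an entropy-regularized optimal transport (Sinkhorn) distance between an "old" and a "new" batch of features, which could serve as such a penalty term, might look like the following. The function name `sinkhorn_distance`, the uniform weights, and the `eps`/iteration settings are assumptions for the sketch, not the authors' implementation.

```python
import numpy as np

def sinkhorn_distance(X, Y, eps=0.5, n_iters=200):
    """Entropy-regularized OT cost between point clouds X (n x d)
    and Y (m x d), with uniform marginal weights.

    Returns <P, C>, where P is the Sinkhorn transport plan and C is
    the pairwise squared-Euclidean cost matrix.
    """
    n, m = len(X), len(Y)
    # Pairwise squared-Euclidean costs: C[i, j] = ||x_i - y_j||^2.
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    K = np.exp(-C / eps)                      # Gibbs kernel
    a = np.full(n, 1.0 / n)                   # uniform source weights
    b = np.full(m, 1.0 / m)                   # uniform target weights
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):                  # Sinkhorn fixed-point updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]           # transport plan
    return float((P * C).sum())

# Illustrative use as a forgetting penalty: compare features of the
# current model on a replayed batch against stored "old" features,
# and add the distance to the meta-learning loss (weight is hypothetical).
rng = np.random.default_rng(0)
old_feats = rng.normal(size=(20, 4))
new_feats = old_feats + 0.1 * rng.normal(size=(20, 4))
ot_penalty = sinkhorn_distance(old_feats, new_feats)
```

A Sinkhorn-style approximation is a common practical choice because exact optimal transport is expensive, and the entropic version is differentiable and GPU-friendly; whether ORDER uses this particular solver is not stated in the abstract.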

