Dynamically Addressing Unseen Rumor via Continual Learning

by Nayeon Lee et al.

Rumors are often associated with newly emerging events, so the ability to handle unseen rumors is crucial for a rumor veracity classification model. Previous works address this issue by improving the model's generalizability, under the assumption that the model remains unchanged even after a new event breaks out. In this work, we propose an alternative solution: continuously updating the model in step with the emergence of new rumor domains. The biggest technical challenge of this approach is catastrophic forgetting, where learning from new data erases what was learned before. We adopt continual learning strategies that constrain the new updates to avoid catastrophic forgetting, and propose an additional strategy that can be used jointly to further alleviate forgetting.
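To make the "constrain the new updates" idea concrete, a minimal sketch of one common continual-learning strategy is shown below: an Elastic Weight Consolidation (EWC)-style quadratic penalty that anchors parameters deemed important for the old domain while the model fits a new one. The toy loss, parameter values, and `fisher` importance weights are illustrative assumptions, not taken from the paper; the paper's actual strategies may differ.

```python
def train(theta_init, theta_new, theta_old, fisher, lam, lr=0.1, steps=500):
    """Gradient descent on a toy new-task loss 0.5 * ||theta - theta_new||^2
    plus an EWC-style penalty (lam / 2) * sum_i F_i * (theta_i - theta_old_i)^2,
    which pulls each parameter back toward its old-task value in proportion
    to its estimated importance F_i."""
    theta = list(theta_init)
    for _ in range(steps):
        for i in range(len(theta)):
            g = (theta[i] - theta_new[i]) + lam * fisher[i] * (theta[i] - theta_old[i])
            theta[i] -= lr * g
    return theta

# Hypothetical setup: parameter 0 was important for the old rumor domain
# (high Fisher weight), parameter 1 was not.
theta_old = [1.0, 1.0]    # parameters after training on the old domain
theta_new = [-1.0, 2.0]   # optimum of the new domain's loss
fisher = [5.0, 0.1]       # per-parameter importance for the old domain

plain = train(theta_old, theta_new, theta_old, fisher, lam=0.0)
ewc = train(theta_old, theta_new, theta_old, fisher, lam=1.0)

# Without the penalty, both parameters move all the way to the new optimum;
# with it, the important parameter stays close to its old value (~0.67 vs -1.0)
# while the unimportant one is nearly free to adapt (~1.91 vs 2.0).
print(plain, ewc)
```

The per-parameter weighting is what distinguishes this from naive L2 regularization toward the old weights: capacity that mattered for previous domains is protected, while the rest remains plastic for the new one.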




Related papers:

- Does Continual Learning = Catastrophic Forgetting?
- Continual learning: a feature extraction formalization, an efficient algorithm, and fundamental obstructions
- Technical Report for ICCV 2021 Challenge SSLAD-Track3B: Transformers Are Better Continual Learners
- Continual Learning For On-Device Environmental Sound Classification
- Continual Learning for Human State Monitoring
- Improving Pedestrian Prediction Models with Self-Supervised Continual Learning
- Transfer without Forgetting