Continual BERT: Continual Learning for Adaptive Extractive Summarization of COVID-19 Literature

07/07/2020
by Jong Won Park, et al.

The scientific community continues to publish an overwhelming amount of new COVID-19 research on a daily basis, so much of this literature receives little to no attention. To help the community keep up with the rapidly growing body of COVID-19 literature, we propose a novel BERT architecture that provides a brief yet original summarization of lengthy papers. The model continually learns from new data in an online fashion while minimizing catastrophic forgetting, fitting the needs of the community. Benchmarks and manual examination of its performance show that the model provides sound summaries of new scientific literature.
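The abstract describes extractive summarization with a BERT encoder that is updated online while limiting catastrophic forgetting. As a minimal sketch only, and not the paper's confirmed architecture, the example below scores individual sentences with a BERT [CLS] representation and adds an EWC-style quadratic penalty (a common continual-learning technique) so that online updates on new papers do not overwrite parameters important for earlier data. The `SentenceScorer` class, the `ewc_penalty` helper, the toy sentences and labels, and the choice of EWC are all illustrative assumptions.

```python
# Illustrative sketch, not the paper's exact method: BERT-based sentence scoring
# for extractive summarization, plus an EWC-style penalty for online updates.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class SentenceScorer(nn.Module):
    """Scores each sentence's summary-worthiness from its [CLS] representation."""

    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # One scalar score per sentence, taken from the [CLS] token.
        return self.classifier(hidden[:, 0, :]).squeeze(-1)


def ewc_penalty(model, fisher, old_params, lam=0.4):
    # Quadratic penalty anchoring parameters that were important (high Fisher
    # value) for previously seen literature; `fisher` and `old_params` are
    # snapshots taken after the previous training phase.
    loss = 0.0
    for name, p in model.named_parameters():
        if name in fisher:
            loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return lam * loss


# Example online update on a new batch of sentences (1 = include in summary).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = SentenceScorer()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

sentences = ["SARS-CoV-2 spreads primarily through respiratory droplets.",
             "We thank the hospital staff for their assistance."]
labels = torch.tensor([1.0, 0.0])  # toy labels for illustration only

batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
scores = model(batch["input_ids"], batch["attention_mask"])
task_loss = nn.functional.binary_cross_entropy_with_logits(scores, labels)

# Snapshots from a previous phase would normally be precomputed; empty dicts
# keep this example self-contained (no penalty is applied on the first phase).
fisher, old_params = {}, {}
loss = task_loss + ewc_penalty(model, fisher, old_params)
loss.backward()
optimizer.step()
```

In later phases, `fisher` and `old_params` would be refreshed after each training round so that subsequent online updates are pulled back toward parameters that mattered for earlier literature.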


