COVID-19: Comparative Analysis of Methods for Identifying Articles Related to Therapeutics and Vaccines without Using Labeled Data

01/05/2021
by Mihir Parmar, et al.

Here we propose an approach for analyzing text classification methods based on the presence or absence of task-specific terms (and their synonyms) in the text. We applied this approach to study six different transfer-learning and unsupervised methods for screening articles relevant to COVID-19 vaccines and therapeutics. The analysis revealed that while a BERT model trained on search-engine results generally performed well, it misclassified relevant abstracts that did not contain task-specific terms. We used this insight to build a more effective unsupervised ensemble.
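To make the core idea concrete, here is a minimal sketch (in Python) of the kind of term-presence analysis the abstract describes: partition the screened abstracts by whether they contain any task-specific term or synonym, then compare a classifier's accuracy on each partition. The term list, function names, and data format are illustrative assumptions, not the paper's actual implementation.

```python
import re

# Hypothetical task-specific terms (and synonyms) for the COVID-19
# vaccines/therapeutics screening task; the paper's actual term lists
# are not reproduced here.
TASK_TERMS = ["vaccine", "vaccination", "immunization",
              "therapeutic", "treatment", "drug", "antiviral"]

def contains_task_term(text, terms=TASK_TERMS):
    """Return True if any task-specific term (or synonym) appears in the text."""
    pattern = r"\b(" + "|".join(map(re.escape, terms)) + r")\b"
    return re.search(pattern, text.lower()) is not None

def accuracy_by_term_presence(texts, predictions, labels):
    """Split examples by term presence and report accuracy on each partition,
    exposing failure modes such as relevant abstracts that lack task terms."""
    groups = {"with_terms": [], "without_terms": []}
    for text, pred, gold in zip(texts, predictions, labels):
        key = "with_terms" if contains_task_term(text) else "without_terms"
        groups[key].append(pred == gold)
    return {k: sum(v) / len(v) if v else None for k, v in groups.items()}

# Toy usage (illustrative data only): the second abstract is relevant but
# names a specific vaccine rather than any generic task term.
texts = ["A novel antiviral for COVID-19.",
         "Phase III trial results for BNT162b2."]
preds = [1, 0]
golds = [1, 1]
print(accuracy_by_term_presence(texts, preds, golds))
# -> {'with_terms': 1.0, 'without_terms': 0.0}
```

A partition whose accuracy lags the other, as in the toy output above, flags exactly the failure mode the abstract reports for the search-engine-trained BERT model.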

research
05/23/2019

An Investigation of Transfer Learning-Based Sentiment Analysis in Japanese

Text classification approaches have usually required task-specific model...

research
07/12/2021

COPER: a Query-adaptable Semantics-based Search Engine for Persian COVID-19 Articles

With the surge of pretrained language models, a new pathway has been ope...

research
04/24/2020

Target specific mining of COVID-19 scholarly articles using one-class approach

In recent years, several research articles have been published in the fi...

research
12/10/2019

Unsupervised Transfer Learning via BERT Neuron Selection

Recent advancements in language representation models such as BERT have ...

research
09/16/2022

Comprehensive identification of Long Covid articles with human-in-the-loop machine learning

A significant percentage of COVID-19 survivors experience ongoing multis...

research
04/13/2020

Cascade Neural Ensemble for Identifying Scientifically Sound Articles

Background: A significant barrier to conducting systematic reviews and m...

research
02/15/2021

Identifying Misinformation from Website Screenshots

Can the look and the feel of a website give information about the trustw...
