Polytuplet Loss: A Reverse Approach to Training Reading Comprehension and Logical Reasoning Models

04/03/2023
by Jeffrey Lu, et al.

Throughout schooling, students are tested on reading comprehension and logical reasoning. Students have developed various strategies for completing such exams, some of which are generally thought to outperform others. One such strategy involves emphasizing relative accuracy over absolute accuracy and can theoretically produce the correct answer without full knowledge of the information required to solve the question. This paper examines the effectiveness of applying such a strategy to train transfer learning models to solve reading comprehension and logical reasoning questions. The models were evaluated on the ReClor dataset, a challenging reading comprehension and logical reasoning benchmark. While previous studies targeted logical reasoning skills, we focus on a general training method and model architecture. We propose the polytuplet loss function, an extension of the triplet loss function, to ensure prioritization of learning the relative correctness of answer choices over learning the true accuracy of each choice. Our results indicate that models employing polytuplet loss outperform existing baseline models. Although polytuplet loss is a promising alternative to other contrastive loss functions, further research is required to quantify the benefits it may present.
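
The abstract does not state the exact formulation of the polytuplet loss, so the sketch below is only one plausible reading of "an extension of the triplet loss function": the context-plus-question embedding serves as the anchor, the correct answer choice as the positive, and the remaining answer choices as negatives, with one hinge term per negative. The function name polytuplet_loss, the Euclidean distance, and the margin value are illustrative assumptions, not the paper's definition.

import torch
import torch.nn.functional as F

def polytuplet_loss(anchor, positive, negatives, margin=1.0):
    """Multi-negative, triplet-style contrastive loss (assumed form).

    anchor:    (batch, dim) embedding of the context + question
    positive:  (batch, dim) embedding of the correct answer choice
    negatives: (batch, n_neg, dim) embeddings of the incorrect choices
    """
    # Distance from the anchor to the correct choice: (batch,)
    pos_dist = F.pairwise_distance(anchor, positive)
    # Distances from the anchor to each incorrect choice: (batch, n_neg)
    neg_dist = torch.cdist(anchor.unsqueeze(1), negatives).squeeze(1)
    # One hinge per incorrect choice: the correct choice must be closer to the
    # anchor than every incorrect choice by at least `margin`, which rewards
    # relative rather than absolute correctness.
    losses = F.relu(pos_dist.unsqueeze(1) - neg_dist + margin)
    return losses.sum(dim=1).mean()

# Example usage with random embeddings (4 answer choices, so 3 negatives):
# anchor    = torch.randn(8, 256)
# positive  = torch.randn(8, 256)
# negatives = torch.randn(8, 3, 256)
# loss = polytuplet_loss(anchor, positive, negatives)

Summing a hinge over all incorrect choices is the most direct multi-negative generalization of the triplet loss; the paper's actual polytuplet formulation may weight or aggregate the terms differently.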
