Table-based Fact Verification with Self-adaptive Mixture of Experts

04/19/2022
by   Yuxuan Zhou, et al.

The table-based fact verification task has recently attracted widespread attention, yet it remains a very challenging problem. It inherently requires informative reasoning over natural language together with different kinds of numerical and logical reasoning on tables (e.g., count, superlative, comparative). Considering this, we exploit mixture-of-experts and present in this paper a new method: the Self-adaptive Mixture-of-Experts Network (SaMoE). Specifically, we develop a mixture-of-experts neural network to recognize and execute different types of reasoning: the network is composed of multiple experts, each handling a specific part of the semantics for reasoning, while a management module decides the contribution of each expert network to the verification result. A self-adaptive method is developed to teach the management module to combine the results of different experts more efficiently, without requiring external knowledge. Experimental results show that our framework achieves 85.1% accuracy on TabFact, comparable with the previous state-of-the-art models. We hope our framework can serve as a new baseline for table-based fact verification. Our code is available at https://github.com/THUMLP/SaMoE.
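The gating mechanism the abstract describes — multiple experts whose outputs are combined by a learned management module — can be sketched generically as follows. This is a minimal NumPy illustration of mixture-of-experts gating, not the authors' implementation: the linear experts, the random parameters, and the class name `MoE` are placeholders standing in for SaMoE's reasoning experts and management module.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MoE:
    """Toy mixture-of-experts: each expert is a linear scorer; a gating
    (management) module assigns each expert a softmax weight, and the
    final score is the weighted sum of expert outputs."""

    def __init__(self, dim, n_experts):
        # Random placeholder parameters; in practice these are trained.
        self.experts = [rng.normal(size=(dim,)) for _ in range(n_experts)]
        self.gate = rng.normal(size=(dim, n_experts))

    def forward(self, x):
        # Management module: per-expert contribution weights (sum to 1).
        weights = softmax(x @ self.gate)
        # Each expert scores the input independently.
        outputs = np.array([x @ w for w in self.experts])
        # Verification score = gate-weighted combination of expert scores.
        return weights @ outputs, weights

moe = MoE(dim=8, n_experts=4)
x = rng.normal(size=(8,))
score, weights = moe.forward(x)
```

In SaMoE the experts are full neural sub-networks specialized for different reasoning types rather than linear maps, but the combination pattern — softmax gate over expert outputs — is the same.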


Related research:
12/20/2022

Toward a Unified Framework for Unsupervised Complex Tabular Reasoning

Structured tabular data exist across nearly all fields. Reasoning task o...
09/05/2019

TabFact: A Large-scale Dataset for Table-based Fact Verification

The problem of verifying whether a textual hypothesis holds the truth ba...
12/19/2022

Large Language Models are reasoners with Self-Verification

When a large language model (LLM) performs complex reasoning by chain of...
09/09/2021

Table-based Fact Verification with Salience-aware Learning

Tables provide valuable knowledge that can be used to verify textual sta...
03/04/2023

Prismer: A Vision-Language Model with An Ensemble of Experts

Recent vision-language models have shown impressive multi-modal generati...
03/07/2023

LORE: Logical Location Regression Network for Table Structure Recognition

Table structure recognition (TSR) aims at extracting tables in images in...
09/11/2023

Pushing Mixture of Experts to the Limit: Extremely Parameter Efficient MoE for Instruction Tuning

The Mixture of Experts (MoE) is a widely known neural architecture where...
