Matching Text with Deep Mutual Information Estimation

03/09/2020
by Xixi Zhou, et al.

Text matching is a core research problem in natural language processing. One key challenge is retaining sufficient information about both the content and the structure of the texts being compared. In this paper, we present a neural approach to general-purpose text matching that incorporates deep mutual information estimation. Our approach, Text matching with Deep Info Max (TIM), integrates a procedure for unsupervised representation learning that maximizes the mutual information between the text matching network's input and output, using both global and local mutual information to learn text representations. We evaluate our approach on several tasks, including natural language inference, paraphrase identification, and answer selection. Compared with state-of-the-art approaches, our experiments show that the method integrated with mutual information estimation learns better text representations and achieves better text matching results, without pretraining on external data.
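The abstract describes maximizing mutual information between a network's input and output representations, in the spirit of Deep InfoMax. As a minimal sketch of the general idea (not the paper's exact estimator), the following numpy snippet computes the InfoNCE lower bound on mutual information between paired representations, where each row pair is a positive and all other rows in the batch serve as negatives:

```python
import numpy as np

def infonce_lower_bound(x, y):
    """InfoNCE lower bound on I(X; Y) for a batch of paired samples.

    x, y: (N, d) arrays of representations; row i of x is the positive
    partner of row i of y, and the other N-1 rows act as negatives.
    Returns a scalar lower-bound estimate (at most log N).
    """
    scores = x @ y.T  # (N, N) pairwise similarity scores
    n = scores.shape[0]
    # Row-wise log-softmax; the diagonal entries are the positive pairs.
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(np.log(n) + np.diag(log_probs).mean())

# Correlated pairs should yield a higher bound than independent ones.
rng = np.random.default_rng(0)
z = rng.normal(size=(256, 8))
paired = infonce_lower_bound(z, z + 0.1 * rng.normal(size=z.shape))
indep = infonce_lower_bound(z, rng.normal(size=(256, 8)))
```

In a training loop, an encoder would be optimized to increase this bound, pulling representations of matched inputs together; the bound saturates at log N, so larger batches allow tighter estimates of high mutual information.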

