Pointwise HSIC: A Linear-Time Kernelized Co-occurrence Norm for Sparse Linguistic Expressions

09/04/2018
by Sho Yokoi, et al.

In this paper, we propose a new kernel-based co-occurrence measure that can be applied to sparse linguistic expressions (e.g., sentences) with a very short learning time, as an alternative to pointwise mutual information (PMI). Just as PMI is derived from mutual information, we derive the new measure from the Hilbert--Schmidt independence criterion (HSIC); we therefore call it the pointwise HSIC (PHSIC). PHSIC can be interpreted as a smoothed variant of PMI into which various similarity metrics (e.g., sentence embeddings) can be plugged as kernels. Moreover, PHSIC can be estimated by simple and fast matrix calculations, linear in the size of the data, regardless of whether the kernels are linear or nonlinear. Empirically, on a dialogue response selection task, PHSIC is learned thousands of times faster than an RNN-based PMI model while outperforming it in accuracy. We also demonstrate that PHSIC is beneficial as a criterion for data selection in machine translation, owing to its ability to assign high (low) scores to pairs that are consistent (inconsistent) with the other pairs in the data.
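To make the "linear in the size of the data" claim concrete, here is a minimal sketch of a PHSIC-style scorer under the simplest setting: linear kernels on precomputed sentence embeddings, where the score of a pair reduces to a bilinear form through the empirical cross-covariance of the centered embeddings. This is an illustrative simplification, not the paper's full estimator (which also covers nonlinear kernels via low-rank approximations); the function and variable names are invented for this sketch.

```python
import numpy as np

def phsic_linear(X, Y):
    """Fit a linear-kernel PHSIC-style scorer from paired embeddings.

    X: (n, d)  embeddings of the first element of each pair
    Y: (n, d') embeddings of the second element
    Returns score(x, y) -> float. Fitting costs one pass over the
    data (centering plus a d x d' cross-covariance), i.e. linear in n.
    """
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    C = (X - mx).T @ (Y - my) / len(X)  # empirical cross-covariance
    def score(x, y):
        # bilinear form: high when (x, y) co-varies like observed pairs
        return float((x - mx) @ C @ (y - my))
    return score

# toy usage: y embeddings depend linearly on x embeddings plus noise,
# so genuine pairs should score higher on average than shuffled ones
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
Y = X @ rng.normal(size=(8, 8)) + 0.1 * rng.normal(size=(1000, 8))
score = phsic_linear(X, Y)
```

In this linear case the scorer rewards pairs whose centered embeddings align with the dominant co-occurrence directions seen during fitting, which is the "smoothed PMI" intuition from the abstract: unseen but consistent pairs still receive high scores.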
