A Quantum Many-body Wave Function Inspired Language Modeling Approach

08/28/2018
by   Peng Zhang, et al.

The recently proposed quantum language model (QLM) aims at a principled approach to modeling term dependency by applying quantum probability theory. The latest development toward a more effective QLM has adopted word embeddings as a kind of global dependency information and integrated the quantum-inspired idea into a neural network architecture. While these quantum-inspired LMs are theoretically more general and practically effective, they have two major limitations. First, they do not take into account the interaction among words with multiple meanings, which is common and important in understanding natural language text. Second, the integration of the quantum-inspired LM with the neural network was mainly for effective training of parameters, and lacked a theoretical foundation accounting for such integration. To address these two issues, in this paper we propose a Quantum Many-body Wave Function (QMWF) inspired language modeling approach. The QMWF-inspired LM can adopt the tensor product to model the aforesaid interaction among words. It also enables us to reveal the inherent necessity of using a Convolutional Neural Network (CNN) in QMWF language modeling. Furthermore, our approach yields a simple algorithm to represent and match text/sentence pairs. Systematic evaluation on three typical Question Answering (QA) datasets shows the effectiveness of the proposed QMWF-LM algorithm in comparison with state-of-the-art quantum-inspired LMs and several CNN-based methods.
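The tensor-product idea in the abstract can be sketched numerically. In the following minimal example (an illustrative assumption, not the paper's implementation: the basis size, random word states, and the three-word sentence are all hypothetical), each word is a unit vector over a small basis of "meanings", a sentence's joint state is the tensor product of its word states, and the overlap between two such product states factorizes into a product of word-level inner products — the kind of local computation a CNN layer can realize:

```python
import numpy as np

rng = np.random.default_rng(0)

def word_state(m):
    """A word as a unit superposition over m semantic basis states."""
    v = rng.normal(size=m)
    return v / np.linalg.norm(v)

m = 4                                   # hypothetical number of meanings
sentence = [word_state(m) for _ in range(3)]

# Joint (product-state) wave function of the 3-word sentence:
# a rank-1 tensor of shape (m, m, m).
psi = sentence[0]
for w in sentence[1:]:
    psi = np.tensordot(psi, w, axes=0)

query = [word_state(m) for _ in range(3)]
phi = query[0]
for w in query[1:]:
    phi = np.tensordot(phi, w, axes=0)

# Overlap of two product states equals the product of the
# word-by-word inner products, so matching can be done locally.
overlap = np.vdot(psi.ravel(), phi.ravel())
prod_of_inner = np.prod([np.dot(a, b) for a, b in zip(sentence, query)])
assert np.isclose(overlap, prod_of_inner)
print(psi.shape)  # (4, 4, 4)
```

Note that the full tensor grows exponentially with sentence length, which is why the paper's actual method relies on a CNN-based decomposition rather than materializing the joint state.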


