LSTM-based Mixture-of-Experts for Knowledge-Aware Dialogues

05/05/2016
by Phong Le, et al.

We introduce an LSTM-based method for dynamically integrating several word-prediction experts to obtain a conditional language model that can perform well on several subtasks simultaneously. We illustrate this general approach with an application to dialogue, where we integrate a neural chat model, good at conversational aspects, with a neural question-answering model, good at retrieving precise information from a knowledge base, and show how the integration combines the strengths of the independent components. We hope that this focused contribution will attract attention to the benefits of using such mixtures of experts in NLP.
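As a concrete illustration of the idea in the abstract, here is a minimal sketch in PyTorch. It is not the authors' code, and all class, layer, and parameter names are hypothetical: an LSTM reads the dialogue context and emits a per-step mixing weight that interpolates the next-word distributions of two stub experts standing in for the chat and question-answering models.

    # Minimal sketch (assumed architecture, not the paper's implementation):
    # an LSTM over the context produces a scalar gate alpha_t per step, and
    # the output distribution is a convex combination of two experts'
    # next-word distributions. Both experts are stubbed as linear heads.
    import torch
    import torch.nn as nn

    class MixtureOfExpertsLM(nn.Module):
        def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.gate = nn.Linear(hidden_dim, 1)  # scalar mixing weight per step
            # Stand-ins for the two experts: a chat LM head and a KB QA head.
            self.chat_head = nn.Linear(hidden_dim, vocab_size)
            self.qa_head = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tokens):
            h, _ = self.lstm(self.embed(tokens))           # (batch, seq, hidden)
            alpha = torch.sigmoid(self.gate(h))            # (batch, seq, 1)
            p_chat = torch.softmax(self.chat_head(h), -1)  # chat expert distribution
            p_qa = torch.softmax(self.qa_head(h), -1)      # QA expert distribution
            return alpha * p_chat + (1 - alpha) * p_qa     # mixed next-word probs

    model = MixtureOfExpertsLM(vocab_size=1000)
    probs = model(torch.randint(0, 1000, (2, 7)))  # (2, 7, 1000); each row sums to 1

Because the gate's sigmoid output is recomputed at every position, the combined model can shift probability mass between conversational and knowledge-base behavior token by token, which is what lets the mixture be good at both subtasks at once.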

Related research

12/16/2019  Improving Knowledge-aware Dialogue Generation via Knowledge Base Question Answering
Neural network models usually suffer from the challenge of incorporating...

09/26/2019  Spoken Conversational Search for General Knowledge
We present a spoken conversational question answering proof of concept t...

06/03/2016  Question Answering over Knowledge Base with Neural Attention Combining Global Knowledge Information
With the rapid growth of knowledge bases (KBs) on the web, how to take f...

11/01/2022  Contextual Mixture of Experts: Integrating Knowledge into Predictive Modeling
This work proposes a new data-driven model devised to integrate process ...

01/24/2022  Artefact Retrieval: Overview of NLP Models with Knowledge Base Access
Many NLP models gain performance by having access to a knowledge base. A...

09/27/2021  Knowledge-Aware Neural Networks for Medical Forum Question Classification
Online medical forums have become a predominant platform for answering h...
