Using NLU in Context for Question Answering: Improving on Facebook's bAbI Tasks

09/13/2017
by John S. Ball, et al.

For the next step in human-to-machine interaction, Artificial Intelligence (AI) should interact predominantly using natural language because, if it worked reliably, it would be the fastest way to communicate. Facebook's toy tasks (bAbI) provide a useful benchmark for comparing implementations of conversational AI. While the published experiments so far have been based on exploiting the distributional hypothesis with machine learning, our model exploits natural language understanding (NLU) by decomposing language based on Role and Reference Grammar (RRG) and the brain-based Patom theory. Our combinatorial, linguistics-based system for conversational AI has many advantages: it passes the bAbI task tests without parsing or statistics while improving scalability. Our model validates both the training and test data to find 'garbage' input and output (GIGO). It is not rules-based, nor does it use parts of speech; instead, it relies on meaning. While Deep Learning is difficult to debug and fix, every step in our model can be understood and changed like any non-statistical computer program. Deep Learning's lack of explicable reasoning has fed opposition to AI, partly due to fear of the unknown. To support the goals of AI, we propose extended tasks that use human-level statements with tense, aspect, and voice, and embedded clauses with junctures, and that require answers to be produced by natural language generation (NLG) rather than keywords. While machine learning permits invalid training data to produce incorrect test responses, our system cannot do so because its context tracking would have to be intentionally broken. We believe no existing learning system can currently solve these extended natural language tests. There appears to be a knowledge gap between NLP researchers and linguists, but ongoing competitive results such as these promise to narrow it.
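To make the benchmark concrete, the sketch below shows the shape of a bAbI Task 1 (single supporting fact) story and a toy context tracker that answers "Where is X?" by keeping only the most recently stated location per entity. This is purely illustrative and is not the paper's RRG/Patom model; the sample story, the `track_context` and `answer` helpers, and the keyword patterns are all assumptions made for the example.

```python
# Toy illustration of bAbI Task 1 (single supporting fact) style QA.
# NOT the paper's RRG/Patom system: just a keyword-based context tracker
# that remembers the most recently stated location for each entity.

import re

STORY = [
    "Mary moved to the bathroom.",
    "John went to the hallway.",
    "Mary travelled to the office.",
]

MOVE_VERBS = r"(?:moved|went|travelled|traveled|journeyed)"


def track_context(statements):
    """Map each entity to its most recently stated location."""
    locations = {}
    for sentence in statements:
        match = re.match(rf"(\w+) {MOVE_VERBS} to the (\w+)\.", sentence)
        if match:
            entity, place = match.groups()
            locations[entity] = place  # later facts supersede earlier ones
    return locations


def answer(question, statements):
    """Answer a 'Where is X?' question from the tracked context."""
    match = re.match(r"Where is (\w+)\?", question)
    if not match:
        return "unsupported question"
    entity = match.group(1)
    return track_context(statements).get(entity, "unknown")


if __name__ == "__main__":
    print(answer("Where is Mary?", STORY))  # -> office
    print(answer("Where is John?", STORY))  # -> hallway
```

Surface matching of this kind would fail on the extended tasks proposed above: passive voice, shifts in tense and aspect, or an embedded clause would defeat the keyword patterns, which is exactly the gap such tests are meant to expose.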
