Probabilistic Reasoning via Deep Learning: Neural Association Models

03/24/2016
by Quan Liu, et al.

In this paper, we propose a new deep learning approach, called the neural association model (NAM), for probabilistic reasoning in artificial intelligence. We use neural networks to model the association between any two events in a domain: the network takes one event as input and computes a conditional probability of the other event to estimate how likely the two events are to be associated. The actual meaning of these conditional probabilities varies between applications and depends on how the models are trained. As two case studies, we investigate two NAM structures, namely deep neural networks (DNN) and relation-modulated neural nets (RMNN), on several probabilistic reasoning tasks in AI, including recognizing textual entailment, triple classification in multi-relational knowledge bases, and commonsense reasoning. Experimental results on several popular datasets derived from WordNet, FreeBase, and ConceptNet demonstrate that DNNs and RMNNs perform equally well and that both significantly outperform the conventional methods available for these reasoning tasks. Moreover, compared with DNNs, RMNNs are superior in knowledge transfer: a pre-trained model can be quickly extended to an unseen relation after observing only a few training samples. To further demonstrate the effectiveness of the proposed models, we apply NAMs to challenging Winograd Schema (WS) problems. Experiments on a set of WS problems show that the proposed models have the potential for commonsense reasoning.
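
The abstract describes a NAM as a network that takes one event (together with a relation) as input and outputs a conditional probability of a second event. The snippet below is a minimal PyTorch sketch of that idea for the DNN-style variant; the layer sizes, the dot-product scoring against the tail embedding, and all names are illustrative assumptions rather than the authors' published configuration.

```python
import torch
import torch.nn as nn

class NAMSketch(nn.Module):
    """Illustrative DNN-style neural association model.

    Scores a (head, relation, tail) triple by passing the head and relation
    embeddings through a feed-forward net and matching the result against the
    tail embedding with a sigmoid, giving an association probability.
    """

    def __init__(self, n_entities, n_relations, dim=100, hidden=200):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)   # event/entity embeddings
        self.rel = nn.Embedding(n_relations, dim)  # relation embeddings
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.ReLU(),
            nn.Linear(hidden, dim), nn.ReLU(),
        )

    def forward(self, head, relation, tail):
        # Condition on (head, relation); measure association with the tail event.
        x = torch.cat([self.ent(head), self.rel(relation)], dim=-1)
        z = self.mlp(x)
        score = (z * self.ent(tail)).sum(dim=-1)   # dot-product match
        return torch.sigmoid(score)                # P(tail | head, relation)

# Usage sketch: train with binary cross-entropy on observed triples
# versus randomly sampled negative triples (sizes here are arbitrary).
model = NAMSketch(n_entities=40_000, n_relations=18)
p = model(torch.tensor([1, 2]), torch.tensor([0, 3]), torch.tensor([5, 7]))
```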
