Relation learning in a neurocomputational architecture supports cross-domain transfer

10/11/2019
by Leonidas A. A. Doumas, et al.

People readily generalise prior knowledge to novel situations and stimuli. Advances in machine learning and artificial intelligence have begun to approximate and even surpass human performance in specific domains, but machine learning systems struggle to generalise information to untrained situations. We present a model that demonstrates human-like extrapolatory generalisation by learning and explicitly representing an open-ended set of relations characterising regularities within the domains it is exposed to. First, when trained to play one video game (e.g., Breakout), the model generalises to a new game (e.g., Pong) with different rules, dimensions, and characteristics in a single shot. Second, the model can learn representations from a different domain (e.g., 3D shape images) that support learning a video game and generalising to a new game in one shot. By exploiting well-established principles from cognitive psychology and neuroscience, the model learns structured representations without feedback, and without requiring knowledge of the relevant relations to be given a priori. We present additional simulations showing that the representations the model learns support cross-domain generalisation. The model's ability to generalise between different games demonstrates the flexible generalisation afforded by a capacity to learn not only statistical relations, but also other relations that are useful for characterising the domain to be learned. In turn, this kind of flexible, relational generalisation is only possible because the model is capable of representing relations explicitly, a capacity that is notably absent in extant statistical machine learning algorithms.
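The abstract's central claim is that explicit relational representations, rather than raw statistical features, are what make one-shot cross-domain transfer possible. The toy sketch below is purely illustrative and is not the authors' architecture: the object names, relation names, and policy are hypothetical, and the actual model learns such relations from unstructured input rather than having them hand-coded. It shows only why a policy stated over explicit relations can carry over to a game with different objects and dimensions.

```python
# Minimal, hypothetical sketch (not the authors' implementation): once a game
# state is recoded as explicit relations between objects, a policy stated over
# those relations can be mapped onto a new game whose surface features differ
# but whose relational structure is shared.
from dataclasses import dataclass


@dataclass
class Obj:
    name: str
    x: float  # horizontal position
    y: float  # vertical position


def extract_relations(ball: Obj, paddle: Obj) -> set:
    """Recode raw coordinates into explicit, symbolic relations."""
    rels = set()
    if ball.x < paddle.x:
        rels.add(("left_of", "ball", "paddle"))
    elif ball.x > paddle.x:
        rels.add(("right_of", "ball", "paddle"))
    else:
        rels.add(("aligned", "ball", "paddle"))
    return rels


def relational_policy(rels: set) -> str:
    """Move the paddle toward the ball, stated purely over relations."""
    if ("left_of", "ball", "paddle") in rels:
        return "move_left"
    if ("right_of", "ball", "paddle") in rels:
        return "move_right"
    return "stay"


# Breakout-like state: the paddle tracks the ball along the horizontal axis.
state = extract_relations(Obj("ball", 25.0, 30.0), Obj("paddle", 40.0, 5.0))
print(relational_policy(state))  # -> "move_left"

# In a Pong-like game the relevant axis is vertical; mapping left_of/right_of
# onto above/below lets the same relational policy apply in one shot.
```

In this framing, transfer reduces to finding a correspondence between the relations of the two games, which is the kind of explicit relational mapping the abstract argues is absent from purely statistical learners.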

