
Learning and analyzing vector encoding of symbolic representations

03/10/2018
by Roland Fernandez, et al.
Microsoft
Johns Hopkins University

We present a formal language with expressions denoting general symbol structures and queries which access information in those structures. A sequence-to-sequence network processing this language learns to encode symbol structures and query them. The learned representation (approximately) shares a simple linearity property with theoretical techniques for performing this task.
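The "simple linearity property" mentioned in the abstract can be made concrete with a Tensor Product Representation (TPR), one theoretical technique for encoding and querying symbol structures. Treating TPR as the reference technique here is an assumption on our part, and the sketch below is illustrative rather than the paper's actual model or code: a structure is encoded as a sum of outer products of filler vectors (symbols) with role vectors (positions), so retrieving the filler bound to a given role is a single linear operation.

```python
# Illustrative TPR sketch (an assumption, not the paper's code): encode a symbol
# structure as sum_i outer(filler_i, role_i) and answer a positional query with
# one linear (matrix-vector) operation.
import numpy as np

rng = np.random.default_rng(0)

d_f, d_r = 8, 8                                           # filler / role dimensions
fillers = {s: rng.standard_normal(d_f) for s in "ABC"}    # one vector per symbol
roles = np.linalg.qr(rng.standard_normal((d_r, d_r)))[0]  # orthonormal role vectors (rows)

def encode(structure):
    # TPR encoding: sum over positions of outer(filler, role)
    return sum(np.outer(fillers[s], roles[i]) for i, s in enumerate(structure))

def query(encoding, position):
    # "Which symbol fills this role?" reduces to a linear map (unbinding)
    return encoding @ roles[position]

enc = encode("BAC")
recovered = query(enc, 1)                           # filler bound to position 1
print(np.allclose(recovered, fillers["A"]))         # True
```

With orthonormal role vectors the recovery here is exact; the abstract's claim is that a sequence-to-sequence network trained end-to-end on the proposed language arrives at encodings that satisfy such a linearity property only approximately.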

Related Research

06/28/2019 · Extending de Bruijn sequences to larger alphabets
A circular de Bruijn sequence of order n in an alphabet of k symbols is ...

07/19/2017 · Language Transfer of Audio Word2Vec: Learning Audio Segment Representations without Target Language Data
Audio Word2Vec offers vector representations of fixed dimensionality for...

06/06/2022 · Symbolic Knowledge Structures and Intuitive Knowledge Structures
This paper proposes that two distinct types of structures are present in...

10/05/2019 · Natural- to formal-language generation using Tensor Product Representations
Generating formal-language represented by relational tuples, such as Lis...

06/01/1999 · The Symbol Grounding Problem
How can the semantic interpretation of a formal symbol system be made in...

07/03/2017 · Multiscale sequence modeling with a learned dictionary
We propose a generalization of neural network sequence models. Instead o...

09/03/2021 · Symbol Emergence and The Solutions to Any Task
The following defines intent, an arbitrary task and its solutions, and t...