
Attending to Mathematical Language with Transformers
Mathematical expressions were generated, evaluated and used to train neural network models based on the transformer architecture. The expressions and their targets were analyzed as a character-level sequence transduction task in which the encoder and decoder are built on attention mechanisms. Three models were trained to understand and evaluate symbolic variables and expressions in mathematics: (1) the self-attentive and feed-forward transformer without recurrence or convolution, (2) the universal transformer with recurrence, and (3) the adaptive universal transformer with recurrence and adaptive computation time. The models achieved test accuracies as high as 76.1% and 83.9%, and for the cases inferred incorrectly, the results were very close to the targets. The models notably learned to add, subtract and multiply both positive and negative decimal numbers of variable digits assigned to symbolic variables.
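The kind of training data the abstract describes can be sketched as follows. This is a hypothetical illustration, not the authors' actual pipeline: the function name, the serialization format, and the value ranges are assumptions. It generates arithmetic expressions over symbolic variables with decimal values of variable digit length, evaluates them, and serializes each example as a character-level (input, target) pair.

```python
import random

def make_example(rng: random.Random) -> tuple[str, str]:
    """Hypothetical generator of one (expression, target) training pair.

    Decimal numbers of variable digits are assigned to symbolic
    variables x and y; the task is to evaluate x <op> y.
    """
    # Assign positive or negative decimals with 0-2 fractional digits.
    x = round(rng.uniform(-99, 99), rng.randint(0, 2))
    y = round(rng.uniform(-99, 99), rng.randint(0, 2))
    op = rng.choice(["+", "-", "*"])  # add, subtract, multiply

    # The input string states the variable bindings and the expression.
    source = f"x={x},y={y};x{op}y"
    # The target is the evaluated result, rendered as a string.
    target = str(round(eval(f"x{op}y", {"x": x, "y": y}), 4))
    return source, target

rng = random.Random(0)
src, tgt = make_example(rng)
# A character-level model would consume list(src) and emit list(tgt).
print(src, "->", tgt)
```

Treating both sides as raw character sequences (rather than word tokens) is what makes this a character-level transduction task: the model must learn digit arithmetic and sign handling directly from the symbols.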