Hybrid Self-Attention NEAT: A novel evolutionary approach to improve the NEAT algorithm

12/07/2021
by Saman Khamesian, et al.

This article presents a "Hybrid Self-Attention NEAT" method that improves the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although NEAT has achieved significant results on a range of challenging tasks, it struggles to produce well-tuned networks when input representations are high-dimensional. Our study addresses this limitation by using self-attention as an indirect encoding method to select the most important parts of the input. In addition, we improve overall performance with a hybrid method for evolving the final network weights. The main conclusion is that Hybrid Self-Attention NEAT removes this restriction of the original NEAT. The results indicate that, compared with other evolutionary algorithms, our model achieves comparable scores on Atari games from raw-pixel input while using far fewer parameters.
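The abstract describes the core mechanism only at a high level: a self-attention module scores regions of the raw frame, and only the most important regions are passed on to the evolved network. As a rough illustration of that idea, here is a minimal NumPy sketch that scores image patches with single-head self-attention and keeps the top-K patch centres as a low-dimensional observation for a downstream controller. The frame size, patch size, K = 10, and all function and class names (extract_patches, SelfAttentionSelector) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def extract_patches(frame, patch_size=7, stride=4):
    """Slice a grayscale frame into flattened square patches plus their centres."""
    h, w = frame.shape
    patches, centers = [], []
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patches.append(frame[y:y + patch_size, x:x + patch_size].ravel())
            centers.append((y + patch_size // 2, x + patch_size // 2))
    return np.stack(patches), np.array(centers, dtype=float)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class SelfAttentionSelector:
    """Scores patches with single-head self-attention and keeps the top K.

    W_q and W_k would be the (evolvable) attention parameters; the K selected
    patch centres form the low-dimensional input handed to the evolved network.
    """
    def __init__(self, d_in, d_k=16, top_k=10, seed=0):
        rng = np.random.default_rng(seed)
        self.W_q = rng.normal(0.0, 0.1, (d_in, d_k))
        self.W_k = rng.normal(0.0, 0.1, (d_in, d_k))
        self.top_k = top_k
        self.d_k = d_k

    def __call__(self, patches, centers):
        q = patches @ self.W_q                        # (N, d_k) queries
        k = patches @ self.W_k                        # (N, d_k) keys
        attn = softmax(q @ k.T / np.sqrt(self.d_k))   # (N, N) attention matrix
        importance = attn.sum(axis=0)                 # total attention each patch receives
        top = np.argsort(importance)[-self.top_k:]    # indices of the K most important patches
        return centers[top].ravel()                   # 2*K values for the controller

# Usage on a random 84x84 "frame" (a stand-in for a preprocessed Atari screen):
frame = np.random.rand(84, 84)
patches, centers = extract_patches(frame)
selector = SelfAttentionSelector(d_in=patches.shape[1])
obs = selector(patches, centers)
print(obs.shape)  # (20,) -- the compact observation for the evolved policy
```

Note that this sketch only shows the forward selection step; in the setting the abstract describes, the attention parameters and the controller weights would both be optimized by the evolutionary procedure rather than by gradient descent.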


Related research

06/03/2022 | EAANet: Efficient Attention Augmented Convolutional Networks
Humans can effectively find salient regions in complex scenes. Self-atte...

09/05/2019 | Accelerating Transformer Decoding via a Hybrid of Self-attention and Recurrent Neural Network
Due to the highly parallelizable architecture, Transformer is faster to ...

10/02/2020 | Group Equivariant Stand-Alone Self-Attention For Vision
We provide a general self-attention formulation to impose group equivari...

10/23/2019 | A Transformer with Interleaved Self-attention and Convolution for Hybrid Acoustic Models
Transformer with self-attention has achieved great success in the area o...

03/26/2018 | Self-Attentional Acoustic Models
Self-attention is a method of encoding sequences of vectors by relating ...

11/11/2019 | A Hybrid Text Normalization System Using Multi-Head Self-Attention for Mandarin
In this paper, we propose a hybrid text normalization system using multi...

03/18/2020 | Neuroevolution of Self-Interpretable Agents
Inattentional blindness is the psychological phenomenon that causes one ...
