Attention-based Neural Cellular Automata

11/02/2022
by Mattie Tesfaldet, et al.

Recent extensions of Cellular Automata (CA) have incorporated key ideas from modern deep learning, dramatically extending their capabilities and catalyzing a new family of Neural Cellular Automata (NCA) techniques. Inspired by Transformer-based architectures, our work presents a new class of attention-based NCAs formed using a spatially localized, yet globally organized, self-attention scheme. We introduce an instance of this class named Vision Transformer Cellular Automata (ViTCA). We present quantitative and qualitative results on denoising autoencoding across six benchmark datasets, comparing ViTCA to a U-Net, a U-Net-based CA baseline (UNetCA), and a Vision Transformer (ViT). When comparing across architectures configured to similar parameter complexity, ViTCA architectures yield superior performance across all benchmarks and for nearly every evaluation metric. We present an ablation study on various architectural configurations of ViTCA, an analysis of its effect on cell states, and an investigation on its inductive biases. Finally, we examine its learned representations via linear probes on its converged cell state hidden representations, yielding, on average, superior results when compared to our U-Net, ViT, and UNetCA baselines.
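To make the "spatially localized, yet globally organized" idea concrete, here is a minimal NumPy sketch of one cell-update step in that spirit: each cell computes attention over its 3x3 neighborhood (with wrap-around padding) and updates its state residually. All names, sizes, and weight initializations below are illustrative assumptions for exposition, not the paper's actual ViTCA architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W, C = 8, 8, 16  # grid size and cell-state channels (illustrative)

# Random projections standing in for learned query/key/value weights.
Wq = rng.normal(0, 0.1, (C, C))
Wk = rng.normal(0, 0.1, (C, C))
Wv = rng.normal(0, 0.1, (C, C))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def localized_attention_step(cells):
    """One localized self-attention update: each cell attends to its
    3x3 neighborhood (circular padding) and is updated residually."""
    q = cells @ Wq
    k = cells @ Wk
    v = cells @ Wv
    new = np.empty_like(cells)
    for i in range(H):
        for j in range(W):
            # Gather the 3x3 neighborhood's keys/values with wrap-around.
            ni = [(i + di) % H for di in (-1, 0, 1)]
            nj = [(j + dj) % W for dj in (-1, 0, 1)]
            keys = np.stack([k[a, b] for a in ni for b in nj])  # (9, C)
            vals = np.stack([v[a, b] for a in ni for b in nj])  # (9, C)
            attn = softmax(keys @ q[i, j] / np.sqrt(C))         # (9,)
            new[i, j] = cells[i, j] + attn @ vals               # residual update
    return new

cells = rng.normal(0, 1, (H, W, C))
out = localized_attention_step(cells)
print(out.shape)  # (8, 8, 16)
```

Although each update only reads a 3x3 neighborhood, iterating the step lets information propagate across the whole grid, which is how such schemes can be locally computed yet globally organized.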


