Gauge-equivariant neural networks as preconditioners in lattice QCD

02/10/2023
by Christoph Lehner, et al.

We demonstrate that a state-of-the-art multi-grid preconditioner can be learned efficiently by gauge-equivariant neural networks. We show that the models require minimal re-training on different gauge configurations of the same gauge ensemble and, to a large extent, remain efficient under modest modifications of ensemble parameters. We also demonstrate that important paradigms such as communication avoidance are straightforward to implement in this framework.
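The central ingredient here is gauge equivariance: a layer that acts on lattice fields must commute with local gauge transformations, which it achieves by parallel-transporting neighbouring field values with the gauge links before combining them. The following is a minimal one-dimensional toy sketch of that property, not the paper's actual multi-grid model; the lattice size, weights, and helper names are illustrative, and random unitaries stand in for SU(3) links.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n=3):
    # random unitary matrix via QR decomposition (stand-in for an SU(3) link)
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(a)
    d = np.diag(r)
    return q * (d / np.abs(d)).conj()

L, N = 4, 3  # illustrative: 4-site 1-D lattice, 3 colour components

# gauge links U[x] transport a colour vector from site x+1 to site x
U = np.stack([random_unitary(N) for _ in range(L)])
psi = rng.normal(size=(L, N)) + 1j * rng.normal(size=(L, N))

def hopping_layer(U, psi, w0=0.5, w1=0.5):
    # gauge-equivariant layer: out(x) = w0*psi(x) + w1*U(x) psi(x+1),
    # i.e. the neighbour value is parallel-transported before mixing
    hop = np.einsum('xab,xb->xa', U, np.roll(psi, -1, axis=0))
    return w0 * psi + w1 * hop

# local gauge transformation Omega(x):
#   psi(x) -> Omega(x) psi(x),  U(x) -> Omega(x) U(x) Omega(x+1)^dagger
Omega = np.stack([random_unitary(N) for _ in range(L)])
psi_g = np.einsum('xab,xb->xa', Omega, psi)
U_g = np.einsum('xab,xbc,xdc->xad', Omega, U, np.roll(Omega, -1, axis=0).conj())

out = hopping_layer(U, psi)
out_g = hopping_layer(U_g, psi_g)

# equivariance: applying the layer to transformed inputs equals
# transforming the layer's output
assert np.allclose(out_g, np.einsum('xab,xb->xa', Omega, out))
```

Because the layer commutes with gauge transformations by construction, its learned weights transfer between gauge configurations of an ensemble, which is what makes the minimal re-training reported in the abstract plausible.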
