
Kohn-Sham equations as regularizer: building prior knowledge into machine-learned physics

09/17/2020
by   Li Li, et al.

Including prior knowledge is important for effective machine learning models in physics and is usually achieved by explicitly adding loss terms or constraints on model architectures. Prior knowledge embedded in the physics computation itself rarely draws attention. We show that solving the Kohn-Sham equations when training neural networks for the exchange-correlation functional provides an implicit regularization that greatly improves generalization. Two separations suffice for learning the entire one-dimensional H_2 dissociation curve within chemical accuracy, including the strongly correlated region. Our models also generalize to unseen types of molecules and overcome self-interaction error.
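The core idea can be sketched numerically: instead of fitting a functional to data directly, the Kohn-Sham equations are solved to self-consistency inside the training loop, and the loss is evaluated on the converged density, so gradients must pass through the physics computation. Below is a minimal toy sketch of that inner loop in one dimension. The grid, the harmonic external potential, and the two-parameter power-law XC potential (a stand-in for the paper's neural network) are all illustrative assumptions, not the authors' actual model.

```python
import numpy as np

def kohn_sham_density(theta, n_grid=64, n_elec=2, max_iter=200, tol=1e-8):
    """Solve a toy 1D Kohn-Sham problem to self-consistency.

    theta parametrizes the XC potential v_xc(n) = theta[0] * n**theta[1],
    standing in for a learned (neural-network) functional.
    Returns the grid, grid spacing, and converged density.
    """
    x = np.linspace(-5.0, 5.0, n_grid)
    dx = x[1] - x[0]
    v_ext = 0.5 * x**2  # illustrative harmonic external potential

    # Kinetic-energy operator via second-order finite differences.
    lap = (np.diag(-2.0 * np.ones(n_grid))
           + np.diag(np.ones(n_grid - 1), 1)
           + np.diag(np.ones(n_grid - 1), -1)) / dx**2
    t_op = -0.5 * lap

    # Uniform starting density normalized to n_elec electrons.
    n = np.full(n_grid, n_elec / (n_grid * dx))

    for _ in range(max_iter):
        # "Learned" XC potential evaluated on the current density.
        v_xc = theta[0] * np.maximum(n, 1e-12) ** theta[1]
        h = t_op + np.diag(v_ext + v_xc)
        _, psi = np.linalg.eigh(h)

        # Occupy the lowest orbitals, two electrons per orbital.
        n_new = np.zeros(n_grid)
        for i in range(n_elec // 2):
            phi = psi[:, i] / np.sqrt(dx)  # normalize on the grid
            n_new += 2.0 * phi**2

        if np.max(np.abs(n_new - n)) < tol:
            n = n_new
            break
        n = 0.5 * n + 0.5 * n_new  # linear mixing for SCF stability

    return x, dx, n

def density_loss(theta, n_ref):
    """Loss on the *converged* density: the SCF solve sits inside the
    objective, which is the implicit regularization the paper exploits."""
    _, dx, n = kohn_sham_density(theta)
    return float(np.sum((n - n_ref) ** 2) * dx)
```

In the paper this inner solve is differentiated exactly (backpropagating through the Kohn-Sham iterations); in this sketch one would have to use finite differences on `theta`. The point of the structure is the same: the model is only ever evaluated through physically self-consistent densities, which constrains what the functional can learn.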

READ FULL TEXT
