
Physics-based Neural Networks for Shape from Polarization
How should prior knowledge from physics inform a neural network solution...

Strongly-Typed Recurrent Neural Networks
Recurrent neural networks are increasingly popular models for sequential l...

Self-explaining variational posterior distributions for Gaussian Process models
Bayesian methods have become a popular way to incorporate prior knowledg...

Learning with Known Operators reduces Maximum Training Error Bounds
We describe an approach for incorporating prior knowledge into machine l...

Statistical physics of unsupervised learning with prior knowledge in neural networks
Integrating sensory inputs with prior beliefs from past experiences in u...

Learning the exchange-correlation functional from nature with fully differentiable density functional theory
Improving the predictive capability of molecular properties in ab initio...

An Iterative Scientific Machine Learning Approach for Discovery of Theories Underlying Physical Phenomena
From a purely mathematical point of view, common functional forms represen...
Kohn-Sham equations as regularizer: building prior knowledge into machine-learned physics
Including prior knowledge is important for effective machine learning models in physics, and is usually achieved by explicitly adding loss terms or constraints on model architectures. Prior knowledge embedded in the physics computation itself rarely draws attention. We show that solving the Kohn-Sham equations when training neural networks for the exchange-correlation functional provides an implicit regularization that greatly improves generalization. Two separations suffice for learning the entire one-dimensional H_2 dissociation curve within chemical accuracy, including the strongly correlated region. Our models also generalize to unseen types of molecules and overcome self-interaction error.
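To make the abstract's central idea concrete, here is a minimal toy sketch (not the authors' code or model): a two-parameter stand-in for the exchange-correlation functional is trained *through* a self-consistent solver, so the loss is evaluated on the converged density rather than on the functional's output directly, and every gradient passes through the physics computation. The fixed-point map, the potentials, and all names below are illustrative assumptions, not the paper's actual Kohn-Sham implementation.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 64)
v_ext = x**2                          # toy external potential on a 1-D grid

def v_xc(n, theta):
    # toy learned "XC potential": a two-parameter model of the density
    return theta[0] * n + theta[1] * n**2

def solve(theta, iters=200, mix=0.3):
    # damped fixed-point iteration playing the role of the Kohn-Sham
    # self-consistency loop; returns the converged toy density
    n = np.full_like(x, 1.0 / x.size)
    for _ in range(iters):
        w = np.exp(-(v_ext + v_xc(n, theta)))
        n = (1.0 - mix) * n + mix * w / w.sum()
    return n

theta_true = np.array([0.8, -0.4])    # "true" functional generating the data
n_target = solve(theta_true)

def loss(theta):
    # loss compares *converged* densities, so gradients flow through
    # the whole solver -- the implicit regularization described above
    return float(np.mean((solve(theta) - n_target) ** 2))

# normalized finite-difference gradient descent on the two parameters
theta, eps, step = np.array([0.0, 0.0]), 1e-5, 0.02
for _ in range(300):
    g = np.array([(loss(theta + eps * e) - loss(theta - eps * e)) / (2 * eps)
                  for e in np.eye(2)])
    theta = theta - step * g / (np.linalg.norm(g) + 1e-30)

print("initial loss:", loss(np.zeros(2)))
print("trained loss:", loss(theta))
```

The design point the sketch illustrates is the one the abstract makes: the regularization is not an extra loss term or an architectural constraint, but a consequence of embedding the solver in the training loop itself.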