I Introduction
Machine learning (ML) methods reach ever deeper into quantum chemistry and materials simulation, delivering predictive models of interatomic potential energy surfaces Behler and Parrinello (2007); Braams and Bowman (2009); Bartók et al. (2010); Smith et al. (2017); Podryabinkin and Shapeev (2017); Podryabinkin et al. (2019), molecular forces Chmiela et al. (2017, 2018), electron densities Ryczko et al. (2018), density functionals Brockherde et al. (2017), and molecular response properties such as polarisabilities Wilkins et al. (2019) and infrared spectra Gastegger et al. (2017). Large data sets of molecular properties calculated from quantum chemistry or measured in experiment are likewise being used to construct predictive models to explore the vast chemical compound space Rupp et al. (2012); Eickenberg et al. (2017); von Lilienfeld (2018); Gilmer et al. (2017); Jha et al. (2018), to find new sustainable catalyst materials Kitchin (2018), and to design new synthetic pathways Maryasin et al. (2018). Several works have also explored the potential role of machine learning in constructing approximate quantum chemical methods Li et al. (2018); Hegde and Bowen (2017).
Most existing ML models have in common that they learn from quantum chemistry to describe molecular properties as scalar, vector, or tensor fields
Grisafi et al. (2018); Thomas et al. (2018). Fig. 1a shows schematically how quantum chemistry data of different electronic properties, such as energies or dipole moments, are used to construct individual ML models for the respective properties. This allows for the efficient exploration of chemical space with respect to these properties. Yet, these ML models do not explicitly capture the electronic degrees of freedom in molecules that lie at the heart of quantum chemistry. All chemical concepts and physical molecular properties are determined by the electronic Schrödinger equation and derive from the ground-state wavefunction. Thus, an electronic structure ML model that directly predicts the ground-state wavefunction (see Fig. 1b) would not only give access to all ground-state properties, but could open avenues towards new approximate quantum chemistry methods based on an interface between ML and quantum chemistry.

In this work, we develop a deep learning framework that provides an accurate ML model of molecular electronic structure via a direct representation of the electronic Hamiltonian in a local basis representation. The model provides a seamless interface between quantum mechanics and ML by predicting the eigenvalue spectrum and molecular orbitals (MOs) of the Hamiltonian for organic molecules close to 'chemical accuracy' (0.04 eV). This is achieved by training a flexible ML model to capture the chemical environment of atoms in molecules and of pairs of atoms. Thereby, it provides access to electronic properties that are important for the chemical interpretation of reactions, such as charge populations, bond orders, and dipole and quadrupole moments, without the need for specialised ML models for each property. We demonstrate how this model retains the conceptual strength of quantum chemistry at force-field efficiency by performing an ML-driven molecular dynamics simulation of malondialdehyde, showing the evolution of the electronic structure during a proton transfer. As we obtain a symmetry-adapted and analytically differentiable representation of the electronic structure, we are able to optimise electronic properties, such as the HOMO-LUMO gap, in a step towards inverse design of molecular structures. Beyond that, we show that the electronic structure predicted by our approach may serve as input to further quantum chemical calculations. For example, wavefunction restarts based on this ML model provide a significant speed-up of the self-consistent field (SCF) procedure due to a reduced number of iterations, without loss of accuracy. The latter showcases that quantum chemistry and machine learning can be used in tandem for future electronic structure methods.

II Results
II.1 Atomic Representation of Molecular Electronic Structure
In quantum chemistry, the wavefunction associated with the electronic Hamiltonian is typically expressed by anti-symmetrised products of single-electron functions or molecular orbitals. These are represented in a local atomic orbital basis of spherical atomic functions with varying angular momentum. As a consequence, one can write the electronic Schrödinger equation in matrix form
$\mathbf{H}\,\mathbf{c}_m = \epsilon_m\,\mathbf{S}\,\mathbf{c}_m$ (1)
where the Hamiltonian matrix may correspond to the Fock or Kohn–Sham matrix, depending on the chosen level of theory Cramer (2004). In both cases, the Hamiltonian and overlap matrices are defined as:
$H_{\mu\nu} = \langle \phi_\mu | \hat{H} | \phi_\nu \rangle$ (2)
and
$S_{\mu\nu} = \langle \phi_\mu | \phi_\nu \rangle$ (3)
The eigenvalues $\epsilon_m$ and electronic wavefunction coefficients $\mathbf{c}_m$ contain the same information as $\mathbf{H}$ and $\mathbf{S}$, where the electronic eigenvalues are naturally invariant to rigid molecular rotations, translations, or permutations of equivalent atoms. Unfortunately, as functions of atomic coordinates and changing molecular configurations, eigenvalues and wavefunction coefficients are not well-behaved or smooth. State degeneracies and electronic level crossings pose a challenge to the direct prediction of eigenvalues and wavefunctions with ML techniques. We address this problem with a deep learning architecture that directly describes the Hamiltonian matrix in a local atomic orbital representation.
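Once $\mathbf{H}$ and $\mathbf{S}$ are available, for example from an ML prediction, the generalised eigenvalue problem of Eq. (1) can be solved by Löwdin (symmetric) orthogonalisation. A minimal numpy sketch on toy matrices (the routine name and the random test data are ours, purely for illustration):

```python
import numpy as np

def solve_roothaan(H, S):
    """Solve H c_m = eps_m S c_m via Loewdin (symmetric) orthogonalisation."""
    # S^(-1/2) from the eigendecomposition of the overlap matrix
    s, U = np.linalg.eigh(S)
    S_inv_sqrt = U @ np.diag(s ** -0.5) @ U.T
    # Transform to an orthonormal basis, diagonalise, and back-transform
    H_ortho = S_inv_sqrt @ H @ S_inv_sqrt
    eps, C_ortho = np.linalg.eigh(H_ortho)
    C = S_inv_sqrt @ C_ortho
    return eps, C

# Toy 3-orbital example: random symmetric H, well-conditioned overlap S
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
H = (A + A.T) / 2
B = rng.normal(size=(3, 3))
S = B @ B.T + 3 * np.eye(3)   # symmetric positive definite
eps, C = solve_roothaan(H, S)
residual = np.max(np.abs(H @ C - S @ C @ np.diag(eps)))
```

The returned coefficients are S-orthonormal, i.e. $\mathbf{C}^\top \mathbf{S} \mathbf{C} = \mathbf{1}$, which is the property that makes the orbital energies well-defined despite the non-orthogonal atomic basis.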
II.2 SchNOrb deep learning framework
SchNOrb (SchNet for Orbitals) is a framework that captures the electronic structure in a local representation of atomic orbitals that is common in quantum chemistry. Fig. 2a gives an overview of the proposed architecture. SchNOrb extends the deep tensor neural network SchNet Schütt et al. (2017) to represent electronic wavefunctions. The core idea is to construct symmetry-adapted pairwise features to represent the block of the Hamiltonian matrix corresponding to a pair of atoms $i$ and $j$. They are written as a product of rotationally invariant and covariant components, which ensures that – given a sufficiently large feature space – all rotational symmetries up to angular momentum $\lambda$ can be represented:
(4)
(5)
Here, $\mathbf{r}_{ij}$ is the vector pointing from atom $i$ to atom $j$, the $p_{ij}$ are rotationally invariant coefficients, and the $\mathbf{d}$ are learnable parameters projecting the features along randomly chosen directions. This makes it possible to rotate the different factors relative to each other and further increases the flexibility of the model for $\lambda \geq 1$. In the case of $\lambda = 0$, the coefficients are independent of the directions due to rotational invariance.
We obtain the coefficients from an atomistic neural network, as shown in Fig. 2a. Starting from atom type embeddings, rotationally invariant representations of atomistic environments are computed by applying consecutive interaction refinements. These are by construction invariant with respect to rotation, translation, and permutation of atoms. This part of the architecture is equivalent to the SchNet model for atomistic predictions (see Refs. Schütt et al. (2017, 2018a)). In addition, we construct representations of atom pairs that enable the prediction of the coefficients. This is achieved by SchNOrb interaction blocks, which compute the coefficients with a given angular momentum with respect to the atomic environment of the respective atom pair $(i, j)$. This corresponds to adapting the atomic orbital interaction based on the presence and position of atomic orbitals in the vicinity of the atom pair. As shown in Fig. 2b, the coefficient matrix depends on pair interactions of atoms $i$ and $j$ as well as environment interactions of the atom pair with neighbouring atoms $k$. These are crucial to enable the model to capture the orientation of the atom pair within the molecule, for which pair-wise interactions of atomic environments alone are not sufficient.
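The invariant/covariant factorisation described above can be illustrated with a schematic numpy sketch: an invariant coefficient vector is turned into a $\lambda = 1$ (vector-like) feature by multiplying with the pair direction, so that rotating the molecule rotates the feature accordingly. This illustrates the general idea only; the actual SchNOrb parametrisation with learned projection directions is more flexible:

```python
import numpy as np

def pair_features(p_inv, r_ij):
    """Schematic symmetry-adapted pair features: an invariant channel
    (lambda = 0) and a covariant channel (lambda = 1) built from the
    pair direction. Illustration only, not the exact SchNOrb form."""
    r_hat = r_ij / np.linalg.norm(r_ij)
    omega0 = p_inv                      # rotationally invariant (s-like)
    omega1 = np.outer(p_inv, r_hat)     # rotates like a vector (p-like)
    return omega0, omega1

rng = np.random.default_rng(9)
p = rng.normal(size=4)                  # invariant coefficients from a network
r = np.array([1.0, 2.0, 2.0])
w0, w1 = pair_features(p, r)

# Covariance check: rotating the molecule rotates the lambda = 1 features
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix
w0_rot, w1_rot = pair_features(p, Q @ r)
```

Under the rotation Q, the $\lambda = 0$ channel is unchanged while the $\lambda = 1$ channel transforms as a vector in its last index, which is exactly the behaviour the pairwise features need in order to represent Hamiltonian blocks between orbitals of different angular momenta.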
The Hamiltonian matrix is obtained by treating on-site and off-site blocks separately. Given a basis of atomic orbitals up to angular momentum $l$, we require pair-wise environments with angular momenta up to $\lambda = 2l$ to describe all Hamiltonian blocks, since the product of two orbitals with angular momenta $l_1$ and $l_2$ contains contributions up to $l_1 + l_2$.
(6)
The predicted Hamiltonian is obtained through symmetrisation, $\mathbf{H} = \frac{1}{2}(\tilde{\mathbf{H}} + \tilde{\mathbf{H}}^\top)$. The on-site and off-site blocks are modeled by neural networks that are described in detail in the methods section. The overlap matrix can be obtained in the same manner. In addition to the Hamiltonian and overlap matrices, we predict the total energy separately as a sum over atom-wise energy contributions, in analogy to the conventional SchNet treatment Schütt et al. (2018a).
II.3 Learning electronic structure and derived properties
The proposed SchNOrb architecture allows us to perform predictions of total energies, Hamiltonian and overlap matrices in end-to-end fashion using a combined regression loss. We train neural networks for several data sets of the molecules water, ethanol, malondialdehyde, and uracil. The reference calculations were performed with Hartree-Fock (HF) and density functional theory (DFT) with the PBE exchange-correlation functional Perdew et al. (1996a). The employed Gaussian atomic orbital bases include angular momenta up to $l = 2$ (d orbitals). Detailed model and training settings for each data set are listed in Supplementary Table S1.
As Supplementary Table S2 shows, the total energies could be predicted with a mean absolute error below 2 meV for all molecules. The predictions show mean absolute errors below 8 meV for the Hamiltonian matrices, and the overlap matrices are reproduced with correspondingly small errors. We examine how these errors propagate to orbital energies and coefficients. Fig. 2e shows mean absolute errors for the energies of the lowest 20 molecular orbitals for ethanol reference calculations using DFT as well as HF. The errors for the DFT reference data are consistently lower. Beyond that, the occupied orbitals (1-13) are predicted with higher accuracy (20 meV) than the virtual orbitals (100 meV). We conjecture that the larger error for virtual orbitals arises from the fact that these are not strictly defined by the underlying data from the HF and Kohn-Sham DFT calculations. Virtual orbitals are only defined up to an arbitrary unitary transformation. Their physical interpretation is limited and, in HF and DFT theory, they do not enter the description of ground-state properties. For the remaining data sets, the average errors of the occupied orbitals are about 10 meV for water and malondialdehyde, and 48 meV for uracil. This is shown in detail in Supplementary Fig. S1. The orbital coefficients are predicted with cosine similarities above 90% (see Supplementary Fig. S2). Fig. 2f depicts the predicted and reference orbital energies for the frontier MOs of ethanol (solid and dotted lines, respectively), as well as the orbital shapes derived from the coefficients. Both occupied and unoccupied energy levels are reproduced with high accuracy, including the highest occupied (HOMO) and lowest unoccupied (LUMO) orbitals. This trend is also reflected in the overall shape of the orbitals. Even the slightly larger deviations in the orbital energies observed for the third and fourth unoccupied orbitals only result in minor deformations. The learned covariance of molecular orbitals for rotations of a water molecule is shown in Supplementary Fig. S3.

As SchNOrb learns the electronic structure of molecular systems, all chemical properties that are defined as quantum mechanical operators on the wavefunctions can be computed from the ML prediction without the need to train a separate model. We investigate this feature by directly calculating electronic dipole and quadrupole moments from the orbital coefficients predicted by SchNOrb. The corresponding mean absolute errors are reported in Supplementary Table S4. Excellent agreement with the electronic structure reference is observed for the majority of molecules (0.054 D for dipoles and 0.058 D Å for quadrupoles). The only deviation from this trend is observed for uracil, which exhibits a more complicated electronic structure due to its delocalised π-system (see above). In this case, a similar accuracy as for the other molecules could in principle be reached upon the addition of more reference data points. The above results demonstrate the utility of combining a learned Hamiltonian with quantum operators. This makes it possible to access a wide range of chemical properties without the need for explicitly developing specialised neural network architectures.
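Computing such expectation values from predicted orbitals requires only the one-particle density matrix and the operator's matrix elements in the atomic orbital basis. A toy numpy sketch for one Cartesian dipole component for a closed-shell system; the AO integral matrix here is a random symmetric placeholder, purely for illustration:

```python
import numpy as np

def electronic_dipole_component(C_occ, D_ao):
    """Electronic contribution to one dipole component from occupied MO
    coefficients and an AO dipole-integral matrix (closed shell)."""
    P = 2.0 * C_occ @ C_occ.T        # density matrix, double occupancy
    return -np.trace(P @ D_ao)       # expectation value Tr(P D), electron charge -1

# Hypothetical 4-orbital system with 2 doubly occupied MOs
rng = np.random.default_rng(1)
C = np.linalg.qr(rng.normal(size=(4, 4)))[0]   # orthonormal MO coefficients
C_occ = C[:, :2]
D_ao = rng.normal(size=(4, 4)); D_ao = (D_ao + D_ao.T) / 2
mu_x = electronic_dipole_component(C_occ, D_ao)

# Equivalent orbital-wise sum: -2 * sum over occupied m of <m|x|m>
mu_x_orbitals = -2.0 * sum(C_occ[:, m] @ D_ao @ C_occ[:, m] for m in range(2))
```

The trace form and the orbital-wise sum are mathematically identical (cyclic invariance of the trace); the nuclear contribution would be added separately from the atomic charges and positions.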
II.4 Chemical insights from electronic deep learning
Recently, much research has focused on explaining the predictions of ML models Bach et al. (2015); Montavon et al. (2018); Kindermans et al. (2018), aiming both at the validation of the model Kim et al. (2017); Lapuschkin et al. (2019) and at the extraction of scientific insight De et al. (2016); Schütt et al. (2017); Jha et al. (2018). However, these methods explain ML predictions either in terms of the input space (atom types and positions in this case) or in terms of latent features such as local chemical potentials Schütt et al. (2017, 2018b). In quantum chemistry, however, it is more common to analyse electronic properties in terms of the MOs and of properties derived from the electronic wavefunction, which are direct output quantities of the SchNOrb architecture.
Molecular orbitals encode the distribution of electrons in a molecule, thus offering direct insights into its underlying electronic structure. They form the basis for a wealth of chemical bonding analysis schemes, bridging the gap between quantum mechanics and abstract chemical concepts such as bond orders and atomic partial charges Cramer (2004). These quantities are invaluable tools for understanding and interpreting chemical processes based on molecular reactivity and chemical bonding strength. As SchNOrb yields the MOs, we are able to apply population analysis to our ML predictions. Fig. 2d shows Loewdin partial atomic charges and bond orders for the uracil molecule. Loewdin charges provide a chemically intuitive measure of the electron distribution and can, e.g., aid in identifying potential nucleophilic or electrophilic reaction sites in a molecule. The negatively charged carbonyl oxygens in uracil, for example, are involved in forming RNA base pairs. The corresponding bond orders provide information on the connectivity and types of bonds between atoms. In the case of uracil, the two double bonds of the carbonyl groups are easily recognisable (bond orders 2.12 and 2.14, respectively). However, it is also possible to identify electron delocalisation effects in the pyrimidine ring, where the carbon double bond donates electron density to its neighbours. A population analysis for malondialdehyde, as well as population prediction errors for all molecules, can be found in Supplementary Fig. S4 and Table S3.
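Löwdin population analysis itself is a few lines of linear algebra on the density and overlap matrices: the density is transformed to the symmetrically orthogonalised basis, atomic charges are read off the diagonal, and bond orders from the off-diagonal blocks. A self-contained sketch, checked on a minimal H2-like model with one s function per atom (the function name and toy inputs are ours):

```python
import numpy as np

def loewdin_analysis(P, S, ao_atom, Z):
    """Loewdin charges and bond orders from density matrix P, overlap S,
    an AO-to-atom map ao_atom, and nuclear charges Z."""
    s, U = np.linalg.eigh(S)
    S_half = U @ np.diag(np.sqrt(s)) @ U.T
    PL = S_half @ P @ S_half                 # Loewdin-orthogonalised density
    natoms = len(Z)
    charges = np.array(Z, dtype=float)
    for mu in range(len(ao_atom)):
        charges[ao_atom[mu]] -= PL[mu, mu]   # q_A = Z_A - sum of diagonal terms
    bond = np.zeros((natoms, natoms))
    for mu in range(len(ao_atom)):
        for nu in range(len(ao_atom)):
            if ao_atom[mu] != ao_atom[nu]:
                bond[ao_atom[mu], ao_atom[nu]] += PL[mu, nu] * PL[nu, mu]
    return charges, bond

# Minimal H2-like test: one s orbital per atom, one doubly occupied bonding MO
S = np.array([[1.0, 0.4], [0.4, 1.0]])
c = np.array([1.0, 1.0]); c = c / np.sqrt(c @ S @ c)   # normalised bonding MO
P = 2.0 * np.outer(c, c)
charges, bond = loewdin_analysis(P, S, ao_atom=[0, 1], Z=[1, 1])
```

For this symmetric two-electron model, both atoms come out neutral and the bond order is exactly 1, as expected for a single covalent bond.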
The SchNOrb architecture enables an accurate prediction of the electronic structure across molecular configuration space, which provides for rich chemical interpretation during molecular reaction dynamics. Fig. 3a shows an excerpt of a molecular dynamics simulation of malondialdehyde that was driven by atomic forces predicted using SchNOrb. It depicts the proton transfer together with the relevant MOs and the electronic density. The density paints an intuitive picture of the reaction as it migrates along with the hydrogen. This exchange of electron density during proton transfer is also reflected in the orbitals. Their dynamical rearrangement indicates an alternation between single and double bonds. The latter effect is hard to recognise based on the density alone and demonstrates the wealth of information encoded in the molecular wavefunctions.
Fig. 3b depicts the forces the different MOs exert on the hydrogen atom exchanged during the proton transfer. All forces are projected onto the reaction coordinate, where positive values correspond to a force driving the proton towards the product state. In the initial configuration I, most forces attract the hydrogen atom to the right oxygen. In the intermediate configuration II, orbital rearrangement results in a situation where the forces most orbitals exert on the hydrogen atom become minimal, representing mostly non-bonding character between the oxygens and the hydrogen. One exception is MO 13, depicted in the inset of Fig. 3b. Due to a minor deviation from a symmetric O-H-O arrangement, this orbital represents a one-sided O-H bond, exerting forces that promote the reaction. The intrinsic fluctuations during the proton-transfer molecular dynamics are captured by the MOs, as can be seen in Fig. 3c, which shows the distribution of orbital energies encountered during the reaction. As would be expected, both HOMO-2 and HOMO-3 (inset, orange and blue, respectively), which strongly participate in the proton transfer, show significantly broadened peaks due to strong energy variations in the dynamics. This example illustrates the chemically intuitive interpretation that can be obtained from the electronic structure predicted by SchNOrb.
II.5 Deep learning-enhanced quantum chemistry
An essential paradigm of chemistry is that the molecular structure defines chemical properties. Inverse chemical design turns this paradigm on its head by enabling property-driven chemical structure exploration. The SchNOrb framework constitutes a suitable tool to enable inverse chemical design due to its analytic representation of the electronic structure in terms of the atomic positions. We can therefore obtain analytic derivatives with respect to the atomic positions, which provide the ability to optimise electronic properties. Fig. S5a shows the minimisation and maximisation of the HOMO-LUMO gap of malondialdehyde as an example. We perform gradient descent and ascent from a randomly selected configuration until convergence to the minimum-gap and maximum-gap structures, respectively, and are thereby able to identify structures that substantially reduce and enlarge the gap relative to its initial value. While in this proof of concept these changes were predominantly caused by local deformations in the carbon-carbon bonds indicated in Fig. S5a, they present an encouraging prospect of how electronic surrogate models such as SchNOrb can contribute to computational chemical design using more sophisticated optimisation methods, such as alchemical derivatives To Baben et al. (2016); You et al. (2018).

ML applications in chemistry have traditionally been one-directional, i.e. ML models are built on quantum chemistry data. Models such as SchNOrb offer the prospect of a deeper integration and feedback loop between ML and quantum chemistry methods. SchNOrb directly predicts wavefunctions based on quantum chemistry data, which, in turn, can serve as input for further quantum chemical calculations. For example, in the context of HF or DFT calculations, the relevant equations are solved via a self-consistent field (SCF) approach that determines a set of MOs. The convergence with respect to SCF iteration steps largely determines the computational speed of an electronic structure calculation and strongly depends on the quality of the initialisation of the wavefunction. The coefficients predicted by SchNOrb can serve as such an initialisation of SCF calculations. Fig. S5b depicts the SCF convergence for three sets of computations on the uracil molecule: using the standard initialisation techniques of quantum chemistry codes, and using the SchNOrb coefficients with or without a second-order solver. Nominally, only small improvements are observed using SchNOrb coefficients in combination with a conventional SCF solver. This is due to the various strategies employed in electronic structure codes to provide a numerically robust SCF procedure. Only by performing SCF calculations with a second-order solver, which would not converge from a less accurate starting point than our SchNOrb MO coefficients, does the efficiency of our combined ML and second-order SCF approach become apparent. Convergence is obtained in only a fraction of the original time, with a substantially reduced number of SCF cycles. Similarly, Supplementary Fig. S5 shows a comparable reduction of SCF iterations for malondialdehyde.
It should be noted that this significantly faster, combined approach does not introduce any approximations into the electronic structure method itself and yields exactly the same results as the full computation. Another example of integrating the SchNOrb deep learning framework with quantum chemistry is the use of predicted wavefunctions and MO energies based on Hartree-Fock as a starting point for post-Hartree-Fock correlation methods such as Møller-Plesset perturbation theory (MP2). Supplementary Table S5 presents the mean absolute error of an MP2 calculation for ethanol based on wavefunctions predicted by SchNOrb. The associated prediction error for the test set is 83 meV. Compared to the overall HF and MP2 energies, the relative error of SchNOrb amounts to 0.01% and 0.06%, respectively. For the MP2 correlation energy, we observe a deviation of 17%, the reason for which is the inclusion of virtual orbitals in the calculation of the MP2 integrals. However, even in this case, the total error only amounts to a deviation of 93 meV.
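The effect of a good SCF starting point can be illustrated on a toy mean-field model. The sketch below mimics the restart experiment on a two-site Hubbard-like dimer: restarting from a converged density (standing in for an ML-predicted wavefunction) converges in a single step, whereas a generic guess needs many damped iterations. This is a deliberately simplified model of the SCF fixed-point structure, not the ORCA calculations described above:

```python
import numpy as np

def scf_iterations(n0, t=1.0, e=(0.0, 1.0), U=2.0, mix=0.5, tol=1e-10, maxiter=500):
    """Damped SCF fixed-point iteration for a two-site mean-field model.
    Returns (converged density, number of iterations starting from guess n0)."""
    n = np.array(n0, dtype=float)
    for it in range(1, maxiter + 1):
        F = np.array([[e[0] + U * n[0] / 2, -t],
                      [-t, e[1] + U * n[1] / 2]])   # toy Fock matrix
        eps, C = np.linalg.eigh(F)
        n_new = 2.0 * C[:, 0] ** 2                  # 2 electrons in the lowest MO
        if np.max(np.abs(n_new - n)) < tol:
            return n, it
        n = (1 - mix) * n + mix * n_new             # density mixing (damping)
    raise RuntimeError("SCF did not converge")

n_conv, iters_default = scf_iterations([1.0, 1.0])  # generic initial guess
_, iters_restart = scf_iterations(n_conv)           # "ML-predicted" restart
```

The restart converges immediately because the fixed-point residual is already below the convergence threshold, which is the same mechanism that lets an accurate predicted wavefunction cut the number of real SCF cycles.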
III Discussion
The SchNOrb framework provides an analytical expression for the electronic wavefunctions in a local atomic orbital representation as a function of molecular composition and atom positions. The rotational covariance of atomic orbital interactions is encoded by basis functions with higher angular momenta, which constitutes significant progress over previous work Hegde and Bowen (2017). As a consequence, the model provides access to atomic derivatives of wavefunctions. These include molecular orbital energy derivatives and Hamiltonian derivatives, which can be used to approximate nonadiabatic couplings Maurer et al. (2016), as well as higher-order derivatives that describe the electron-nuclear response of the molecule. Thus, the SchNOrb framework preserves the benefits of interatomic potentials while enabling access to the electronic structure as predicted by quantum chemistry methods.
SchNOrb opens up completely new applications to ML-enhanced molecular simulation. This includes the construction of interatomic potentials with electronic properties that can facilitate efficient photochemical simulations during surface hopping trajectory dynamics or Ehrenfest-type mean-field simulations, but also enables the development of new ML-enhanced approaches to inverse molecular design via electronic property optimisation. On the basis of the SchNOrb framework, intelligent preprocessing of quantum chemistry data in the form of effective Hamiltonians or optimised minimal basis representations Lu et al. (2004) can be developed in the future. Such preprocessing will also pave the way towards the prediction of the electronic structure based on post-HF correlated wavefunction methods and post-DFT quasiparticle methods.
This work serves as a first proof of principle that direct ML models of electronic structure based on quantum chemistry can be constructed and used to enhance further quantum chemistry calculations. We have presented an immediate consequence of this by reducing the number of DFT-SCF iterations with wavefunctions predicted via SchNOrb. The presented model delivers derived electronic properties that can be formulated as quantum mechanical expectation values. This provides an important step towards a full integration of ML and quantum chemistry into the scientific discovery cycle.
IV Methods
IV.1 Reference data
All reference calculations were carried out with the ORCA quantum chemistry code Neese (2012) using the def2-SVP basis set Weigend and Ahlrichs (2005). Integration grid levels of 4 and 5 were employed during SCF iterations and the final computation of properties, respectively. Unless stated otherwise, the default ORCA SCF procedure was used, which is based on the Pulay method Pulay (1980). For the remaining cases, the Newton–Raphson procedure implemented in ORCA was employed as a second order SCF solver. SCF convergence criteria were set to VeryTight. DFT calculations were carried out using the PBE functional Perdew et al. (1996b). For ethanol, additional HF computations were performed.
Molecular dynamics simulations for malondialdehyde were carried out with SchNetPack Schütt et al. (2018c). The equations of motions were integrated using a timestep of 0.5 fs. Simulation temperatures were kept at 300 K with a Langevin thermostat Bussi and Parrinello (2007) employing a time constant of 100 fs. Trajectories were propagated for a total of 50 ps, of which the first 10 ps were discarded.
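A minimal Langevin-thermostatted integrator of the kind used for such simulations can be sketched in a few lines. The scheme and parameters below are illustrative (a simple friction-plus-noise update combined with a velocity-Verlet step, applied to a 1D harmonic oscillator), not the SchNetPack implementation:

```python
import numpy as np

def langevin_step(x, v, force, mass, dt, gamma, kT, rng):
    """One integration step: Ornstein-Uhlenbeck thermostat update followed by
    a velocity-Verlet step (illustrative scheme only)."""
    c1 = np.exp(-gamma * dt)
    c2 = np.sqrt((1.0 - c1 ** 2) * kT / mass)
    v = c1 * v + c2 * rng.normal()        # friction + thermal noise
    v = v + 0.5 * dt * force(x) / mass    # half kick
    x = x + dt * v                        # drift
    v = v + 0.5 * dt * force(x) / mass    # half kick
    return x, v

# Thermalise a 1D harmonic oscillator (k = m = kT = 1 in reduced units);
# in equilibrium the position variance should approach kT / k = 1.
rng = np.random.default_rng(8)
k, mass, dt, gamma, kT = 1.0, 1.0, 0.01, 1.0, 1.0
x, v = 1.0, 0.0
positions = []
for _ in range(20000):
    x, v = langevin_step(x, v, lambda y: -k * y, mass, dt, gamma, kT, rng)
    positions.append(x)
var_x = np.var(positions)
```

In a production simulation, the 1D toy force would be replaced by the ML-predicted atomic forces, and the friction constant gamma corresponds to the inverse thermostat time constant quoted above.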
IV.2 Details on the neural network architecture
In the following we describe the neural network depicted in Fig. 2 in detail.
IV.2.1 Basic building blocks and notation
We use shifted softplus activation functions
$\mathrm{ssp}(x) = \ln\!\left(\tfrac{1}{2}e^{x} + \tfrac{1}{2}\right)$ (7)
throughout the architecture. Linear layers are written as
$\mathrm{linear}(\mathbf{x}) = \mathbf{W}\mathbf{x} + \mathbf{b}$ (8)
with input $\mathbf{x}$, weights $\mathbf{W}$ and bias $\mathbf{b}$. Fully-connected neural networks with one hidden layer are written as
$\mathrm{mlp}(\mathbf{x}) = \mathbf{W}_2\,\mathrm{ssp}(\mathbf{W}_1\mathbf{x} + \mathbf{b}_1) + \mathbf{b}_2$ (9)
with weights $\mathbf{W}_1$, $\mathbf{W}_2$ and biases $\mathbf{b}_1$, $\mathbf{b}_2$ accordingly.
Model parameters are shared within layers across atoms and interactions, but never across layers. We omit layer indices for clarity.
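These building blocks translate directly into numpy (a sketch with random weights; SchNOrb itself is trained with learned parameters in an automatic-differentiation framework):

```python
import numpy as np

def ssp(x):
    """Shifted softplus: ln(0.5 * exp(x) + 0.5); note ssp(0) = 0."""
    return np.logaddexp(x, 0.0) - np.log(2.0)   # numerically stable form

def linear(x, W, b):
    """Linear layer W x + b."""
    return W @ x + b

def mlp(x, W1, b1, W2, b2):
    """One-hidden-layer network with shifted-softplus nonlinearity."""
    return linear(ssp(linear(x, W1, b1)), W2, b2)

rng = np.random.default_rng(2)
x = rng.normal(size=5)
W1, b1 = rng.normal(size=(8, 5)), np.zeros(8)
W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)
y = mlp(x, W1, b1, W2, b2)
```

The shift by ln(1/2) makes the activation pass through the origin, so a network initialised near zero starts close to the identity regime; for large inputs ssp(x) approaches x - ln 2, i.e. it is asymptotically linear.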
IV.2.2 Atomic environments
The representations of atomic environments are constructed with the neural network structure as in SchNet. In the following, we summarise this first part of the model. For further details, please refer to Schütt et al. (2018a).
First, each atom is assigned an initial element-specific embedding
$\mathbf{x}_i^{(0)} = \mathbf{a}_{Z_i}$ (10)
where $Z_i$ is the nuclear charge and $F$ is the number of atom-wise features. In this work, we use the same value of $F$ for all models (see Supplementary Table S1). The representations are refined using SchNet interaction layers (Fig. 2b). The main component is a continuous-filter convolutional layer (cfconv)
$\mathbf{v}_i = \sum_{j \neq i} \mathbf{x}_j \circ \mathbf{W}(\mathbf{r}_{ij})$ (11)
which takes a spatial filter

$\mathbf{W}(\mathbf{r}_{ij}) = \mathrm{mlp}\big(\mathbf{g}(r_{ij})\big)\, f_{\mathrm{cut}}(r_{ij})$ (12)

with

$g_k(r_{ij}) = \exp\!\big(-\gamma\,(r_{ij} - k\,\Delta\mu)^2\big)$ (13)
$f_{\mathrm{cut}}(r_{ij}) = \tfrac{1}{2}\big[\cos(\pi r_{ij}/r_{\mathrm{cut}}) + 1\big]$ for $r_{ij} \leq r_{\mathrm{cut}}$, and $0$ otherwise (14)

where $r_{\mathrm{cut}}$ is the cutoff radius and $\Delta\mu$ is the grid spacing of the radial basis function expansion of the interatomic distance $r_{ij}$. While this adds spatial information to the environment representations for each feature separately, the crosstalk between features is performed atom-wise by fully-connected layers (the linear layers in Fig. 2b) to obtain the refinements $\mathbf{v}_i^{(t)}$, where $t$ is the current interaction iteration. The refined atom representations are then

$\mathbf{x}_i^{(t+1)} = \mathbf{x}_i^{(t)} + \mathbf{v}_i^{(t)}$ (15)
These representations of atomic environments are employed by SchNet to predict chemical properties via atom-wise contributions. However, in order to extend this scheme to the prediction of the Hamiltonian, we need to construct representations of pair-wise environments in a second interaction phase.
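The continuous-filter convolution with a Gaussian radial basis and cosine cutoff can be sketched compactly in numpy. The specific basis width, cutoff form, and the single-linear-layer filter network below are illustrative assumptions rather than the exact published hyperparameters:

```python
import numpy as np

def rbf(r, r_cut=5.0, gamma=10.0, n_basis=16):
    """Expand a distance in Gaussian radial basis functions on a uniform grid."""
    centers = np.linspace(0.0, r_cut, n_basis)
    return np.exp(-gamma * (r - centers) ** 2)

def cosine_cutoff(r, r_cut=5.0):
    """Smoothly switch interactions off at the cutoff radius."""
    return 0.5 * (np.cos(np.pi * r / r_cut) + 1.0) * (r < r_cut)

def cfconv(X, R, W_filter, r_cut=5.0):
    """Continuous-filter convolution: v_i = sum over j != i of x_j * W(r_ij)."""
    n = len(R)
    V = np.zeros_like(X)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r_ij = np.linalg.norm(R[j] - R[i])
            filt = (rbf(r_ij, r_cut) @ W_filter) * cosine_cutoff(r_ij, r_cut)
            V[i] += X[j] * filt
    return V

rng = np.random.default_rng(3)
R = rng.normal(size=(4, 3))            # 4 atoms in 3D
X = rng.normal(size=(4, 6))            # 6 features per atom
W_filter = rng.normal(size=(16, 6))    # filter-generating weights (one layer here)
V = cfconv(X, R, W_filter)
```

Because the filter depends only on interatomic distances, the resulting atom features are invariant under rotation, translation, and (with the symmetric double loop) permutation of atoms, which is the property the prose above relies on.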
IV.2.3 Pair-wise environments and angular momenta
The Hamiltonian matrix is of the form
$\mathbf{H} = \begin{pmatrix} \mathbf{H}_{11} & \cdots & \mathbf{H}_{1n} \\ \vdots & \ddots & \vdots \\ \mathbf{H}_{n1} & \cdots & \mathbf{H}_{nn} \end{pmatrix}$ (16)
where a matrix block $\mathbf{H}_{ij}$ depends on the atoms $i$ and $j$ within their chemical environment, as well as on the choice of atomic orbitals on atoms $i$ and $j$, respectively. Therefore, SchNOrb builds representations of these embedded atom pairs based on the previously constructed representations of atomic environments. This is achieved through the SchNOrb interaction module (see Fig. 2c).
First, a raw representation of atom pairs is obtained using a factorised tensor layer Schütt et al. (2017); Sutskever et al. (2011):
(17)
The layers map the atom representations to the factors, while the filter-generating network is defined analogously to Eq. 12 and directly maps to the factor space. In analogy to how the SchNet interactions are used to build atomic environments, SchNOrb interactions are applied several times, where each instance further refines the atomic environments of the atom pairs with additive corrections
(18)
(19)
as well as constructs pair-wise features:
(20)
(21)
(22)
where the former models the direct interactions of atoms $i$ and $j$, while the latter models the interactions of the pair with neighbouring atoms $k$. As described above, the atom pair coefficients are used to form a basis set
where $\lambda$ corresponds to the angular momentum channel and the $\mathbf{d}$ are learnable parameters to project along a number of randomly initialised directions (the value used in this work is given in Supplementary Table S1). For interactions between s-orbitals, we consider the special case where the features along all directions are equal due to rotational invariance. At this point, one factor is rotationally invariant and the other is covariant. On this basis, we obtain features with higher angular momenta using:
where the features possess angular momentum $\lambda$. The SchNOrb representations of atom pairs embedded in their chemical environment, constructed from the previously computed SchNet atom-wise features, serve in the next step to predict the corresponding blocks of the ground-state Hamiltonian.
IV.2.4 Predicting the target properties
Finally, we assemble the Hamiltonian and overlap matrices to be predicted. Each atom pair block is predicted from the corresponding pair features:
where we restrict the network to linear layers in order to conserve the angular momenta, mapping to the maximal number of atomic orbitals occurring in the data. Then, a mask is applied to the matrix block to retain only the orbitals present on the respective atoms. Finally, we symmetrise the predicted Hamiltonian:
$\mathbf{H} = \tfrac{1}{2}\big(\tilde{\mathbf{H}} + \tilde{\mathbf{H}}^{\top}\big)$ (23)
The overlap matrix is obtained in the same manner from its corresponding pair-wise blocks.
The prediction of the total energy is obtained analogously to SchNet as a sum over atom-wise energy contributions.
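The prediction steps of this subsection (per-pair blocks of maximal size, masking to the actual orbital count of each atom, assembly into the full matrix, and symmetrisation) can be sketched as follows; the block contents are random placeholders standing in for network outputs:

```python
import numpy as np

def assemble_hamiltonian(blocks, n_orbs):
    """Assemble per-pair blocks into a full matrix and symmetrise.
    blocks[i][j] is the raw predicted block for atom pair (i, j);
    n_orbs[i] is the number of atomic orbitals on atom i (mask size)."""
    offsets = np.concatenate([[0], np.cumsum(n_orbs)])
    n_total = offsets[-1]
    H_raw = np.zeros((n_total, n_total))
    for i in range(len(n_orbs)):
        for j in range(len(n_orbs)):
            block = blocks[i][j][:n_orbs[i], :n_orbs[j]]   # mask to real orbitals
            H_raw[offsets[i]:offsets[i+1], offsets[j]:offsets[j+1]] = block
    return 0.5 * (H_raw + H_raw.T)                          # symmetrise, Eq. (23)

rng = np.random.default_rng(4)
n_orbs = [1, 4, 4]          # e.g. one H (1s) and two heavier atoms (s + p shells)
max_orbs = 5                # maximal block size produced by the model
blocks = [[rng.normal(size=(max_orbs, max_orbs)) for _ in n_orbs] for _ in n_orbs]
H = assemble_hamiltonian(blocks, n_orbs)
```

The symmetrisation at the end guarantees real eigenvalues regardless of the raw network output, which is why the predicted spectrum is always physically interpretable.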
IV.3 Data augmentation
While SchNOrb constructs pair-wise features with angular momenta such that the Hamiltonian matrix can be represented as a linear combination of them, it does not encode the full rotational symmetry a priori. However, this can be learned by SchNOrb, provided the training data reflects enough rotations of a molecule. To save computing power, we reduce the amount of reference calculations by randomly rotating configurations before each training epoch using Wigner D rotation matrices Schober et al. (2016). Given a randomly sampled rotation $\mathbf{R}$ with corresponding block-diagonal basis transformation $\mathbf{D}$, the applied transformations are

$\mathbf{r}_i' = \mathbf{R}\,\mathbf{r}_i$ (24)
$\mathbf{F}_i' = \mathbf{R}\,\mathbf{F}_i$ (25)
$\mathbf{H}' = \mathbf{D}\,\mathbf{H}\,\mathbf{D}^{\top}$ (26)
$\mathbf{S}' = \mathbf{D}\,\mathbf{S}\,\mathbf{D}^{\top}$ (27)
for atom positions $\mathbf{r}_i$, atomic forces $\mathbf{F}_i$, Hamiltonian matrix $\mathbf{H}$, and overlap matrix $\mathbf{S}$.
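The key property this augmentation exploits is that the transformation of the matrices is a similarity transformation with an orthogonal matrix, so the eigenvalue spectrum of the rotated reference data is unchanged. A numpy check for a single atom with an s + p shell, where the orbital transformation is block-diagonal with a 1 (s function) and the 3×3 Cartesian rotation (p functions):

```python
import numpy as np

def random_rotation(rng):
    """Random proper rotation matrix via QR decomposition."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] *= -1.0
    return Q

rng = np.random.default_rng(6)
R = random_rotation(rng)

# Orbital rotation for an s + p shell (4 basis functions): the s function is
# invariant, the three p functions rotate like a Cartesian vector.
D = np.zeros((4, 4))
D[0, 0] = 1.0
D[1:, 1:] = R

A = rng.normal(size=(4, 4))
H = (A + A.T) / 2                 # toy symmetric Hamiltonian block
H_rot = D @ H @ D.T               # transformed matrix for the rotated molecule
eigs = np.linalg.eigvalsh(H)
eigs_rot = np.linalg.eigvalsh(H_rot)
```

For higher angular momenta, the 3×3 rotation block would be replaced by the corresponding Wigner D-matrix, but the orthogonality of the full block-diagonal transformation, and hence the invariance of the spectrum, is preserved.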
IV.4 Neural network training
For training, we used a combined loss on the energies $E$, atomic forces $\mathbf{F}_i$, and the Hamiltonian $\mathbf{H}$ and overlap $\mathbf{S}$ matrices simultaneously:
$\ell = \|\mathbf{H} - \tilde{\mathbf{H}}\|_F^2 + \|\mathbf{S} - \tilde{\mathbf{S}}\|_F^2 + \alpha\,(E - \tilde{E})^2 + \frac{\beta}{3N}\sum_{i=1}^{N}\big\|\mathbf{F}_i - \tilde{\mathbf{F}}_i\big\|^2$ (28)
where the variables marked with a tilde refer to the corresponding predictions. The neural networks were trained with stochastic gradient descent using the ADAM optimiser Kingma and Ba (2014). We reduced the learning rate by a fixed decay factor after a given number of epochs (the patience) without improvement of the validation loss, and training is stopped once the learning rate falls below a final threshold. The mini-batch sizes, patience, and data set sizes are listed in Supplementary Table S1. Afterwards, the model with the lowest validation error is selected for testing.

Acknowledgements
All authors gratefully acknowledge support by the Institute of Pure and Applied Mathematics (IPAM) at the University of California Los Angeles during a long program workshop. RJM acknowledges funding through a UKRI Future Leaders Fellowship (MR/S016023/1). KTS and KRM acknowledge support by the Federal Ministry of Education and Research (BMBF) for the Berlin Center for Machine Learning (01IS18037A). This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 792572. Computing resources have been provided by the Scientific Computing Research Technology Platform of the University of Warwick and the EPSRC-funded high-end computing Materials Chemistry Consortium (EP/R029431/1). KRM acknowledges partial financial support by the German Ministry for Education and Research (BMBF) under Grants 01IS14013A-E, 01GQ1115 and 01GQ0850; by the Deutsche Forschungsgemeinschaft (DFG) under Grant MATH+, EXC 2046/1, Project ID 390685689; and by the Institute for Information & Communications Technology Promotion (IITP) grants funded by the Korea government (No. 2017-0-00451, No. 2017-0-01779). Correspondence to RJM, KRM and AT.

References
- Behler and Parrinello (2007) J. Behler and M. Parrinello, Phys. Rev. Lett. 98, 146401 (2007).
- Braams and Bowman (2009) B. J. Braams and J. M. Bowman, International Reviews in Physical Chemistry 28, 577 (2009).
- Bartók et al. (2010) A. P. Bartók, M. C. Payne, R. Kondor, and G. Csányi, Phys. Rev. Lett. 104, 136403 (2010).
- Smith et al. (2017) J. S. Smith, O. Isayev, and A. E. Roitberg, Chem. Sci. 8, 3192 (2017).
- Podryabinkin and Shapeev (2017) E. V. Podryabinkin and A. V. Shapeev, Computational Materials Science 140, 171 (2017).
- Podryabinkin et al. (2019) E. V. Podryabinkin, E. V. Tikhonov, A. V. Shapeev, and A. R. Oganov, Physical Review B 99, 064114 (2019).
- Chmiela et al. (2017) S. Chmiela, A. Tkatchenko, H. E. Sauceda, I. Poltavsky, K. T. Schütt, and K.-R. Müller, Sci. Adv. 3, e1603015 (2017).
- Chmiela et al. (2018) S. Chmiela, H. E. Sauceda, K.-R. Müller, and A. Tkatchenko, Nature Communications 9, 3887 (2018).
- Ryczko et al. (2018) K. Ryczko, D. Strubbe, and I. Tamblyn, (2018), arXiv:1811.08928 .
- Brockherde et al. (2017) F. Brockherde, L. Voigt, L. Li, M. E. Tuckerman, K. Burke, and K.-R. Müller, Nat. Commun. 8, 872 (2017).
- Wilkins et al. (2019) D. M. Wilkins, A. Grisafi, Y. Yang, K. U. Lao, R. A. DiStasio, and M. Ceriotti, Proceedings of the National Academy of Sciences of the United States of America 116, 3401 (2019).
- Gastegger et al. (2017) M. Gastegger, J. Behler, and P. Marquetand, Chem. Sci. 8, 6924 (2017).
- Rupp et al. (2012) M. Rupp, A. Tkatchenko, K.-R. Müller, and O. A. von Lilienfeld, Phys. Rev. Lett. 108, 058301 (2012).
- Eickenberg et al. (2017) M. Eickenberg, G. Exarchakis, M. Hirn, and S. Mallat, in Advances in Neural Information Processing Systems 30 (Curran Associates, Inc., 2017) pp. 6543–6552.
- von Lilienfeld (2018) O. A. von Lilienfeld, Angewandte Chemie International Edition 57, 4164 (2018).
- (16) J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, and G. E. Dahl, in Proceedings of the 34th International Conference on Machine Learning, pp. 1263–1272.
- Jha et al. (2018) D. Jha, L. Ward, A. Paul, W.-k. Liao, A. Choudhary, C. Wolverton, and A. Agrawal, Scientific Reports 8, 17593 (2018).
- Kitchin (2018) J. R. Kitchin, Nature Catalysis 1, 230 (2018).
- Maryasin et al. (2018) B. Maryasin, P. Marquetand, and N. Maulide, Angewandte Chemie International Edition 57, 6978 (2018).
- Li et al. (2018) H. Li, C. Collins, M. Tanha, G. J. Gordon, and D. J. Yaron, Journal of Chemical Theory and Computation 14, 5764 (2018), PMID: 30351008.
- Hegde and Bowen (2017) G. Hegde and R. C. Bowen, Scientific Reports 7, 42669 (2017).
- Grisafi et al. (2018) A. Grisafi, D. M. Wilkins, G. Csányi, and M. Ceriotti, Physical Review Letters 120, 036002 (2018).
- Thomas et al. (2018) N. Thomas, T. Smidt, S. Kearnes, L. Yang, L. Li, K. Kohlhoff, and P. Riley, (2018), arXiv:1802.08219 .
- Schütt et al. (2018a) K. T. Schütt, H. E. Sauceda, P.-J. Kindermans, A. Tkatchenko, and K.-R. Müller, The Journal of Chemical Physics 148, 241722 (2018a).
- Cramer (2004) C. J. Cramer, Essentials of computational chemistry: theories and models (John Wiley & Sons, 2004).
- Schütt et al. (2017) K. T. Schütt, F. Arbabzadah, S. Chmiela, K.-R. Müller, and A. Tkatchenko, Nat. Commun. 8, 13890 (2017).
- Schütt et al. (2017) K. T. Schütt, P.-J. Kindermans, H. E. Sauceda, S. Chmiela, A. Tkatchenko, and K.-R. Müller, in Advances in Neural Information Processing Systems 30 (2017) pp. 992–1002.
- Perdew et al. (1996a) J. P. Perdew, K. Burke, and M. Ernzerhof, Phys. Rev. Lett. 77, 3865 (1996a).
- Bach et al. (2015) S. Bach, A. Binder, G. Montavon, F. Klauschen, K.-R. Müller, and W. Samek, PLoS ONE 10, e0130140 (2015).
- Montavon et al. (2018) G. Montavon, W. Samek, and K.-R. Müller, Digital Signal Processing 73, 1 (2018).
- Kindermans et al. (2018) P.-J. Kindermans, K. T. Schütt, M. Alber, K.-R. Müller, D. Erhan, B. Kim, and S. Dähne, in International Conference on Learning Representations (ICLR) (2018).
- Kim et al. (2017) B. Kim, M. Wattenberg, J. Gilmer, C. Cai, J. Wexler, F. Viegas, and R. Sayres, arXiv preprint arXiv:1711.11279 (2017).
- Lapuschkin et al. (2019) S. Lapuschkin, S. Wäldchen, A. Binder, G. Montavon, W. Samek, and K.-R. Müller, Nature Communications 10, 1096 (2019).
- De et al. (2016) S. De, A. P. Bartók, G. Csányi, and M. Ceriotti, Physical Chemistry Chemical Physics 18, 13754 (2016).
- Schütt et al. (2018b) K. T. Schütt, M. Gastegger, A. Tkatchenko, and K.-R. Müller, arXiv preprint arXiv:1806.10349 (2018b).
- To Baben et al. (2016) M. to Baben, J. Achenbach, and O. von Lilienfeld, The Journal of Chemical Physics 144, 104103 (2016).
- You et al. (2018) J. You, B. Liu, Z. Ying, V. Pande, and J. Leskovec, in Advances in Neural Information Processing Systems (2018) pp. 6410–6421.
- Maurer et al. (2016) R. J. Maurer, M. Askerka, V. S. Batista, and J. C. Tully, Phys. Rev. B 94, 115432 (2016).
- Lu et al. (2004) W. C. Lu, C. Z. Wang, M. W. Schmidt, L. Bytautas, K. M. Ho, and K. Ruedenberg, The Journal of Chemical Physics 120, 2629 (2004).
- Neese (2012) F. Neese, WIREs Comput. Mol. Sci. 2, 73 (2012).
- Weigend and Ahlrichs (2005) F. Weigend and R. Ahlrichs, Phys. Chem. Chem. Phys. 7, 3297 (2005).
- Pulay (1980) P. Pulay, Chemical Physics Letters 73, 393 (1980).
- Perdew et al. (1996b) J. P. Perdew, K. Burke, and M. Ernzerhof, Phys. Rev. Lett. 77, 3865 (1996b).
- Schütt et al. (2018c) K. Schütt, P. Kessel, M. Gastegger, K. Nicoli, A. Tkatchenko, and K.-R. Müller, Journal of Chemical Theory and Computation 15, 448 (2018c).
- Bussi and Parrinello (2007) G. Bussi and M. Parrinello, Phys. Rev. E 75, 056707 (2007).
- Sutskever et al. (2011) I. Sutskever, J. Martens, and G. E. Hinton, in Proceedings of the 28th International Conference on Machine Learning (2011) pp. 1017–1024.
- Schober et al. (2016) C. Schober, K. Reuter, and H. Oberhofer, The Journal of Chemical Physics 144, 054103 (2016).
- Kingma and Ba (2014) D. P. Kingma and J. Ba, arXiv preprint arXiv:1412.6980 (2014).
Supplementary Material
Table S1: Data set sizes, mini-batch sizes, and patience used for training (the learning-rate values were lost in extraction).

| Data set | n_train | n_val | n_test | batch size | patience | lr |
|---|---|---|---|---|---|---|
| H₂O | 500 | 500 | 4000 | 32 | 50 | |
| Ethanol (HF / DFT) | 25000 | 500 | 4500 | 32 | 15 | |
| Malondialdehyde | 25000 | 500 | 1478 | 32 | 15 | |
| Uracil | 25000 | 500 | 4500 | 48 | 15 | |
| Data set | Level of theory | H [eV] | S | ε [eV] | ψ | Total energy [eV] |
|---|---|---|---|---|---|---|
| H₂O | DFT | 0.0045 | 7.91e-05 | 0.0076 | 1.00 | 0.001435 |
| Ethanol | DFT | 0.0051 | 6.78e-05 | 0.0091 | 1.00 | 0.000361 |
| Ethanol | HF | 0.0079 | 7.50e-05 | 0.0106 | 1.00 | 0.000378 |
| Malondialdehyde | DFT | 0.0052 | 6.73e-05 | 0.0109 | 0.99 | 0.000353 |
| Uracil | DFT | 0.0062 | 8.24e-05 | 0.0479 | 0.90 | 0.000848 |
| Data set | Level of theory | Partial charges | Bond orders |
|---|---|---|---|
| Water | DFT | 0.002 | 0.001 |
| Ethanol | DFT | 0.002 | 0.001 |
| Ethanol | HF | 0.002 | 0.001 |
| Malondialdehyde | DFT | 0.004 | 0.001 |
| Uracil | DFT | 0.047 | 0.006 |
| Data set | Level of theory | Dipole moment [D] | Quadrupole moment [D Å] |
|---|---|---|---|
| Water | DFT | 0.0072 | 0.0061 |
| Ethanol | DFT | 0.0262 | 0.0355 |
| Ethanol | HF | 0.0161 | 0.0218 |
| Malondialdehyde | DFT | 0.0536 | 0.0575 |
| Uracil | DFT | 1.2762 | 1.8703 |
| Energy component | MAE [eV] | MAE [%] |
|---|---|---|
| HF | 0.0214 | 0.01 |
| MP2 correlation | 0.0831 | 17.19 |
| HF + MP2 | 0.0926 | 0.06 |