
Wavelet Scattering Regression of Quantum Chemical Energies
We introduce multiscale invariant dictionaries to estimate the quantum chemical energies of organic molecules from training databases. Molecular energies are invariant to isometric atomic displacements and Lipschitz continuous with respect to molecular deformations. As in density functional theory (DFT), the molecule is represented by an electronic density function. A multiscale invariant dictionary is calculated with wavelet scattering invariants: a first wavelet transform separates scales, and a second wavelet transform computes interactions across scales. Sparse scattering regressions give state-of-the-art results over two databases of planar organic molecules. On these databases, the regression error is of the order of the error produced by DFT codes, but at a fraction of the computational cost.
05/16/2016 ∙ by Matthew Hirn, et al.
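The "sparse scattering regressions" mentioned above can be sketched as a greedy selection of dictionary atoms. Below is a minimal orthogonal-matching-pursuit stand-in (a hypothetical illustration, not the paper's implementation; `D` would hold scattering invariants as columns and `y` the training energies):

```python
import numpy as np

def omp(D, y, n_atoms):
    """Greedy sparse regression: pick n_atoms columns of the dictionary D
    one at a time, refitting y by least squares after each selection."""
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(n_atoms):
        corr = np.abs(D.T @ residual)        # correlation with the residual
        if support:
            corr[support] = -np.inf          # never reselect an atom
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef  # refit and update the residual
    return support, coef
```

With orthonormal dictionary columns this recovers an exactly sparse target; with correlated scattering features the greedy path is only an approximation to the best sparse fit.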

Quantum Energy Regression using Scattering Transforms
We present a novel approach to the regression of quantum mechanical energies based on a scattering transform of an intermediate electron density representation. A scattering transform is a deep convolution network computed with a cascade of multiscale wavelet transforms. It possesses appropriate invariant and stability properties for quantum energy regression. This new framework removes fundamental limitations of Coulomb matrix based energy regressions, and numerical experiments give state-of-the-art accuracy over planar molecules.
02/06/2015 ∙ by Matthew Hirn, et al.
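The cascade structure described above — wavelet convolution, complex modulus, then a second wavelet layer across coarser scales — can be sketched in one dimension. This is a minimal numpy illustration, not the authors' code; the filter design (Gaussian bumps in frequency at assumed center frequencies) is a rough stand-in for a proper wavelet family:

```python
import numpy as np

def bandpass_filters(n, n_scales):
    """Dyadic band-pass filters defined in the Fourier domain (illustrative
    Morlet-like design; centers and widths are assumed parameters)."""
    freqs = np.fft.fftfreq(n)
    filters = []
    for j in range(n_scales):
        xi = 0.4 / 2 ** j                    # assumed center frequency at scale j
        sigma = 0.15 / 2 ** j
        filters.append(np.exp(-((freqs - xi) ** 2) / (2 * sigma ** 2)))
    return filters

def scattering(x, n_scales=4):
    """Zeroth-, first-, and second-order scattering coefficients of x."""
    n = len(x)
    filters = bandpass_filters(n, n_scales)
    X = np.fft.fft(x)
    S = [np.mean(x)]                          # order 0: global average
    for j1 in range(n_scales):
        u1 = np.abs(np.fft.ifft(X * filters[j1]))    # |x * psi_{j1}|
        S.append(np.mean(u1))                        # order 1
        U1 = np.fft.fft(u1)
        for j2 in range(j1 + 1, n_scales):           # only coarser scales
            u2 = np.abs(np.fft.ifft(U1 * filters[j2]))
            S.append(np.mean(u2))                    # order 2
    return np.array(S)
```

The global averages make each coefficient translation invariant, while the second layer recovers interaction information that averaging alone would destroy.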

Structural Risk Minimization for C^{1,1}(R^d) Regression
One means of fitting functions to high-dimensional data is by imposing smoothness constraints. Recently, the following smooth function approximation problem was proposed by herbert2014computing: given a finite set E ⊂ R^d and a function f: E → R, interpolate the given information with a function f̃ ∈ Ċ^{1,1}(R^d) (the class of first-order differentiable functions with Lipschitz gradients) such that f̃(a) = f(a) for all a ∈ E, and the value of Lip(∇f̃) is minimal. An algorithm is provided that constructs such an approximating function f̃ and estimates the optimal Lipschitz constant Lip(∇f̃) in the noiseless setting. We address statistical aspects of reconstructing the approximating function f̃ from the closely related class C^{1,1}(R^d) given samples from noisy data. We observe independent and identically distributed samples y(a) = f(a) + ξ(a) for a ∈ E, where ξ(a) is a noise term and the set E ⊂ R^d is fixed and known. We obtain uniform bounds relating the empirical risk and true risk over the class F_M = {f̃ ∈ C^{1,1}(R^d) : Lip(∇f̃) ≤ M}, where the quantity M grows with the number of samples at a rate governed by the metric entropy of the class C^{1,1}(R^d). Finally, we provide an implementation using Vaidya's algorithm, supporting our results via numerical experiments on simulated data.
03/29/2018 ∙ by Adam Gustafson, et al.

Graph Classification with Geometric Scattering
One of the most notable contributions of deep learning is the application of convolutional neural networks (ConvNets) to structured signal classification, and in particular image classification. Beyond their impressive performance in supervised learning, the structure of such networks inspired the development of deep filter banks referred to as scattering transforms. These transforms apply a cascade of wavelet transforms and complex modulus operators to extract features that are invariant to group operations and stable to deformations. Furthermore, ConvNets inspired recent advances in geometric deep learning, which aim to generalize these networks to graph data by applying notions from graph signal processing to learn deep graph filter cascades. We further advance these lines of research by proposing a geometric scattering transform using graph wavelets defined in terms of random walks on the graph. We demonstrate the utility of features extracted with this designed deep filter bank in graph classification, and show its competitive performance relative to other methods, including graph kernel methods and geometric deep learning approaches, on both social and biochemistry data.
10/07/2018 ∙ by Feng Gao, et al.
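Graph wavelets built from random walks, as described above, can be sketched with diffusion-wavelet-style differences of dyadic powers of a lazy random walk operator. This is an illustrative assumption about the construction (Ψ_0 = I − P and Ψ_j = P^{2^{j−1}} − P^{2^j}), not the paper's exact implementation, and the q-th absolute moments stand in for the invariant statistics:

```python
import numpy as np

def geometric_scattering(A, x, J=3, q_moments=4):
    """Scattering-style graph features: apply diffusion wavelets built from
    a lazy random walk on adjacency matrix A to signal x, take moduli, and
    summarize with statistical moments. Assumes no isolated nodes."""
    n = len(A)
    d = A.sum(axis=1)
    P = 0.5 * (np.eye(n) + A / d[:, None])   # lazy random-walk operator
    pows = [P]                                # dyadic powers P^{2^j}
    for _ in range(J):
        pows.append(pows[-1] @ pows[-1])
    wavelets = [np.eye(n) - pows[0]]          # Psi_0 = I - P
    wavelets += [pows[j] - pows[j + 1] for j in range(J)]
    feats = []
    for Psi in wavelets:
        u = np.abs(Psi @ x)                   # first-order wavelet moduli
        feats += [np.sum(u ** q) for q in range(1, q_moments + 1)]
    return np.array(feats)
```

Because the features are sums of node-wise quantities, they are invariant to relabeling the graph's vertices, which is the relevant group action for graph classification.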

Geometric Scattering on Manifolds
We present a mathematical model for geometric deep learning based upon a scattering transform defined over manifolds, which generalizes the wavelet scattering transform of Mallat. This geometric scattering transform is (locally) invariant to isometry group actions, and we conjecture that it is stable to actions of the diffeomorphism group.
12/15/2018 ∙ by Michael Perlmutter, et al.

Solid Harmonic Wavelet Scattering for Predictions of Molecule Properties
We present a machine learning algorithm for the prediction of molecule properties inspired by ideas from density functional theory. Using Gaussian-type orbital functions, we create surrogate electronic densities of the molecule from which we compute invariant "solid harmonic scattering coefficients" that account for different types of interactions at different scales. Multilinear regressions of various physical properties of molecules are computed from these invariant coefficients. Numerical experiments show that these regressions have near state-of-the-art performance, even with relatively few training examples. Predictions over small sets of scattering coefficients can reach DFT precision while being interpretable.
05/01/2018 ∙ by Michael Eickenberg, et al.
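The surrogate electronic density described above can be sketched as a sum of atom-centered Gaussians evaluated on a spatial grid. This is a simplified stand-in for the Gaussian-type orbital construction (the isotropic width `sigma` is an assumed parameter, not a fitted one):

```python
import numpy as np

def surrogate_density(positions, charges, grid, sigma=0.5):
    """Evaluate a surrogate electronic density on grid points: one isotropic
    Gaussian per atom, weighted by its charge. positions: (n_atoms, 3),
    charges: length n_atoms, grid: (n_points, 3)."""
    rho = np.zeros(len(grid))
    for p, z in zip(positions, charges):
        rho += z * np.exp(-np.sum((grid - p) ** 2, axis=1) / (2 * sigma ** 2))
    return rho
```

The resulting density is invariant to permuting atoms of the same type and transforms naturally under rigid motions, which is what makes it a suitable input for invariant scattering coefficients.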

Scattering Statistics of Generalized Spatial Poisson Point Processes
We present a machine learning model for the analysis of randomly generated discrete signals, which we model as the points of a homogeneous or inhomogeneous, compound Poisson point process. Like the wavelet scattering transform introduced by S. Mallat, our construction is a mathematical model of convolutional neural networks and is naturally invariant to translations and reflections. Our model replaces wavelets with Gabor-type measurements and therefore decouples the roles of scale and frequency. We show that, with suitably chosen nonlinearities, our measurements distinguish Poisson point processes from common self-similar processes, and separate different types of Poisson point processes based on the first and second moments of the arrival intensity λ(t), as well as the absolute moments of the charges associated to each point.
02/10/2019 ∙ by Michael Perlmutter, et al.
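The objects above can be sketched numerically: a discrete surrogate for a compound Poisson point process, and a Gabor-type measurement whose scale and frequency parameters vary independently. Both functions are illustrative assumptions, not the paper's constructions:

```python
import numpy as np

def compound_poisson_signal(n, lam, rng):
    """Discrete surrogate for a compound Poisson point process: independent
    Bernoulli(lam) arrivals on a grid of n points, each arrival carrying a
    standard normal charge."""
    arrivals = rng.random(n) < lam
    charges = rng.standard_normal(n)
    return arrivals * charges

def gabor_measurement(x, xi, sigma):
    """First absolute moment of x filtered by a Gabor atom
    g(t) = exp(-t^2 / (2 sigma^2)) exp(i xi t); xi (frequency) and sigma
    (scale) are decoupled parameters."""
    n = len(x)
    t = np.arange(n) - n // 2
    g = np.exp(-t ** 2 / (2 * sigma ** 2)) * np.exp(1j * xi * t)
    return np.mean(np.abs(np.convolve(x, g, mode="same")))
```

Sweeping `xi` at fixed `sigma` (or vice versa) gives a family of translation-invariant statistics that can be compared across different arrival intensities and charge distributions.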

Geometric Wavelet Scattering Networks on Compact Riemannian Manifolds
The Euclidean scattering transform was introduced nearly a decade ago to improve the mathematical understanding of convolutional neural networks. Inspired by recent interest in geometric deep learning, which aims to generalize convolutional neural networks to manifold and graph-structured domains, we define a geometric scattering transform on manifolds. Similar to the Euclidean scattering transform, the geometric scattering transform is based on a cascade of wavelet filters and pointwise nonlinearities. It is invariant to local isometries and stable to certain types of diffeomorphisms. Empirical results demonstrate its utility on several geometric learning tasks. Our results generalize the deformation stability and local translation invariance of Euclidean scattering, and demonstrate the importance of linking the filter structures used to the underlying geometry of the data.
05/24/2019 ∙ by Michael Perlmutter, et al.