ResNetLDDMM: Advancing the LDDMM Framework Using Deep Residual Networks
In deformable registration, the geometric framework known as large deformation diffeomorphic metric mapping (LDDMM, in short) has inspired numerous techniques for comparing, deforming, averaging and analyzing shapes or images. Grounded in flows, which are akin to the equations of motion used in fluid dynamics, LDDMM algorithms solve the flow equation in the space of plausible deformations, i.e. diffeomorphisms. In this work, we make use of deep residual neural networks to solve the nonstationary ODE (flow equation) based on Euler's discretization scheme. The central idea is to represent time-dependent velocity fields as fully connected ReLU neural networks (building blocks) and derive optimal weights by minimizing a regularized loss function. Computing minimizing paths between deformations, and thus between shapes, amounts to finding optimal network parameters by backpropagating over the intermediate building blocks. Geometrically, at each time step, ResNet-LDDMM searches for an optimal partition of the space into multiple polytopes, and then computes optimal velocity vectors as affine transformations on each of these polytopes. As a result, different parts of the shape, even if they are close (such as two fingers of a hand), can be made to belong to different polytopes, and therefore be moved in different directions without costing too much energy. Importantly, we show how diffeomorphic transformations, or more precisely bi-Lipschitz transformations, are predicted by our algorithm. We illustrate these ideas on diverse registration problems of 3D shapes under complex topology-preserving transformations. We thus provide essential foundations for more advanced shape variability analysis under a novel joint geometric-neural-network Riemannian-like framework, i.e. ResNet-LDDMM.
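The core correspondence described above (residual block = one Euler step of the flow ODE dx/dt = v_t(x)) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class and function names (`VelocityBlock`, `flow`), layer sizes, and random initialization are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

class VelocityBlock:
    """One time step's velocity field v_t: a fully connected
    one-hidden-layer ReLU network mapping R^3 -> R^3.
    (Hypothetical building block, sizes chosen for illustration.)"""
    def __init__(self, dim=3, hidden=32):
        self.W1 = rng.normal(scale=0.1, size=(dim, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(scale=0.1, size=(hidden, dim))
        self.b2 = np.zeros(dim)

    def __call__(self, x):
        # The ReLU partitions space into polytopes; on each polytope
        # the velocity field acts as an affine transformation.
        return np.maximum(x @ self.W1 + self.b1, 0.0) @ self.W2 + self.b2

def flow(points, blocks):
    """Integrate the nonstationary ODE dx/dt = v_t(x) with Euler steps:
    x_{t+1} = x_t + h * v_t(x_t), where the residual connection of each
    block plays the role of one Euler step."""
    h = 1.0 / len(blocks)      # step size for T blocks on [0, 1]
    x = points
    for v in blocks:
        x = x + h * v(x)       # residual update = Euler step
    return x

T = 10                                 # number of time steps / blocks
blocks = [VelocityBlock() for _ in range(T)]
shape = rng.normal(size=(100, 3))      # point cloud standing in for a 3D shape
warped = flow(shape, blocks)
print(warped.shape)                    # (100, 3)
```

In the paper's setting, the weights of every block would be optimized jointly by backpropagating a regularized matching loss through all intermediate blocks; with small enough velocities per step, the composed map stays bi-Lipschitz.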