Analysis of (sub-)Riemannian PDE-G-CNNs

10/03/2022
by Gijs Bellaard, et al.

Group equivariant convolutional neural networks (G-CNNs) have been successfully applied in geometric deep learning. Typically, G-CNNs have the advantage over CNNs that they do not waste network capacity on learning symmetries that should have been hard-coded into the network. The recently introduced framework of PDE-based G-CNNs (PDE-G-CNNs) generalizes G-CNNs. PDE-G-CNNs have the core advantages that they simultaneously 1) reduce network complexity, 2) increase classification performance, and 3) provide geometric network interpretability. Their implementations consist solely of linear and morphological convolutions with kernels. In this paper we show that the previously suggested approximative morphological kernels do not always approximate the exact kernels accurately. More specifically, we argue that, depending on the spatial anisotropy of the Riemannian metric, one must resort to sub-Riemannian approximations. We solve this problem by providing a new approximative kernel that works regardless of the anisotropy. We provide new theorems with better error estimates for the approximative kernels, and prove that they all carry the same reflectional symmetries as the exact ones. We test the effectiveness of multiple approximative kernels within the PDE-G-CNN framework on two datasets, and observe an improvement with the new approximative kernel. We report that PDE-G-CNNs again allow for a considerable reduction of network complexity while achieving comparable or better performance than G-CNNs and CNNs on the two datasets. Moreover, PDE-G-CNNs have the advantage of better geometric interpretability over G-CNNs, as the morphological kernels are related to association fields from neurogeometry.
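The morphological convolutions mentioned above replace the (+, ×) algebra of ordinary linear convolution with (min, +). A minimal NumPy sketch of such a min-plus (erosion-type) convolution is given below; the function name and the 1D setting are illustrative assumptions for exposition, not the paper's actual implementation, and the kernel `k` would in practice be one of the (approximative) Riemannian or sub-Riemannian kernels discussed in the paper.

```python
import numpy as np

def morphological_convolution(f, k):
    """Min-plus (erosion-type) convolution of a 1D signal f with kernel k:
        (f [-] k)(x) = min_y [ f(x - y) + k(y) ],
    where k is indexed over y = -r, ..., r (odd length assumed).
    Illustrative sketch only: PDE-G-CNNs use such convolutions (and their
    max-plus counterparts) to solve erosion/dilation PDEs."""
    n, m = len(f), len(k)
    r = m // 2  # kernel radius, assuming odd-length kernel
    # Pad with +inf so out-of-range samples never attain the minimum.
    fp = np.pad(f.astype(float), r, constant_values=np.inf)
    out = np.empty(n)
    for x in range(n):
        # fp[x : x + m] holds f(x - r), ..., f(x + r); reversing k aligns
        # k(y) with f(x - y) for y = r, ..., -r.
        out[x] = np.min(fp[x:x + m] + k[::-1])
    return out
```

The max-plus (dilation-type) convolution is obtained analogously by padding with `-inf` and taking `np.max` of `fp[x:x + m] - k[::-1]`.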


