A Structural Approach to the Design of Domain Specific Neural Network Architectures

01/23/2023
by Gerrit Nolte, et al.

This master's thesis concerns the theoretical foundations of geometric deep learning, a framework that aims to provide a structured characterization of neural network architectures, centered on the notions of invariance and equivariance of data with respect to given transformations. The thesis offers a theoretical evaluation of geometric deep learning, compiling results that characterize the learning performance of invariant neural networks.
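The invariance/equivariance distinction at the heart of the thesis can be illustrated with a minimal sketch (not taken from the thesis itself): under a permutation g acting on a set of input points, an invariant map satisfies f(g.x) = f(x), while an equivariant map satisfies f(g.x) = g.f(x).

```python
import numpy as np

# Illustrative sketch: invariance vs. equivariance under a permutation g
# acting on a set of feature vectors (rows of x).

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))   # 5 points, 3 features each
perm = rng.permutation(5)     # a group element g: reorder the points

def invariant_f(x):
    # Sum pooling is permutation-INVARIANT: the output ignores row order.
    return x.sum(axis=0)

def equivariant_f(x):
    # An elementwise map is permutation-EQUIVARIANT: permuting the input
    # permutes the output in the same way.
    return np.tanh(x)

assert np.allclose(invariant_f(x[perm]), invariant_f(x))
assert np.allclose(equivariant_f(x[perm]), equivariant_f(x)[perm])
```

The same pattern generalizes to other groups: a convolution, for instance, is equivariant under translations, and stacking equivariant layers followed by an invariant pooling layer yields an invariant network.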


Related research:

09/15/2023: Unveiling Invariances via Neural Network Pruning
Invariance describes transformations that do not alter data's underlying...

12/23/2021: Revisiting Transformation Invariant Geometric Deep Learning: Are Initial Representations All You Need?
Geometric deep learning, i.e., designing neural networks to handle the u...

12/13/2017: Mathematics of Deep Learning
Recently there has been a dramatic increase in the performance of recogn...

06/18/2021: Training or Architecture? How to Incorporate Invariance in Neural Networks
Many applications require the robustness, or ideally the invariance, of ...

09/24/2022: A Simple Strategy to Provable Invariance via Orbit Mapping
Many applications require robustness, or ideally invariance, of neural n...

05/24/2023: Deep Equivariant Hyperspheres
This paper presents an approach to learning nD features equivariant unde...

01/27/2015: maxDNN: An Efficient Convolution Kernel for Deep Learning with Maxwell GPUs
This paper describes maxDNN, a computationally efficient convolution ker...
