Unveiling Invariances via Neural Network Pruning

09/15/2023
by Derek Xu et al.

Invariance describes transformations that do not alter the underlying semantics of data. Neural networks that preserve natural invariances capture good inductive biases and achieve superior performance; hence, modern networks are handcrafted to handle well-known invariances (e.g., translations). We propose a framework that learns novel network architectures capturing data-dependent invariances via pruning. Our learned architectures consistently outperform dense neural networks in both efficiency and effectiveness on vision and tabular data. We demonstrate our framework on multiple deep learning models across 3 vision and 40 tabular datasets.
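The abstract does not spell out the pruning mechanism, so the following is only a minimal sketch of generic magnitude-based unstructured pruning in PyTorch, not the authors' framework; the toy model, the 80% sparsity level, and the L1 pruning criterion are all illustrative assumptions. It shows the general idea the abstract relies on: pruning a dense network leaves behind a sparse sub-architecture, and when pruning is driven by the training data, the surviving connectivity pattern can encode structure (such as invariances) present in that data.

```python
# Minimal sketch of magnitude-based pruning in PyTorch. This is a
# generic illustration of pruning, NOT the authors' invariance-learning
# method; the architecture and sparsity level are hypothetical.
import torch.nn as nn
import torch.nn.utils.prune as prune

# A hypothetical dense network to be pruned.
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

# Zero out the 80% of weights with the smallest L1 magnitude in each
# Linear layer. In a data-dependent scheme, the network would first be
# trained, so which weights survive reflects the training data.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.8)

# The binary masks left by pruning define a sparse sub-architecture.
zeros = sum(int((m.weight == 0).sum()) for m in model.modules()
            if isinstance(m, nn.Linear))
total = sum(m.weight.numel() for m in model.modules()
            if isinstance(m, nn.Linear))
print(f"global weight sparsity: {zeros / total:.2%}")
```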

Related Research

01/23/2023 · A Structural Approach to the Design of Domain Specific Neural Network Architectures
This is a master's thesis concerning the theoretical ideas of geometric ...

06/25/2020 · Data-dependent Pruning to find the Winning Lottery Ticket
The Lottery Ticket Hypothesis postulates that a freshly initialized neur...

08/21/2017 · Deep vs. Diverse Architectures for Classification Problems
This study compares various superlearner and deep learning architectures...

07/17/2022 · HyperInvariances: Amortizing Invariance Learning
Providing invariances in a given learning task conveys a key inductive b...

08/28/2021 · ThresholdNet: Pruning Tool for Densely Connected Convolutional Networks
Deep neural networks have made significant progress in the field of comp...

07/05/2023 · GIT: Detecting Uncertainty, Out-Of-Distribution and Adversarial Samples using Gradients and Invariance Transformations
Deep neural networks tend to make overconfident predictions and often re...

02/22/2022 · Distilled Neural Networks for Efficient Learning to Rank
Recent studies in Learning to Rank have shown the possibility to effecti...
