Meta-Learning Sparse Compression Networks

Recent work in Deep Learning has re-imagined the representation of data as functions mapping from a coordinate space to an underlying continuous signal. When such functions are approximated by neural networks, this introduces a compelling alternative to the more common multi-dimensional array representation. Recent work on such Implicit Neural Representations (INRs) has shown that, following careful architecture search, INRs can outperform established compression methods such as JPEG (e.g. Dupont et al., 2021). In this paper, we propose crucial steps towards making such ideas scalable: Firstly, we employ state-of-the-art network sparsification techniques to drastically improve compression. Secondly, we introduce the first method allowing sparsification to be employed in the inner loop of commonly used Meta-Learning algorithms, drastically improving both compression and the computational cost of learning INRs. The generality of this formalism allows us to present results on diverse data modalities such as images, manifolds, signed distance functions, 3D shapes and scenes, several of which establish new state-of-the-art results.
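The core idea of sparsifying per-signal adaptation inside a meta-learning inner loop can be sketched in a minimal, hypothetical NumPy illustration (this is not the paper's implementation; all names such as `init_params`, `sparse_inner_loop`, and `keep_frac` are illustrative). A small sine-activated coordinate network plays the role of the INR, a MAML-style inner loop fits it to one signal, and a magnitude mask then keeps only a fraction of the per-signal weight deltas, so each compressed signal stores a sparse update on top of the shared meta-initialization:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(hidden=32):
    """Shared meta-initialization of a tiny coordinate MLP (stand-in for an INR)."""
    return {
        "W1": rng.normal(0.0, 1.0, (1, hidden)),
        "b1": np.zeros(hidden),
        "W2": rng.normal(0.0, 0.1, (hidden, 1)),
        "b2": np.zeros(1),
    }

def loss_and_grads(p, x, y):
    """MSE loss and manual backprop for the two-layer sine network."""
    pre = x @ p["W1"] + p["b1"]
    h = np.sin(pre)                      # sine activation, SIREN-style
    pred = h @ p["W2"] + p["b2"]
    err = pred - y
    loss = np.mean(err ** 2)
    d_pred = 2.0 * err / x.shape[0]
    d_h = d_pred @ p["W2"].T
    d_pre = d_h * np.cos(pre)
    grads = {
        "W2": h.T @ d_pred, "b2": d_pred.sum(0),
        "W1": x.T @ d_pre,  "b1": d_pre.sum(0),
    }
    return loss, grads

def magnitude_mask(delta, keep_frac):
    """Binary mask keeping only the largest-magnitude entries of an update."""
    flat = np.abs(delta).ravel()
    k = max(1, int(keep_frac * flat.size))
    thresh = np.sort(flat)[-k]
    return (np.abs(delta) >= thresh).astype(delta.dtype)

def sparse_inner_loop(meta_p, x, y, steps=50, lr=0.05, keep_frac=0.2):
    """Adapt the shared init to one signal, then sparsify the adaptation delta."""
    p = {k: v.copy() for k, v in meta_p.items()}
    for _ in range(steps):                       # plain gradient-descent inner loop
        _, g = loss_and_grads(p, x, y)
        for k in p:
            p[k] -= lr * g[k]
    sparse = {}
    for k in p:                                  # store only a fraction of the delta
        delta = p[k] - meta_p[k]
        sparse[k] = meta_p[k] + delta * magnitude_mask(delta, keep_frac)
    return sparse
```

In this sketch the per-signal code is the sparse delta, so the storage cost per signal scales with `keep_frac` rather than with the full parameter count; the actual method additionally meta-learns through the sparsification step, which this toy omits.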

