Modularity Trumps Invariance for Compositional Robustness

06/15/2023
by Ian Mason et al.

By default, neural networks are not robust to changes in data distribution. This has been demonstrated with simple image corruptions, such as blurring or adding noise, degrading image classification performance. Many methods have been proposed to mitigate these issues, but models are mostly evaluated on single corruptions. In reality, visual space is compositional in nature: as well as robustness to elemental corruptions, robustness to compositions of corruptions is also needed. In this work, we develop a compositional image classification task in which, given a few elemental corruptions, models are asked to generalize to compositions of these corruptions, i.e., to achieve compositional robustness. We experimentally compare empirical risk minimization with an invariance-building pairwise contrastive loss and, counter to common intuitions in domain generalization, achieve only marginal improvements in compositional robustness by encouraging invariance. To move beyond invariance, following previously proposed inductive biases that model architectures should reflect data structure, we introduce a modular architecture whose structure replicates the compositional nature of the task. We show that this modular approach consistently achieves better compositional robustness than non-modular approaches. We additionally find empirical evidence that the degree of invariance between representations of 'in-distribution' elemental corruptions fails to correlate with robustness to 'out-of-distribution' compositions of corruptions.
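The invariance-building pairwise contrastive loss mentioned above can be sketched as follows. This is a minimal illustration of the general idea, not the paper's exact formulation: it pulls together embeddings of the same images seen under two different elemental corruptions by minimizing one minus their cosine similarity.

```python
import numpy as np

def pairwise_invariance_loss(z_a, z_b):
    """Toy pairwise invariance loss.

    z_a, z_b: (batch, dim) arrays of embeddings of the *same* images
    under two different elemental corruptions.
    """
    # Normalize each embedding to unit length.
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    # 1 - cosine similarity, averaged over the batch; zero when the
    # paired embeddings are identical directions (perfect invariance).
    return float(np.mean(1.0 - np.sum(z_a * z_b, axis=1)))
```

When the two views map to identical embeddings the loss is zero, and it grows toward 1 as the paired embeddings become orthogonal, so minimizing it encourages corruption-invariant representations.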

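The modular idea, an architecture whose structure mirrors the compositional structure of the corruptions, can be illustrated with a toy sketch. The design below assumes a shared backbone plus one small residual module per elemental corruption, with a composition of corruptions activating all of its constituent modules; the names and structure are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

class ModularNet:
    """Toy modular network: a shared linear backbone plus one residual
    module per elemental corruption. A composed corruption (e.g. blur
    then noise) activates every module in the composition."""

    def __init__(self, dim, corruptions, seed=0):
        rng = np.random.default_rng(seed)
        self.backbone = rng.standard_normal((dim, dim)) * 0.1
        # One small corruption-specific module per elemental corruption.
        self.modules = {c: rng.standard_normal((dim, dim)) * 0.1
                        for c in corruptions}

    def forward(self, x, active):
        # Shared features from the backbone.
        h = x @ self.backbone
        # Apply each active corruption module as a residual update, so
        # compositions of corruptions compose the corresponding modules.
        for c in active:  # e.g. ["blur", "noise"] for a composition
            h = h + h @ self.modules[c]
        return h
```

Only the modules for the corruptions present in an input are active, so robustness to a new composition is obtained by composing modules that were each trained on a single elemental corruption.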

