Noisy Learning for Neural ODEs Acts as a Robustness Locus Widening

06/16/2022
by Martin Gonzalez, et al.

We investigate the problems and challenges of evaluating the robustness of Differential-Equation-based (DE) networks against synthetic distribution shifts. We propose a novel, simple accuracy metric that can be used to evaluate intrinsic robustness and to validate dataset corruption simulators. We also offer methodological recommendations for evaluating the many faces of neural DEs' robustness and for rigorously comparing them with their discrete counterparts. We then use these criteria to evaluate a cheap data augmentation technique as a reliable way to demonstrate the natural robustness of neural ODEs against simulated image corruptions across multiple datasets.
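
The abstract does not spell out the augmentation, so the following is a minimal, hypothetical sketch of what "noisy learning" for a neural ODE classifier could look like: additive Gaussian input noise applied during training of a small ODE block integrated with a fixed-step RK4 solver. The architecture, noise level sigma, and solver settings are illustrative assumptions, not the authors' configuration.

```python
# Hedged sketch (not the authors' exact setup): Gaussian input-noise augmentation
# while training a small neural ODE classifier in PyTorch.
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """Vector field f(t, h) parameterising dh/dt = f(t, h)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

    def forward(self, t, h):
        return self.net(h)

def rk4_integrate(func, h0, t0=0.0, t1=1.0, steps=10):
    """Fixed-step Runge-Kutta 4 solver; returns the state at t1."""
    h, t = h0, t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        k1 = func(t, h)
        k2 = func(t + dt / 2, h + dt * k1 / 2)
        k3 = func(t + dt / 2, h + dt * k2 / 2)
        k4 = func(t + dt, h + dt * k3)
        h = h + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t = t + dt
    return h

class NeuralODEClassifier(nn.Module):
    """Linear embedding -> ODE block -> linear classification head (illustrative)."""
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.embed = nn.Linear(in_dim, 64)
        self.odefunc = ODEFunc(64)
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):
        h = self.embed(x)
        h = rk4_integrate(self.odefunc, h)
        return self.head(h)

def train_step(model, optimizer, x, y, sigma=0.1):
    """One optimisation step with additive Gaussian input noise ("noisy learning")."""
    model.train()
    x_noisy = x + sigma * torch.randn_like(x)  # cheap augmentation: perturb the inputs
    loss = nn.functional.cross_entropy(model(x_noisy), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    model = NeuralODEClassifier(in_dim=784, num_classes=10)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))  # dummy batch
    print(train_step(model, opt, x, y))
```

At evaluation time, one would compare accuracy on clean inputs against accuracy on simulated corruptions to probe the kind of robustness the paper studies.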


