Robust and Active Learning for Deep Neural Network Regression

07/28/2021
by   Xi Li, et al.

We describe a gradient-based method to discover local error maximizers of a deep neural network (DNN) used for regression, assuming the availability of an "oracle" capable of providing real-valued supervision (a regression target) for samples. For example, the oracle could be a numerical solver which, operationally, is much slower than the DNN. Given a discovered set of local error maximizers, the DNN is either fine-tuned or retrained in the manner of active learning.
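The loop described above (ascend the error surface from random restarts, query the oracle at the discovered local maximizers, then retrain on the enlarged set) can be sketched as follows. Everything here is illustrative, not the paper's implementation: a cheap analytic function stands in for the slow oracle solver, a polynomial regressor stands in for the DNN, and the input gradient of the squared error is estimated by central finite differences instead of autodiff.

```python
import numpy as np

rng = np.random.default_rng(0)

def oracle(x):
    """Stand-in for the slow numerical solver providing regression targets."""
    return np.sin(x)

def fit_surrogate(xs, ys, degree=5):
    """Stand-in for (re)training the DNN: a least-squares polynomial fit."""
    coeffs = np.polyfit(xs, ys, degree)
    return lambda x: np.polyval(coeffs, x)

def squared_error(model, x):
    return (model(x) - oracle(x)) ** 2

def find_error_maximizer(model, x0, lo, hi, lr=0.05, steps=200, eps=1e-4):
    """Gradient ascent on the squared error with respect to the input x.

    The gradient is a central finite difference; with a real DNN it would
    come from autodiff. Steps that fail to increase the error are rejected
    (with the step size halved), so the returned point never has lower
    error than the starting point x0.
    """
    x = x0
    for _ in range(steps):
        g = (squared_error(model, x + eps)
             - squared_error(model, x - eps)) / (2 * eps)
        x_new = np.clip(x + lr * np.sign(g), lo, hi)  # sign step for stability
        if squared_error(model, x_new) > squared_error(model, x):
            x = x_new
        else:
            lr *= 0.5  # backtrack when overshooting the local maximum
            if lr < 1e-6:
                break
    return x

lo, hi = 0.0, 2 * np.pi
xs = np.linspace(lo, hi, 8)              # small initial training set
model = fit_surrogate(xs, oracle(xs))

# Multi-start ascent: each restart converges toward a local error maximizer.
starts = rng.uniform(lo, hi, size=10)
maximizers = np.array([find_error_maximizer(model, x0, lo, hi)
                       for x0 in starts])

# Active-learning round: query the oracle at the maximizers and retrain.
xs2 = np.concatenate([xs, maximizers])
model2 = fit_surrogate(xs2, oracle(xs2))
```

In this sketch the oracle is also evaluated inside the finite-difference step, which would be expensive with a real solver; it is kept only to make the toy runnable, and the accept-only-improving rule is one simple way to keep the ascent from oscillating.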


Related research

Backdoor Attack and Defense for Deep Regression (09/06/2021)
We demonstrate a backdoor attack on a deep neural network used for regre...

Active Learning over DNN: Automated Engineering Design Optimization for Fluid Dynamics Based on Self-Simulated Dataset (01/18/2020)
Optimizing fluid-dynamic performance is an important engineering task. T...

Fast Evaluation of Low-Thrust Transfers via Deep Neural Networks (02/11/2019)
The design of low-thrust-based multitarget interplanetary missions requi...

Bait and Switch: Online Training Data Poisoning of Autonomous Driving Systems (11/08/2020)
We show that by controlling parts of a physical environment in which a p...

Active Learning for Deep Neural Networks on Edge Devices (06/21/2021)
When dealing with deep neural network (DNN) applications on edge devices...

Technical Report: NEMO DNN Quantization for Deployment Model (04/13/2020)
This technical report aims at defining a formal framework for Deep Neura...

Optimal Sampling Density for Nonparametric Regression (05/25/2021)
We propose a novel active learning strategy for regression, which is mod...
