Incorrect by Construction: Fine Tuning Neural Networks for Guaranteed Performance on Finite Sets of Examples

08/03/2020
by Ivan Papusha, et al.

There is great interest in using formal methods to guarantee the reliability of deep neural networks. However, the same techniques can also be used to implant carefully selected input-output pairs. We present initial results on a novel technique for using SMT solvers to fine-tune the weights of a ReLU neural network so that its outputs are guaranteed on a finite set of particular examples. This procedure can be used to ensure correct performance on key examples, but it could also be used to insert difficult-to-find incorrect behaviors that trigger unexpected outputs. We demonstrate the approach by fine-tuning an MNIST network to misclassify a particular image, and discuss the potential for the approach to compromise the reliability of freely shared machine learning models.

Related research:

- Gradual Tuning: a better way of Fine Tuning the parameters of a Deep Neural Network (11/28/2017)
- How to fine-tune deep neural networks in few-shot learning? (12/01/2020)
- Distance-Based Regularisation of Deep Networks for Fine-Tuning (02/19/2020)
- CNN Image Retrieval Learns from BoW: Unsupervised Fine-Tuning with Hard Examples (04/08/2016)
- Physics-Based Deep Neural Networks for Beam Dynamics in Charged Particle Accelerators (07/07/2020)
- Using Neural Networks to improve classical Operating System Fingerprinting techniques (06/09/2010)
- Safe Predictors for Enforcing Input-Output Specifications (01/29/2020)
