
Improving Compositionality of Neural Networks by Decoding Representations to Inputs

by Mike Wu et al.

In traditional software programs, we take for granted how easy it is to debug code by tracing program logic from variables back to input, apply unit tests and assertion statements to block erroneous behavior, and compose programs together. But as the programs we write grow more complex, it becomes hard to apply traditional software to applications like computer vision or natural language. Although deep learning programs have demonstrated strong performance on these applications, they sacrifice many of the functionalities of traditional software programs. In this paper, we work towards bridging the benefits of traditional and deep learning programs by jointly training a generative model to constrain neural network activations to "decode" back to inputs. Doing so enables practitioners to probe and track information encoded in activation(s), apply assertion-like constraints on what information is encoded in an activation, and compose separate neural networks together in a plug-and-play fashion. In our experiments, we demonstrate applications of decodable representations to out-of-distribution detection, adversarial examples, calibration, and fairness – while matching standard neural networks in accuracy.
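The core idea — jointly training the task network with a decoder so that intermediate activations can be "decoded" back to the input — can be illustrated with a minimal sketch. The toy linear encoder, classifier head, decoder, and the `lam` weighting are illustrative assumptions, not the paper's actual architecture; the point is only the shape of the joint objective: task loss plus a reconstruction term on the activation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (assumptions for illustration only).
d_in, d_hid, n_cls = 8, 4, 3
W = rng.normal(size=(d_hid, d_in)) * 0.1   # "encoder" producing the activation
V = rng.normal(size=(n_cls, d_hid)) * 0.1  # classifier head on the activation
D = rng.normal(size=(d_in, d_hid)) * 0.1   # decoder mapping activation -> input

def softmax(z):
    z = z - z.max()           # numerical stability
    e = np.exp(z)
    return e / e.sum()

def joint_loss(x, y, lam=1.0):
    h = W @ x                          # intermediate activation
    probs = softmax(V @ h)             # task prediction
    ce = -np.log(probs[y] + 1e-12)     # classification (cross-entropy) loss
    x_hat = D @ h                      # activation decoded back to input space
    recon = np.mean((x_hat - x) ** 2)  # decodability constraint on the activation
    return ce + lam * recon, x_hat

x = rng.normal(size=d_in)
loss, x_hat = joint_loss(x, y=1)
```

Once activations decode back to inputs, a practitioner can inspect `x_hat` to probe what information `h` retains, or assert properties of `x_hat` (e.g. that a protected attribute is not recoverable) in the spirit of the assertion-like constraints the abstract describes.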



