Identifying Weights and Architectures of Unknown ReLU Networks

10/02/2019
by David Rolnick, et al.

The output of a neural network depends on its parameters in a highly nonlinear way, and it is widely assumed that a network's parameters cannot be identified from its outputs. Here, we show that in many cases it is possible to reconstruct the architecture, weights, and biases of a deep ReLU network given the ability to query the network. ReLU networks are piecewise linear and the boundaries between pieces correspond to inputs for which one of the ReLUs switches between inactive and active states. Thus, first-layer ReLUs can be identified (up to sign and scaling) based on the orientation of their associated hyperplanes. Later-layer ReLU boundaries bend when they cross earlier-layer boundaries and the extent of bending reveals the weights between them. Our algorithm uses this to identify the units in the network and weights connecting them (up to isomorphism). The fact that considerable parts of deep networks can be identified from their outputs has implications for security, neuroscience, and our understanding of neural networks.
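To make the boundary-based idea concrete, here is a minimal numerical sketch, not the paper's full reconstruction algorithm: it bisects along a random segment of input space to locate a point where some ReLU flips state, then reads off that unit's first-layer weight vector (up to sign and scale) from the jump in the black-box gradient across the boundary. The toy one-hidden-layer network, the step sizes, and the helper names (f, slope, grad, find_boundary) are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the unknown network: one hidden ReLU layer,
# two inputs, scalar output. The recovery code below only queries f.
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
w2, b2 = rng.normal(size=4), rng.normal()

def f(x):
    """Black-box query: scalar output of the ReLU network."""
    return w2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def slope(x, d, eps=1e-8):
    """Directional derivative of f at x along unit direction d
    (exact away from boundaries, since f is piecewise linear)."""
    return (f(x + eps * d) - f(x - eps * d)) / (2 * eps)

def grad(x):
    """Full gradient of f at x from coordinate directional derivatives."""
    return np.array([slope(x, e) for e in np.eye(x.size)])

def find_boundary(x0, x1, tol=1e-6):
    """Bisect along the segment [x0, x1] for a point where the slope of
    f changes, i.e. where some ReLU switches between inactive and
    active. Assumes the slopes at the two endpoints differ."""
    d = (x1 - x0) / np.linalg.norm(x1 - x0)
    lo, hi = 0.0, np.linalg.norm(x1 - x0)
    s_lo = slope(x0, d)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if np.isclose(slope(x0 + mid * d, d), s_lo):
            lo = mid
        else:
            hi = mid
    return x0 + 0.5 * (lo + hi) * d

# Pick a random segment whose endpoint slopes differ, so at least one
# ReLU boundary crosses it.
while True:
    x0, x1 = rng.normal(size=2), rng.normal(size=2)
    d = (x1 - x0) / np.linalg.norm(x1 - x0)
    if not np.isclose(slope(x0, d), slope(x1, d)):
        break

x_star = find_boundary(x0, x1)

# Gradients just on either side of the boundary differ by a multiple of
# the switching unit's weight vector (a row of W1), assuming no other
# boundary lies within delta of x_star.
delta = 1e-4
normal = grad(x_star + delta * d) - grad(x_star - delta * d)
normal /= np.linalg.norm(normal)
print(normal)                                          # recovered direction
print(W1 / np.linalg.norm(W1, axis=1, keepdims=True))  # true rows (up to sign)
```

The printed direction should match one normalized row of W1 up to sign, which is exactly the "up to sign and scaling" ambiguity in the abstract. Recovering later-layer weights additionally requires tracing how a unit's boundary bends where it crosses earlier-layer boundaries, which this sketch does not attempt.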
