Any Deep ReLU Network is Shallow

06/20/2023
by Mattia Jacopo Villani, et al.

We constructively prove that every deep ReLU network can be rewritten as a functionally identical three-layer network with weights valued in the extended reals. Based on this proof, we provide an algorithm that, given a deep ReLU network, finds the explicit weights of the corresponding shallow network. The resulting shallow network is transparent and can be used to generate explanations of the model's behaviour.
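The structural fact behind such a rewriting is that a ReLU network is piecewise affine: once the set of active ReLUs is fixed, every layer acts linearly, so the whole network collapses to a single affine map on that activation region. The sketch below illustrates this collapse on a toy network with hypothetical weights; it is not the authors' algorithm and does not construct the extended-real weights of the shallow network, which in the paper presumably encode the region selection itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small deep ReLU network: three hidden layers, scalar output.
# (Hypothetical toy weights chosen for illustration only.)
dims = [2, 4, 4, 4, 1]
Ws = [rng.standard_normal((m, n)) for n, m in zip(dims[:-1], dims[1:])]
bs = [rng.standard_normal(m) for m in dims[1:]]

def deep_forward(x):
    """Evaluate the deep network, recording each layer's activation pattern."""
    pattern = []
    h = x
    for W, b in zip(Ws[:-1], bs[:-1]):
        z = W @ h + b
        pattern.append(z > 0)          # which ReLUs fire on this input
        h = np.maximum(z, 0.0)
    y = Ws[-1] @ h + bs[-1]
    return y, pattern

def affine_piece(pattern):
    """Collapse the deep net to the affine map (A, c) it computes on the
    activation region identified by `pattern`: f(x) = A @ x + c there."""
    A = np.eye(dims[0])
    c = np.zeros(dims[0])
    for W, b, p in zip(Ws[:-1], bs[:-1], pattern):
        D = np.diag(p.astype(float))   # ReLU gates frozen on this region
        A = D @ (W @ A)
        c = D @ (W @ c + b)
    A = Ws[-1] @ A
    c = Ws[-1] @ c + bs[-1]
    return A, c

x = rng.standard_normal(dims[0])
y, pattern = deep_forward(x)
A, c = affine_piece(pattern)
print(y, A @ x + c)                    # agree up to floating-point error
```

Because the deep network agrees with one of finitely many affine maps on each region, a sufficiently expressive shallow network that can select the correct region recovers the deep network exactly, which is what the paper's three-layer construction makes explicit.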
