Using activation histograms to bound the number of affine regions in ReLU feed-forward neural networks

03/31/2021
by   Peter Hinz, et al.

Several current bounds on the maximal number of affine regions of a ReLU feed-forward neural network are special cases of the framework of [1], which relies on layer-wise activation histogram bounds. We analyze and partially solve a problem in algebraic topology whose full solution would fully exploit this framework. Our partial solution already induces slightly tighter bounds and offers insight into how parameter initialization methods can affect the number of regions. Furthermore, we extend the framework to allow the composition of subnetwork-wise instead of layer-wise activation histogram bounds, reducing the number of required compositions, which negatively affect the tightness of the resulting bound.
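The quantity being bounded can be illustrated empirically. The sketch below (a toy illustration, not the paper's framework) counts the distinct ReLU activation patterns a small random network realizes along a 1-D input line; each distinct pattern corresponds to one affine region, so the count is an empirical lower bound on the number of regions crossed by that line. The network sizes and the sampling grid are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small 1 -> 8 -> 8 -> 1 ReLU feed-forward network with random
# (standard-normal) weights; architecture chosen only for illustration.
W1, b1 = rng.standard_normal((8, 1)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((8, 8)), rng.standard_normal(8)

def activation_pattern(x):
    """Binary on/off pattern of all hidden ReLU units for scalar input x."""
    h1 = W1 @ np.array([x]) + b1          # pre-activations, layer 1
    a1 = np.maximum(h1, 0.0)              # ReLU
    h2 = W2 @ a1 + b2                     # pre-activations, layer 2
    return tuple((h1 > 0).tolist() + (h2 > 0).tolist())

# Sweep a dense grid along a line in input space; on each affine region
# the activation pattern is constant, so distinct patterns lower-bound
# the number of regions the line passes through.
xs = np.linspace(-10, 10, 20001)
patterns = {activation_pattern(x) for x in xs}
n_regions = len(patterns)
print(n_regions)
```

Refining this per-line count into a bound over the whole input space is exactly where the layer-wise (or, in the extended framework, subnetwork-wise) activation histogram bounds come in.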
