On Size-Independent Sample Complexity of ReLU Networks

06/03/2023
by Mark Sellke, et al.

We study the sample complexity of learning ReLU neural networks from the point of view of generalization. Given norm constraints on the weight matrices, a common approach is to estimate the Rademacher complexity of the associated function class. Previously, Golowich, Rakhlin, and Shamir (2020) obtained a bound independent of the network size (scaling with a product of Frobenius norms), except for a factor of the square root of the depth. We give a refinement which often has no explicit depth dependence at all.
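For context, here is a minimal LaTeX sketch of the two scalings at issue. The notation (d for the depth, n for the sample size, B for the input-norm bound, M_F(j) for the Frobenius-norm bound on the j-th weight matrix) is assumed for illustration rather than quoted from either paper, and all constants are suppressed.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Golowich-Rakhlin-Shamir-style bound: independent of the network size
% except for the explicit square-root-of-depth factor.
\[
  \mathcal{R}_n(\mathcal{F}) \;\lesssim\;
  \frac{B\,\sqrt{d}\,\prod_{j=1}^{d} M_F(j)}{\sqrt{n}}
\]
% The refinement described in the abstract often removes the explicit
% \sqrt{d} factor, giving a bound of the form
\[
  \mathcal{R}_n(\mathcal{F}) \;\lesssim\;
  \frac{B\,\prod_{j=1}^{d} M_F(j)}{\sqrt{n}},
\]
% with no explicit depth dependence. Constants and the precise regime
% in which this holds are per the paper; this display is schematic only.
\end{document}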
