Not All Lotteries Are Made Equal

06/16/2022
by Surya Kant Sahu et al.

The Lottery Ticket Hypothesis (LTH) states that a reasonably sized dense neural network contains a sparse sub-network which, when trained from the same initialization, matches or exceeds the performance of the dense network. This work investigates the relationship between model size and the ease of finding these sparse sub-networks. Our experiments show that, surprisingly, under a finite compute budget, smaller models benefit more from Ticket Search (TS).
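The standard procedure for Ticket Search is iterative magnitude pruning (train, prune the smallest-magnitude weights, rewind to the original initialization, repeat). A minimal NumPy sketch of that loop, not the paper's actual code; the `train_fn` callback and all names here are illustrative assumptions:

```python
import numpy as np

def magnitude_prune(weights, mask, prune_frac):
    """Zero out the smallest-magnitude fraction of the surviving weights."""
    surviving = np.abs(weights[mask])
    k = int(len(surviving) * prune_frac)
    if k == 0:
        return mask
    threshold = np.sort(surviving)[k - 1]
    # Keep only weights strictly above the pruning threshold.
    return mask & (np.abs(weights) > threshold)

def iterative_magnitude_pruning(init_weights, train_fn, rounds=3, prune_frac=0.2):
    """Ticket search via train -> prune -> rewind-to-init cycles.

    train_fn(weights, mask) is assumed to return trained weights for the
    masked network; here it stands in for a full training run.
    """
    mask = np.ones_like(init_weights, dtype=bool)
    for _ in range(rounds):
        trained = train_fn(init_weights * mask, mask)
        mask = magnitude_prune(trained, mask, prune_frac)
    # Rewind: the candidate "winning ticket" is the original init under the mask.
    return init_weights * mask, mask
```

With `prune_frac=0.2` and three rounds, roughly half the weights survive; the paper's question is how the cost of running enough such rounds scales with model size.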


Related research

- 01/08/2021 · Good Students Play Big Lottery Better ("Lottery ticket hypothesis suggests that a dense neural network contains ...")
- 05/19/2019 · Sparse Transfer Learning via Winning Lottery Tickets ("The recently proposed Lottery Ticket Hypothesis of Frankle and Carbin (2...")
- 06/13/2021 · Towards Understanding Iterative Magnitude Pruning: Why Lottery Tickets Win ("The lottery ticket hypothesis states that sparse subnetworks exist in ra...")
- 12/13/2021 · On the Compression of Natural Language Models ("Deep neural networks are effective feature extractors but they are prohi...")
- 08/02/2019 · Network with Sub-Networks ("We introduce network with sub-network, a neural network which their weig...")
- 12/10/2019 · Winning the Lottery with Continuous Sparsification ("The Lottery Ticket Hypothesis from Frankle & Carbin (2019) conjectures...")
- 06/12/2020 · How many winning tickets are there in one DNN? ("The recent lottery ticket hypothesis proposes that there is one sub-netw...")
