Sanity Checks for Lottery Tickets: Does Your Winning Ticket Really Win the Jackpot?

by Xiaolong Ma, et al.

There have been long-standing controversies and inconsistencies in the literature over the experimental setup and the criteria for identifying a "winning ticket." To reconcile them, we revisit the definition of the lottery ticket hypothesis under more comprehensive and rigorous conditions. Under our new definition, we present concrete evidence clarifying whether winning tickets exist across the major DNN architectures and applications. Through extensive experiments, we quantitatively analyze the correlations between winning tickets and various experimental factors, and empirically study the patterns in our observations. We find that key training hyperparameters, such as the learning rate and the number of training epochs, as well as architectural characteristics such as capacity and residual connections, are all highly correlated with whether and when winning tickets can be identified. Based on this analysis, we summarize a guideline for parameter settings with respect to specific architectural characteristics, which we hope will catalyze research progress on the lottery ticket hypothesis.
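For readers unfamiliar with how a "winning ticket" is identified, the standard procedure is iterative magnitude pruning (IMP): train the network, prune the smallest-magnitude weights, rewind the surviving weights to their initial values, and repeat. The sketch below is a minimal, hypothetical illustration on a toy weight vector; the `train` step is a stand-in for real SGD training, and the round count and pruning fraction are placeholder values, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(weights, mask, steps=10):
    """Placeholder for real training: randomly perturbs the surviving
    weights (an assumption standing in for SGD on an actual network)."""
    for _ in range(steps):
        weights = weights + 0.1 * rng.standard_normal(weights.shape)
    return weights * mask

def iterative_magnitude_pruning(init_weights, rounds=3, prune_frac=0.2):
    mask = np.ones_like(init_weights)
    weights = init_weights.copy()
    for _ in range(rounds):
        weights = train(weights, mask)
        # Prune the smallest-magnitude weights among the survivors.
        alive = np.abs(weights[mask == 1])
        threshold = np.quantile(alive, prune_frac)
        mask = mask * (np.abs(weights) > threshold)
        # Rewind: the candidate "winning ticket" is the pruning mask
        # applied to the original initialization.
        weights = init_weights * mask
    return mask, weights

init = rng.standard_normal(100)
mask, ticket = iterative_magnitude_pruning(init)
print(int(mask.sum()))  # number of weights surviving three 20% pruning rounds
```

Whether the ticket found this way matches the retrained dense network's accuracy, and under which hyperparameters, is exactly the question the paper's sanity checks probe.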
