The Untapped Potential of Off-the-Shelf Convolutional Neural Networks

03/17/2021
by   Matthew Inkawhich, et al.

Over recent years, a myriad of novel convolutional network architectures have been developed to advance state-of-the-art performance on challenging recognition tasks. As computational resources improve, a great deal of effort has been devoted to efficiently scaling up existing designs and generating new architectures with Neural Architecture Search (NAS) algorithms. While network topology has proven to be a critical factor in model performance, we show that significant gains are being left on the table by keeping topology static at inference time. Due to challenges such as scale variation, we should not expect a static model configured to perform well across a training dataset to be optimally configured for every test input. In this work, we seek to expose the exciting potential of inference-time-dynamic models. By allowing just four layers to dynamically change configuration at inference time, we show that existing off-the-shelf models like ResNet-50 are capable of over 95% top-1 accuracy on ImageNet. This level of performance currently exceeds that of models with over 20x more parameters and significantly more complex training procedures.
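As a rough illustration of what "changing a layer's configuration at inference time" can mean, the sketch below reuses a pretrained convolution's weights while varying its dilation per forward pass. This is a minimal sketch under assumed details, not the authors' exact mechanism: the class name DynamicConv2d, the per-call dilation argument, and the dilation sweep are all hypothetical.

```python
import torch
import torch.nn.functional as F

# Hypothetical wrapper: reuse a pretrained conv's weights, but let the
# caller pick the dilation (and hence receptive field) at inference time.
class DynamicConv2d(torch.nn.Module):
    def __init__(self, conv: torch.nn.Conv2d):
        super().__init__()
        self.conv = conv  # pretrained weights are reused unchanged

    def forward(self, x: torch.Tensor, dilation: int = 1) -> torch.Tensor:
        # Same kernel, different effective receptive field; padding is
        # adjusted so the output's spatial resolution is preserved.
        k = self.conv.kernel_size[0]
        pad = dilation * (k - 1) // 2
        return F.conv2d(x, self.conv.weight, self.conv.bias,
                        stride=self.conv.stride, padding=pad,
                        dilation=dilation, groups=self.conv.groups)

# Example: sweep a few candidate configurations for the same input at
# test time (how the best configuration is selected is a separate,
# paper-specific policy not shown here).
layer = DynamicConv2d(torch.nn.Conv2d(64, 64, kernel_size=3, padding=1))
x = torch.randn(1, 64, 56, 56)
outputs = {d: layer(x, dilation=d) for d in (1, 2, 3)}
```

Because no weights change, the same off-the-shelf checkpoint can be evaluated under many configurations; only the selection policy decides which configuration's prediction to keep for a given input.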

