Learning to Learn Parameterized Classification Networks for Scalable Input Images

07/13/2020
by   Duo Li, et al.

Convolutional Neural Networks (CNNs) do not behave predictably when the input resolution changes, which makes it infeasible to deploy a single model across different input image resolutions. To achieve efficient and flexible image classification at runtime, we employ meta learners to generate the convolutional weights of main networks for various input scales, while maintaining privatized Batch Normalization layers per scale. To further improve training, we apply knowledge distillation on the fly over model predictions produced at different input resolutions. The learned meta network can dynamically parameterize main networks to act on input images of arbitrary size, with consistently better accuracy than individually trained models. Extensive experiments on ImageNet demonstrate that our method achieves an improved accuracy-efficiency trade-off during adaptive inference. By switching among executable input resolutions, our method satisfies the requirement of fast adaptation in different resource-constrained environments. Code and models are available at https://github.com/d-li14/SAN.
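The two core ideas in the abstract, a meta learner that emits convolution weights conditioned on the input scale, and privatized BatchNorm statistics kept per resolution, can be sketched minimally as follows. All names, shapes, and the tiny MLP here are illustrative assumptions for exposition, not the paper's actual architecture (see the linked repository for the real implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical main-network layer size: an 8->16 channel 3x3 convolution.
IN_CH, OUT_CH, K = 8, 16, 3

# Hypothetical meta learner: a tiny 2-layer MLP mapping a normalized
# scale scalar to a flattened conv kernel for the main network.
W1 = rng.normal(scale=0.1, size=(1, 32))
W2 = rng.normal(scale=0.1, size=(32, OUT_CH * IN_CH * K * K))

def generate_conv_weights(scale: float) -> np.ndarray:
    """Generate main-network conv weights conditioned on input scale."""
    h = np.tanh(np.array([[scale / 224.0]]) @ W1)  # scale embedding
    return (h @ W2).reshape(OUT_CH, IN_CH, K, K)

# Privatized BatchNorm statistics: one private (mean, var) set per
# executable input resolution, since feature statistics shift with scale.
bn_stats = {s: {"mean": np.zeros(OUT_CH), "var": np.ones(OUT_CH)}
            for s in (128, 160, 192, 224)}

w_small = generate_conv_weights(128)
w_large = generate_conv_weights(224)
print(w_small.shape)  # (16, 8, 3, 3)
```

At inference time, switching the executable resolution amounts to one cheap forward pass through the meta learner plus a lookup of that resolution's private BN statistics, rather than loading a separately trained model per scale.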

Related research

07/19/2020: Resolution Switchable Networks for Runtime Efficient Image Recognition
We propose a general method to train a single convolutional neural netwo...

07/14/2022: Dynamic Low-Resolution Distillation for Cost-Efficient End-to-End Text Spotting
End-to-end text spotting has attached great attention recently due to it...

12/01/2022: ResFormer: Scaling ViTs with Multi-Resolution Training
Vision Transformers (ViTs) have achieved overwhelming success, yet they ...

03/30/2021: Automated Cleanup of the ImageNet Dataset by Model Consensus, Explainability and Confident Learning
The convolutional neural networks (CNNs) trained on ILSVRC12 ImageNet we...

03/12/2019: Universally Slimmable Networks and Improved Training Techniques
Slimmable networks are a family of neural networks that can instantly ad...

06/22/2023: Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective
We present a new dataset condensation framework termed Squeeze, Recover ...

02/13/2023: Stitchable Neural Networks
The public model zoo containing enormous powerful pretrained model famil...
