
Fast Training of Convolutional Neural Networks via Kernel Rescaling

10/12/2016
by Pedro Porto Buarque de Gusmão, et al. (Politecnico di Torino; Telecom Italia SpA)

Training deep Convolutional Neural Networks (CNNs) is a time-consuming task that may take weeks to complete. In this article we propose a novel, theoretically founded method for reducing CNN training time without incurring any loss in accuracy. The basic idea is to pre-train the network using lower-resolution kernels and input images, and then refine the results at full resolution by exploiting the spatial scaling property of convolutions. We apply our method to the ImageNet winner OverFeat and to the more recent ResNet architecture and show a reduction in training time of nearly 20%.
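As a rough illustration of the spatial-scaling idea, the sketch below upsamples a trained low-resolution kernel to initialize its full-resolution counterpart. The bilinear interpolation scheme and the function name are assumptions for illustration; the paper's exact rescaling procedure may differ.

```python
import numpy as np

def rescale_kernel(kernel, new_size):
    """Upsample a square 2-D convolution kernel to new_size x new_size
    by bilinear interpolation (illustrative sketch, not the paper's
    exact method). The upsampled kernel can then seed full-resolution
    training after a low-resolution pre-training phase."""
    old = kernel.shape[0]
    # Sample the old kernel's taps on a finer, evenly spaced grid.
    coords = np.linspace(0.0, old - 1.0, new_size)
    out = np.empty((new_size, new_size))
    for i, y in enumerate(coords):
        y0 = int(np.floor(y))
        y1 = min(y0 + 1, old - 1)
        fy = y - y0
        for j, x in enumerate(coords):
            x0 = int(np.floor(x))
            x1 = min(x0 + 1, old - 1)
            fx = x - x0
            # Blend the four surrounding taps.
            out[i, j] = ((1 - fy) * (1 - fx) * kernel[y0, x0]
                         + (1 - fy) * fx * kernel[y0, x1]
                         + fy * (1 - fx) * kernel[y1, x0]
                         + fy * fx * kernel[y1, x1])
    return out
```

For example, a 3x3 kernel learned during the low-resolution phase can be expanded to 5x5 before full-resolution refinement; the corner taps are preserved exactly, while interior taps are interpolated.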

