Knowledge Distillation Circumvents Nonlinearity for Optical Convolutional Neural Networks

02/26/2021
by Jinlin Xiang, et al.

In recent years, Convolutional Neural Networks (CNNs) have enabled ubiquitous image processing applications. As such, CNNs require a fast runtime (forward propagation) to process high-resolution visual streams in real time. This remains a challenging task even with state-of-the-art graphics and tensor processing units. The computational bottleneck lies primarily in the convolutional layers. Performing these operations in the Fourier domain is a promising way to accelerate forward propagation, since it transforms convolutions into elementwise multiplications, which are considerably faster to compute for large kernels. Furthermore, such computation could be implemented in an optical 4f system, which operates orders of magnitude faster than electronic hardware. However, a major challenge in using this spectral approach, as well as in any optical implementation of CNNs, is the nonlinearity required between consecutive convolutional layers, without which CNN performance drops dramatically. Here, we propose a Spectral CNN Linear Counterpart (SCLC) network architecture and develop a Knowledge Distillation (KD) approach to circumvent the need for a nonlinearity and successfully train such networks. While KD is known in machine learning as an effective process for network pruning, we adapt the approach to transfer knowledge from a nonlinear network (teacher) to a linear counterpart (student). We show that the KD approach easily surpasses the standard linear version of a CNN and can approach the performance of the nonlinear network. Our simulations show that, by exploiting the ability to increase the input image resolution, the proposed linear 4f optical network can operate more efficiently than a nonlinear network of the same accuracy on two fundamental image processing tasks: (i) object classification and (ii) semantic segmentation.
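The two ingredients described above, convolution as an elementwise product in the Fourier domain and distillation from a nonlinear teacher into a linear student, can be summarized in a short sketch. The code below is illustrative only: it assumes a PyTorch-style API, and the names SpectralConv2d and distillation_loss are hypothetical, not taken from the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpectralConv2d(nn.Module):
    """Linear convolution realized as an elementwise product in the Fourier
    domain, i.e. the operation a 4f optical system performs natively."""

    def __init__(self, in_channels, out_channels, height, width):
        super().__init__()
        # One learnable complex filter per (output, input) channel pair,
        # stored directly in the half-spectrum Fourier domain.
        self.weight = nn.Parameter(
            0.02 * torch.randn(out_channels, in_channels, height, width // 2 + 1,
                               dtype=torch.cfloat)
        )

    def forward(self, x):
        # x: (batch, in_channels, H, W)
        X = torch.fft.rfft2(x)                                 # to the Fourier domain
        Y = torch.einsum("bihw,oihw->bohw", X, self.weight)    # elementwise product, summed over input channels
        return torch.fft.irfft2(Y, s=x.shape[-2:])             # back to the spatial domain


def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard KD objective: soften the nonlinear teacher's outputs and mix
    the resulting KL term with cross-entropy on the ground-truth labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In this sketch, a purely linear student would stack such spectral layers (with no activation functions between them) and be trained against a frozen nonlinear teacher using the loss above.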

Related research

On effects of Knowledge Distillation on Transfer Learning (10/18/2022)
Knowledge distillation is a popular machine learning technique that aims...

Toward Extremely Lightweight Distracted Driver Recognition With Distillation-Based Neural Architecture Search and Knowledge Transfer (02/09/2023)
The number of traffic accidents has been continuously increasing in rece...

Wavelet Convolutional Neural Networks (05/20/2018)
Spatial and spectral approaches are two major approaches for image proce...

Training convolutional neural networks with cheap convolutions and online distillation (09/28/2019)
The large memory and computation consumption in convolutional neural net...

Learning with Privileged Information for Efficient Image Super-Resolution (07/15/2020)
Convolutional neural networks (CNNs) have allowed remarkable advances in...

Channel Tiling for Improved Performance and Accuracy of Optical Neural Network Accelerators (11/14/2020)
Low latency, high throughput inference on Convolution Neural Networks (C...

11 TeraFLOPs per second photonic convolutional accelerator for deep learning optical neural networks (11/14/2020)
Convolutional neural networks (CNNs), inspired by biological visual cort...
