Kernel computations from large-scale random features obtained by Optical Processing Units

10/22/2019
by Ruben Ohana, et al.

Approximating kernel functions with random features (RFs) has been a successful application of random projections for nonparametric estimation. However, performing random projections presents computational challenges for large-scale problems. Recently, a new optical hardware called the Optical Processing Unit (OPU) has been developed for fast and energy-efficient computation of large-scale RFs in the analog domain. More specifically, the OPU performs the multiplication of input vectors by a large random matrix with complex-valued i.i.d. Gaussian entries, followed by an element-wise squared absolute value - this last nonlinearity being intrinsic to the sensing process. In this paper, we show that this operation results in a dot-product kernel with connections to the polynomial kernel, and we extend this computation to arbitrary powers of the feature map. Experiments demonstrate that the OPU kernel and its RF approximation achieve competitive performance in applications using kernel ridge regression and transfer learning for image classification. Crucially, thanks to the use of the OPU, these results are obtained with time and energy savings.
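The OPU operation described above can be simulated numerically. The sketch below is a minimal NumPy simulation (not the optical hardware itself, and not the paper's reference implementation): it draws a complex i.i.d. Gaussian matrix, applies the squared-modulus nonlinearity, and compares the resulting random-feature dot product to the closed-form limit kernel. For real inputs and unit-variance complex Gaussian entries, a standard moment computation gives k(x, y) = ||x||² ||y||² + (x·y)²; the dimensions d and D are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D = 10, 50_000  # input dimension and number of random features (illustrative)

# Simulated OPU transmission matrix: i.i.d. complex Gaussian entries of unit
# variance (real and imaginary parts each with variance 1/2).
W = (rng.standard_normal((D, d)) + 1j * rng.standard_normal((D, d))) / np.sqrt(2)

def opu_features(x):
    """OPU-style random feature map: element-wise |Wx|^2, scaled so the
    dot product of two feature vectors estimates the kernel."""
    return np.abs(W @ x) ** 2 / np.sqrt(D)

x = rng.standard_normal(d)
y = rng.standard_normal(d)

# Monte Carlo kernel estimate vs. the closed-form limit kernel.
approx = opu_features(x) @ opu_features(y)
exact = (x @ x) * (y @ y) + (x @ y) ** 2
print(approx, exact)
```

The feature vectors produced this way can be fed directly to any linear method (e.g. ridge regression) to approximate kernel ridge regression with this kernel; the estimate concentrates around the exact value as D grows.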


Related research

07/21/2023
Mercer Large-Scale Kernel Machines from Ridge Function Perspective
To present Mercer large-scale kernel machines from a ridge function pers...

03/21/2020
Scaling up Kernel Ridge Regression via Locality Sensitive Hashing
Random binning features, introduced in the seminal paper of Rahimi and R...

10/16/2020
Fast Graph Kernel with Optical Random Features
The graphlet kernel is a classical method in graph classification. It ho...

12/07/2020
Randomized kernels for large scale Earth observation applications
Dealing with land cover classification of the new image sources has also...

04/27/2015
Sign Stable Random Projections for Large-Scale Learning
We study the use of "sign α-stable random projections" (where 0<α≤ 2) fo...

10/21/2022
Efficient Dataset Distillation Using Random Feature Approximation
Dataset distillation compresses large datasets into smaller synthetic co...

03/15/2023
Physics-Informed Optical Kernel Regression Using Complex-valued Neural Fields
Lithography is fundamental to integrated circuit fabrication, necessitat...
