Stochastic Linear Bandits with Hidden Low Rank Structure

01/28/2019
by Sahin Lale, et al.

High-dimensional representations often have a lower-dimensional underlying structure. This is especially true in many decision-making settings. For example, when the representation of actions is generated by a deep neural network, it is reasonable to expect a low-rank structure, whereas conventional structural assumptions such as sparsity no longer hold. Subspace recovery methods such as Principal Component Analysis (PCA) can find the underlying low-rank structure in the feature space and reduce the complexity of the learning task. In this work, we propose Projected Stochastic Linear Bandit (PSLB), an algorithm for high-dimensional stochastic linear bandits (SLBs) in which the representation of actions has an underlying low-dimensional subspace structure. PSLB deploys PCA-based projection to iteratively recover the low-rank structure in SLBs. We show that deploying projection methods assures dimensionality reduction and yields a tighter regret upper bound, stated in terms of the dimensionality of the subspace and its properties rather than the dimensionality of the ambient space. We cast the image classification task as an SLB and empirically show that, when a pre-trained DNN provides the high-dimensional feature representations, deploying PSLB significantly reduces regret and converges faster to an accurate model than a state-of-the-art algorithm.
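To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of a PCA-projected linear bandit: observed action features are projected onto an estimated top-k subspace before a standard LinUCB-style update. All dimensions, noise levels, and the exploration coefficient below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: ambient dimension d, hidden subspace dimension k.
d, k, n_actions, horizon = 20, 3, 50, 200

# Hidden low-rank structure: action features lie (noisily) in a k-dim subspace.
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]   # d x k orthonormal basis
theta_star = basis @ rng.standard_normal(k)            # unknown true parameter

def pca_projection(X, k):
    """Rank-k projection matrix from the top-k principal directions of X."""
    _, _, vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    V = vt[:k].T                                       # d x k principal directions
    return V @ V.T                                     # d x d rank-k projector

# LinUCB-style loop on projected features (sketch of the idea).
A = np.eye(d)                                          # regularized Gram matrix
b = np.zeros(d)
seen = []                                              # features observed so far
total_reward = 0.0
for t in range(horizon):
    actions = (basis @ rng.standard_normal((k, n_actions))).T \
              + 0.01 * rng.standard_normal((n_actions, d))
    # Re-estimate the subspace from observed features once enough are available.
    P = pca_projection(np.array(seen), k) if len(seen) >= d else np.eye(d)
    X = actions @ P.T                                  # project into estimated subspace
    theta_hat = np.linalg.solve(A, b)                  # ridge estimate
    A_inv = np.linalg.inv(A)
    # Upper confidence bound: estimated reward + exploration bonus.
    ucb = X @ theta_hat + 0.5 * np.sqrt(np.einsum('ij,jk,ik->i', X, A_inv, X))
    i = int(np.argmax(ucb))
    x = X[i]
    reward = actions[i] @ theta_star + 0.1 * rng.standard_normal()
    A += np.outer(x, x)                                # standard LinUCB updates
    b += reward * x
    seen.append(actions[i])
    total_reward += reward
```

The key design choice mirrored from the abstract is that confidence ellipsoids are built on the projected features, so their effective dimension is that of the recovered subspace, not the ambient space.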

