Deviation bound for non-causal machine learning

09/18/2020
by Rémy Garnier, et al.

Concentration inequalities are widely used for analysing machine learning algorithms. However, current concentration inequalities cannot be applied to some of the most popular deep neural networks, notably those used in natural language processing (NLP). This is mostly due to the non-causal nature of such data. In this paper, a framework for modelling non-causal random fields is provided. A McDiarmid-type concentration inequality is obtained for this framework. In order to do so, we introduce a local i.i.d. approximation of the non-causal random field.
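For context, the paper's bound can be compared with the classical McDiarmid (bounded-differences) inequality, which requires independent inputs; the statement below is the standard textbook form, not the paper's generalised result. If X_1, …, X_n are independent and f satisfies the bounded-differences condition

\[
\sup_{x_1,\dots,x_n,\,x_i'} \bigl| f(x_1,\dots,x_i,\dots,x_n) - f(x_1,\dots,x_i',\dots,x_n) \bigr| \le c_i \quad \text{for each } i,
\]

then for every t > 0,

\[
\mathbb{P}\bigl( \lvert f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \rvert \ge t \bigr) \;\le\; 2 \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} c_i^2} \right).
\]

The paper's contribution is to recover an inequality of this type when independence fails, by replacing the independent inputs with a local i.i.d. approximation of a non-causal random field.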


