Asymptotically Optimal Massey-Like Inequality on Guessing Entropy With Application to Side-Channel Attack Evaluations

03/29/2021
by   Andrei Tanasescu, et al.

A Massey-like inequality is any useful lower bound on guessing entropy in terms of the computationally scalable Shannon entropy. The asymptotically optimal Massey-like inequality is determined and further refined for finite-support distributions. The impact of these results is highlighted for side-channel attack evaluation, where guessing entropy is a key metric. In this context, the obtained bounds are compared to the state of the art.
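To make the quantities concrete, the sketch below computes guessing entropy and Shannon entropy for a distribution and checks them against Massey's classical inequality, G(X) ≥ 2^(H(X)−2) + 1 for H(X) ≥ 2 bits. This illustrates only the classical bound, not the refined asymptotically optimal bound derived in the paper.

```python
import math

def shannon_entropy(p):
    """Shannon entropy of the distribution p, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def guessing_entropy(p):
    """Expected number of guesses when candidates are tried
    in order of decreasing probability."""
    q = sorted(p, reverse=True)
    return sum(i * pi for i, pi in enumerate(q, start=1))

def massey_bound(H):
    """Massey's classical lower bound on guessing entropy,
    valid when H >= 2 bits: G(X) >= 2^(H-2) + 1."""
    return 2 ** (H - 2) + 1

# Example: a uniform key byte, as in a side-channel setting with 256 candidates.
p = [1 / 256] * 256
H = shannon_entropy(p)    # 8 bits
G = guessing_entropy(p)   # (256 + 1) / 2 = 128.5 guesses on average
print(H, massey_bound(H), G)  # the bound 2^6 + 1 = 65 holds: 65 <= 128.5
```

For the uniform distribution the gap between Massey's bound (65) and the true guessing entropy (128.5) is roughly a factor of two, which is the kind of slack that tighter Massey-like inequalities aim to reduce.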

