Convex quantization preserves logconcavity

06/11/2022
by Pol del Aguila Pla, et al.

Much like convexity is key to variational optimization, a logconcave distribution is key to amenable statistical inference. Quantization is often disregarded when writing likelihood models, ignoring the limitations of physical detectors. This raises two questions: would including quantization preclude logconcavity, and are the true data likelihoods logconcave? We show that the same simple assumption that leads to logconcave continuous-data likelihoods also leads to logconcave quantized-data likelihoods, provided that convex quantization regions are used.
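One way to see the flavor of the argument (the notation below is illustrative and not taken from the abstract): write z for the continuous measurement, x for the quantity to be inferred, and R_q for the quantization region that the detector reports. The quantized-data likelihood is then

p(q \mid x) = \int_{R_q} p(z \mid x)\, dz = \int \mathbf{1}_{R_q}(z)\, p(z \mid x)\, dz.

If p(z \mid x) is jointly logconcave in (z, x) and R_q is convex, then the indicator \mathbf{1}_{R_q}(z) is logconcave, the product of logconcave functions is logconcave, and marginalizing a jointly logconcave function over z preserves logconcavity (Prékopa's theorem), so p(q \mid x) is logconcave in x.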

