Bi-log-concavity: some properties and some remarks towards a multi-dimensional extension

03/18/2019
by Adrien Saumard

Bi-log-concavity of probability measures is a univariate extension of the notion of log-concavity that has been recently proposed in the statistical literature. Among other things, from a modeling perspective it has the nice property of admitting some multimodal distributions, while preserving some nice features of log-concave measures. We compute the isoperimetric constant of a bi-log-concave measure, extending a property available for log-concave measures. This implies that bi-log-concave measures have exponentially decreasing tails. Then we show that the convolution of a bi-log-concave measure with a log-concave one is bi-log-concave. Consequently, infinitely differentiable, positive densities are dense in the set of bi-log-concave densities for L_p-norms, p ∈ [1, +∞]. We also derive a necessary and sufficient condition for the convolution of two bi-log-concave measures to be bi-log-concave. We conclude this note by discussing ways of defining a multi-dimensional extension of the notion of bi-log-concavity. We propose an approach based on a variant of the isoperimetric problem, restricted to half-spaces.

1 Introduction

Bi-log-concavity (of a probability measure on the real line) is a property recently introduced by Dümbgen, Kolesnyk and Wilke ([DKW17]), which aims at bypassing some restrictive aspects of log-concavity while preserving some of its nice features. More precisely, bi-log-concavity of a measure with distribution function F amounts to log-concavity of both F and 1 − F, and a simple application of Prékopa's theorem on the stability of log-concavity through marginalization ([Pré73], see also [SW14] for a discussion of the various proofs of this fundamental theorem) shows that log-concave measures are also bi-log-concave (see [BB05] for a more direct, elementary proof of this latter fact).

From a modeling perspective, bi-log-concavity and log-concavity may be seen as shape constraints. In statistics, when they are available, shape constraints represent an interesting alternative to more classical parametric, semi-parametric or non-parametric approaches and constitute an active contemporary line of research ([Wal09, Sam18]). Bi-log-concavity was indeed proposed with the aim of contributing to this research area ([DKW17]). It was used in [DKW17] to construct efficient confidence bands for the cumulative distribution function and some functionals of it. The authors highlight that the class of bi-log-concave measures contains multimodal distributions, while it is well known that log-concave measures are unimodal.

Furthermore, Dümbgen et al. [DKW17] establish the following characterization of bi-log-concave distributions. For a distribution function F, denote J(F) = {x ∈ ℝ : 0 < F(x) < 1} and call "non-degenerate" the distribution functions F such that J(F) ≠ ∅.

Theorem 1 (Characterization of bi-log-concavity, [DKW17])

Let F be a non-degenerate distribution function. The following four statements are equivalent:

(i)

F is bi-log-concave, i.e. F and 1 − F are log-concave functions in the sense that their logarithm is concave.

(ii)

F is continuous on ℝ and differentiable on J(F) with derivative f = F′ such that, for all x ∈ J(F) and t ∈ ℝ,

F(x + t) ≤ F(x) exp(t f(x)/F(x))   and   1 − F(x + t) ≤ (1 − F(x)) exp(−t f(x)/(1 − F(x))).

(iii)

F is continuous on ℝ and differentiable on J(F) with derivative f = F′ such that the hazard function f/(1 − F) is non-decreasing and the reverse hazard function f/F is non-increasing on J(F).

(iv)

F is continuous on ℝ and differentiable on J(F) with bounded and strictly positive derivative f = F′. Furthermore, f is locally Lipschitz continuous on J(F) with L∞-derivative f′ satisfying

−f²/(1 − F) ≤ f′ ≤ f²/F on J(F).

Note that if one includes degenerate measures - that is Dirac masses - it is easily seen that the set of bi-log-concave measures is closed under weak limits.
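
As an illustration of how characterization (iii) can be used in practice (this sketch is not part of [DKW17]), one can evaluate the hazard and reverse hazard functions on a fine grid and check their monotonicity numerically. The standard logistic distribution, which is log-concave and hence bi-log-concave, is used below purely as a test case; any distribution with an accessible density and distribution function could be substituted.

    # Numerical check of characterization (iii) on a grid (illustration only).
    import numpy as np
    from scipy.stats import logistic

    x = np.linspace(-20, 20, 4001)
    f = logistic.pdf(x)
    F = logistic.cdf(x)
    S = logistic.sf(x)               # survival function 1 - F, computed accurately

    hazard = f / S                   # should be non-decreasing for a bi-log-concave df
    reverse_hazard = f / F           # should be non-increasing for a bi-log-concave df

    tol = 1e-12
    print("hazard non-decreasing:        ", bool(np.all(np.diff(hazard) >= -tol)))
    print("reverse hazard non-increasing:", bool(np.all(np.diff(reverse_hazard) <= tol)))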

Just as s-concave measures generalize log-concave ones, Laha and Wellner [LW17] proposed the concept of bi-s^*-concavity, which generalizes bi-log-concavity and includes s-concave densities. Some characterizations of bi-s^*-concavity, extending the previous theorem, are derived in [LW17].

On the probabilistic side, even if some characterizations are available, many important questions remain about the properties of bi-log-concave measures. Indeed, log-concave measures satisfy many nice properties (see for instance [Gué12, SW14, Col17] and references therein) and it is natural to ask whether some of those extend to bi-log-concave measures. Answering this question is the primary object of this note.

We show in Section 2 that the isoperimetric constant of a bi-log-concave measure is simply equal to twice the value of its density with respect to the Lebesgue measure (which indeed exists) at its median, thus extending a property available for log-concave measures. We deduce that a bi-log-concave measure has exponential tails, also extending a property valid in the log-concave case.

In Section 3, we show that the convolution of a log-concave measure and a bi-log-concave measure is bi-log-concave. As a consequence, we get that any bi-log-concave measure can be approximated by a sequence of bi-log-concave measures having regular densities. Furthermore, we give a necessary and sufficient condition for the convolution of two bi-log-concave measures to be bi-log-concave.

Finally, we discuss in Section 3.1 possible ways to obtain a multivariate notion of bi-log-concavity. This problem is not a priori obvious, because the definition of bi-log-concavity in one dimension relies on the cumulative distribution function and thus on the total order existing on the real numbers. To this end, we derive a characterization of (symmetric) bi-log-concave measures on ℝ through their isoperimetric profile. Then we propose a multidimensional generalization for symmetric measures by considering their isoperimetric profile restricted to half-spaces. We conclude by discussing a way to strengthen the latter definition in order to ensure stability through convolution with any log-concave measure. The question of providing a nice definition of bi-log-concavity in higher dimension, which would also impose the existence of some exponential moments, remains open.

2 Isoperimetry and concentration for bi-log-concave measures

Let F be the distribution function of a probability measure P on the real line. Assume that P is non-degenerate (in the sense of its distribution function F being non-degenerate) and let f be the density of its absolutely continuous part.

Recall the following formula for the isoperimetric constant Is(P) of P, due to Bobkov and Houdré [BH97]:

Is(P) = ess inf_{x ∈ J(F)} f(x) / min{F(x), 1 − F(x)}.

The following theorem extends a well-known fact related to the isoperimetric constant of log-concave measures to the case of bi-log-concave measures.

Theorem 2

Let P be a probability measure with non-degenerate, bi-log-concave distribution function F. Then P admits a positive density f on J(F) and it holds

Is(P) = 2 f(m),

where m is the median of P.
In general, the isoperimetric constant is hard to compute, but in the bi-log-concave case Theorem 2 provides a straightforward formula that extends a formula valid for log-concave measures (see for instance [SW14]).

In the following, we will also use the notation .

Proof. Note that the median m is indeed unique by Theorem 1 above. For x ≥ m,

min{F(x), 1 − F(x)}/f(x) = (1 − F(x))/f(x).

As F is bi-log-concave, the hazard f/(1 − F) is non-decreasing, so (1 − F)/f is thus non-increasing on [m, +∞). For x ≤ m,

min{F(x), 1 − F(x)}/f(x) = F(x)/f(x).

Thus, F/f is non-decreasing on (−∞, m]. Consequently, the maximum of min{F, 1 − F}/f is attained at m and its value is F(m)/f(m) = 1/(2 f(m)), so that Is(P) = 2 f(m).
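
For concreteness, here is a small numerical sketch, not taken from the original note, comparing the closed-form value 2 f(m) given by Theorem 2 with a brute-force evaluation of the Bobkov-Houdré formula on a grid, for the standard logistic distribution (median m = 0, f(0) = 1/4).

    # Isoperimetric constant of the standard logistic law (illustration only).
    import numpy as np
    from scipy.stats import logistic

    x = np.linspace(-30, 30, 10001)
    f = logistic.pdf(x)
    F = logistic.cdf(x)

    brute_force = np.min(f / np.minimum(F, 1.0 - F))   # Bobkov-Houdre formula on a grid
    print("2 f(m)       =", 2.0 * logistic.pdf(0.0))   # 0.5
    print("grid infimum =", brute_force)               # also approximately 0.5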

Corollary 3

Let P as above be a bi-log-concave measure with median m. Then f(m) > 0 and P satisfies the following Poincaré inequality: for any square integrable function g with derivative g′,

Var_P(g) ≤ (1/f(m)²) ∫ (g′)² dP,   (1)

where

Var_P(g) = ∫ (g − ∫ g dP)² dP

is the variance of g with respect to P. Consequently, the identity map has bounded exponential Orlicz norm under P and P achieves the following exponential concentration inequality,

α_P(r) ≤ exp(−f(m) r/3),   r > 0,   (2)

where α_P is the concentration function of P, defined by α_P(r) = sup{1 − P(A_r) : A Borel, P(A) ≥ 1/2}, where A_r = {x ∈ ℝ : d(x, A) < r} is the (open) r-neighborhood of A.

As is well known (see [Led01] for instance), inequality (2) implies that for any Lipschitz function g,

P(|g − m_g| ≥ r ‖g‖_Lip) ≤ 2 α_P(r),   r > 0,

where m_g is a median of g, that is P(g ≥ m_g) ≥ 1/2 and P(g ≤ m_g) ≥ 1/2.

Proof. The fact that f(m) > 0 is given by point (iii) of Theorem 1 above. Then Inequality (1) is a consequence of Theorem 2 via Cheeger's inequality for the first eigenvalue of the Laplacian (see for instance Inequality 3.1 in [Led01]). Inequality (2) is a classical consequence of Inequality (1) as well (see Theorem 3.1 in [Led01]).
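
A crude Monte Carlo illustration of the Poincaré inequality (1) above (a sketch, not from the original note): the constant 1/f(m)² comes from Cheeger's bound λ1 ≥ Is(P)²/4 = f(m)², and the test function g(x) = arctan(x) is an arbitrary smooth choice.

    # Monte Carlo check of a Poincare-type bound for the standard logistic law.
    import numpy as np
    from scipy.stats import logistic

    x = logistic.rvs(size=10**6, random_state=0)

    g = np.arctan(x)                     # arbitrary smooth test function
    g_prime = 1.0 / (1.0 + x**2)         # its derivative

    var_g = np.var(g)
    bound = (1.0 / logistic.pdf(0.0)**2) * np.mean(g_prime**2)   # 1/f(m)^2 = 16
    print(var_g, "<=", bound)            # the bound comfortably dominates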

Note that, following Bobkov [Bob96], for a log-concave probability measure P on ℝ having a positive density f on ℝ, the function I = f ∘ F⁻¹ is concave on (0, 1). By Theorem 1 above, bi-log-concavity of P reduces to non-increasingness of the functions f/F and −f/(1 − F), which is equivalent to non-increasingness of t ↦ I(t)/t and t ↦ I(1 − t)/t on (0, 1). As I is concave for a log-concave measure and as I(0+) = I(1−) = 0, both ratios are indeed non-increasing, so bi-log-concavity follows. This gives another proof of the fact that log-concave measures are bi-log-concave.

Example 4

The function I = f ∘ F⁻¹ is in general hard to compute, but a few easy examples exist. For instance, for the logistic distribution, F(x) = 1/(1 + e^{−x}), we have I(t) = t(1 − t). For the Laplace distribution, with density f(x) = e^{−|x|}/2, I(t) = min{t, 1 − t}.
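
These identities are easy to confirm numerically; the following sketch (illustration only) evaluates I = f ∘ F⁻¹ through the density and the quantile function of scipy.

    # Numerical confirmation of Example 4 (illustration only).
    import numpy as np
    from scipy.stats import logistic, laplace

    t = np.linspace(0.001, 0.999, 999)

    I_logistic = logistic.pdf(logistic.ppf(t))
    I_laplace = laplace.pdf(laplace.ppf(t))

    print(np.max(np.abs(I_logistic - t * (1 - t))))          # machine-precision error
    print(np.max(np.abs(I_laplace - np.minimum(t, 1 - t))))  # machine-precision error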

3 Stability through convolution

Take X and Y two independent random variables with respective distribution functions F_X and F_Y that are bi-log-concave. Hence X and Y have densities, denoted by f_X and f_Y. Then, for any z ∈ ℝ,

F_{X+Y}(z) = ∫ F_X(z − y) f_Y(y) dy.   (3)

In addition,

1 − F_{X+Y}(z) = ∫ (1 − F_X(z − y)) f_Y(y) dy.   (4)
Proposition 5

If X is bi-log-concave, Y is log-concave and Y is independent from X, then X + Y is bi-log-concave.

Proof. By using formulas (3) and (4), this is a direct application of Prékopa’s theorem ([Pré73]) on the marginal of a log-concave function.   
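
The following sketch illustrates Proposition 5 numerically (it is only an illustration; the standard logistic plays the role of the bi-log-concave factor and the standard Gaussian the log-concave one). Formulas (3) and (4) are evaluated by quadrature, and characterization (iii) is then checked for the convolution on a grid.

    # Proposition 5 in action: logistic (bi-log-concave) + Gaussian (log-concave).
    import numpy as np
    from scipy.stats import logistic, norm
    from scipy.integrate import quad

    def F_conv(z):   # formula (3)
        return quad(lambda y: logistic.cdf(z - y) * norm.pdf(y), -np.inf, np.inf)[0]

    def S_conv(z):   # formula (4), i.e. 1 - F_{X+Y}
        return quad(lambda y: logistic.sf(z - y) * norm.pdf(y), -np.inf, np.inf)[0]

    def f_conv(z):   # density of X + Y
        return quad(lambda y: logistic.pdf(z - y) * norm.pdf(y), -np.inf, np.inf)[0]

    z = np.linspace(-6, 6, 121)
    F = np.array([F_conv(v) for v in z])
    S = np.array([S_conv(v) for v in z])
    f = np.array([f_conv(v) for v in z])

    tol = 1e-7
    print("hazard non-decreasing:        ", bool(np.all(np.diff(f / S) >= -tol)))
    print("reverse hazard non-increasing:", bool(np.all(np.diff(f / F) <= tol)))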

Corollary 6

Take P a (non-degenerate) bi-log-concave measure on ℝ, with density f. Then there exists a sequence of infinitely differentiable bi-log-concave densities, positive on ℝ, that converge to f in L_p, for any p ∈ [1, +∞].

Corollary 6 is also an extension of an approximation result available in the set of log-concave distributions, see [SW14, Section 5.2].

Proof. It suffices to consider the convolution of f with a sequence of centered Gaussian densities with variances converging to zero. As P has an exponential moment, its density f belongs to any L_p, p ∈ [1, +∞]. Then a simple application of classical theorems about convolution in L_p spaces (see for instance [Rud87, p. 148]) allows one to check that the approximations converge to f in any L_p, p ∈ [1, +∞].
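
A minimal numerical sketch of this mollification argument (illustration only), with the Laplace density, which is bi-log-concave but not differentiable at the origin, as the target and a simple grid-based convolution:

    # L^1 convergence of Gaussian mollifications of the Laplace density.
    import numpy as np
    from scipy.stats import laplace, norm

    x = np.linspace(-20, 20, 8001)
    dx = x[1] - x[0]
    f = laplace.pdf(x)

    for sigma in [1.0, 0.3, 0.1, 0.03]:
        kernel = norm.pdf(x, scale=sigma) * dx               # discretized Gaussian kernel
        f_sigma = np.convolve(f, kernel, mode="same")        # smooth approximation of f
        print(sigma, np.sum(np.abs(f_sigma - f)) * dx)       # L^1 error shrinks with sigma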

More generally, the following theorem gives a necessary and sufficient condition for the convolution of two bi-log-concave measures to be bi-log-concave.

Theorem 7

Take X and Y two independent bi-log-concave random variables with respective densities f_X and f_Y and cumulative distribution functions F_X and F_Y. Denote and and consider, for any z ∈ ℝ, the following measures on ℝ,

and

Then X + Y is bi-log-concave if and only if, for any z ∈ ℝ,

(5)

and

(6)

Of course, a simple symmetrization argument shows that conditions (5) and (6) are satisfied if pointwise, which means that is log-concave, in which case we recover Proposition 5 above. But Theorem 7 is more general. Indeed, it is easily checked by direct computations that the convolution of the Gaussian mixture - which is bi-log-concave but not log-concave, see [DKW17, Section 2] - with itself is bi-log-concave.
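
Since the specific Gaussian mixture of [DKW17, Section 2] is not reproduced here, the sketch below uses a hypothetical symmetric two-component mixture 0.5 N(−δ, 1) + 0.5 N(δ, 1) and simply tests characterization (iii), both for the mixture and for its self-convolution 0.25 N(−2δ, 2) + 0.5 N(0, 2) + 0.25 N(2δ, 2); the parameters are placeholders and can be replaced by those of [DKW17].

    # Grid-based bi-log-concavity test for a Gaussian mixture and its self-convolution.
    import numpy as np
    from scipy.stats import norm

    delta = 1.0   # hypothetical separation parameter (placeholder)

    def check(means, weights, sd):
        x = np.linspace(-7, 7, 2801)
        f = sum(w * norm.pdf(x, m, sd) for w, m in zip(weights, means))
        F = sum(w * norm.cdf(x, m, sd) for w, m in zip(weights, means))
        S = sum(w * norm.sf(x, m, sd) for w, m in zip(weights, means))   # 1 - F
        tol = 1e-9
        return (bool(np.all(np.diff(f / S) >= -tol)),    # hazard non-decreasing
                bool(np.all(np.diff(f / F) <= tol)))     # reverse hazard non-increasing

    print("mixture:         ", check([-delta, delta], [0.5, 0.5], 1.0))
    print("self-convolution:", check([-2 * delta, 0.0, 2 * delta],
                                     [0.25, 0.5, 0.25], np.sqrt(2.0)))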

To prove Theorem 7, we will use the following lemma.

Lemma 8

Take such that and a measure on with density absolutely continuous and . Take Lipschitz continuous such that and

then

Proof of Lemma 8. This is a simple integration by parts: from the assumptions, we have

 

Proof of Theorem 7. Recall that we have

Our first goal is to find some conditions such that F_{X+Y} is log-concave. It is sufficient to prove that, for any z ∈ ℝ,

or equivalently,

Denote . We have

Furthermore, we get

Now, by Lemma 8, it holds,

Gathering the equations, we get

which gives condition (5). Likewise, condition (6) arises from the same type of computations when studying log-concavity of 1 − F_{X+Y}.

3.1 Towards a multivariate notion of bi-log-concavity

Let us introduce this section with the following remark. The isoperimetric profile I_μ of a probability measure μ on ℝ^d is defined as follows: for any p ∈ [0, 1],

I_μ(p) = inf{ μ⁺(A) : A Borel, μ(A) = p },

where μ⁺(A) = liminf_{ε → 0⁺} (μ(A_ε) − μ(A))/ε, with A_ε the open ε-neighborhood of A for the distance under consideration. Note that the isoperimetric profile depends on the distance that is considered. Unless explicitly mentioned, we will consider in the following that the distance is the Euclidean distance. From inequality (2.1) in [Bob96], we have, for a log-concave measure μ on ℝ with distribution function F and density f and any p ∈ (0, 1),

I_μ(p) ≥ min{ f(F⁻¹(p)), f(F⁻¹(1 − p)) }.

Hence, since half-lines achieve this bound, I_μ(p) = min{ f(F⁻¹(p)), f(F⁻¹(1 − p)) }. If μ is moreover symmetric, then I_μ(p) = f(F⁻¹(p)) for any p ∈ (0, 1).

For a general measure μ, we define the isoperimetric profile restricted to half-spaces, denoted I^H_μ: for any p ∈ [0, 1],

I^H_μ(p) = inf{ μ⁺(H) : H half-space of ℝ^d, μ(H) = p }.

For a measure μ on ℝ having a density f, one has

I^H_μ(p) = min{ f(F⁻¹(p)), f(F⁻¹(1 − p)) },

since possible half-spaces are in this case of the form (−∞, x] or [x, +∞). If μ is symmetric, then I^H_μ(p) = f(F⁻¹(p)) and bi-log-concavity is equivalent to non-increasingness of p ↦ I^H_μ(p)/p on (0, 1). As proved in Bobkov [Bob96, Proposition A.1], log-concavity on ℝ is actually equivalent to concavity of I_μ on [0, 1].
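
A small numerical sketch (illustration only), with the Laplace distribution as the symmetric example: the half-space profile I^H_μ(p) = min{f(F⁻¹(p)), f(F⁻¹(1 − p))} is evaluated through scipy and the ratio with p is checked to be non-increasing.

    # Half-space isoperimetric profile of the Laplace law and its ratio with p.
    import numpy as np
    from scipy.stats import laplace

    p = np.linspace(0.001, 0.999, 999)
    I_half = np.minimum(laplace.pdf(laplace.ppf(p)),
                        laplace.pdf(laplace.ppf(1 - p)))     # equals min(p, 1 - p) here

    print("I^H(p)/p non-increasing:", bool(np.all(np.diff(I_half / p) <= 1e-12)))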

Furthermore, as previously remarked, I_μ = I^H_μ in the one-dimensional log-concave case. The latter identity is still true in higher dimension for the Gaussian measure when the distance is given by the Euclidean norm (see [Bor75]) and this characterizes Gaussian measures. In general, it always holds that I_μ ≤ I^H_μ pointwise.

Take the distance to be given by the sup-norm, d(x, y) = max_{1 ≤ i ≤ d} |x_i − y_i|. Then, for any set A, we have A_ε = A + ε(−1, 1)^d. In this case, Bobkov [Bob96, Theorem 1.1] characterizes the symmetric log-concave measures for which I_μ = I^H_μ.

The reverse relation in higher dimension between I_μ and I^H_μ when μ is log-concave is related to the so-called KLS hyperplane conjecture (see for instance [LV17, LV18]).

A possible extension of the notion of bi-log-concavity is the following.

Definition 9

Let μ be a probability measure on ℝ^d. Assume that μ is symmetric around the origin. Then μ is said to be weakly bi-log-concave (with respect to the distance under consideration) if the function

p ↦ I^H_μ(p)/p

is non-increasing on (0, 1).

The latter definition extends the definition of bi-log-concavity for symmetric measures on the real line. However, we consider that the definition is “weak” since, as we will see, it seems in fact natural to ask for more. In the following, a symmetric measure is a measure that is symmetric around the origin.

Proposition 10

Symmetric log-concave measures on ℝ^d are weakly bi-log-concave (for the Euclidean distance).

Proof. Take θ a unit vector and consider the measure μ_θ defined to be the projection of the measure μ on the line containing the origin and directed by θ. Consequently, μ_θ is a log-concave measure on ℝ, symmetric around zero. Hence, denoting by f_θ and F_θ the density and the distribution function of μ_θ, the function f_θ ∘ F_θ⁻¹ is concave and, consequently, p ↦ f_θ(F_θ⁻¹(p))/p is non-increasing. Since half-spaces are parameterized by unit vectors together with a point on the line containing zero and directed by the considered unit vector, this readily gives the non-increasingness of p ↦ I^H_μ(p)/p.
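
As a toy illustration of this projection argument (not from the original note), consider the standard Gaussian measure on ℝ²: every unit-vector projection is N(0, 1), so the half-space profile reduces to φ(Φ⁻¹(p)) in every direction, and the monotonicity of its ratio with p can be checked on a grid.

    # Half-space profile of the standard Gaussian on R^2 via its 1-D projections.
    import numpy as np
    from scipy.stats import norm

    p = np.linspace(0.001, 0.999, 999)
    profile = norm.pdf(norm.ppf(p))      # same for every direction, by rotation invariance

    print("I^H(p)/p non-increasing:", bool(np.all(np.diff(profile / p) <= 1e-12)))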

One can notice that the latter proof is in fact only based on stability of log-concavity through one-dimensional marginalizations. This naturally leads to the following second definition of bi-log-concavity in higher dimension.

Definition 11

Let μ be a probability measure on ℝ^d. Then μ is said to be weakly bi-log-concave if for every line L, the (Euclidean) projection measure μ_L of μ onto the line L is a (one-dimensional) bi-log-concave measure on L (that can possibly be degenerate). More explicitly, for any x₀ ∈ L and any Borel set B ⊂ ℝ,

μ_L({x₀ + t θ : t ∈ B}) = μ({x ∈ ℝ^d : ⟨x − x₀, θ⟩ ∈ B}),

where θ is a unit directional vector of the line L.

Note that weakly bi-log-concave measures in the sense of Definition 11 are not necessarily symmetric. In the case of symmetric measures, the notion of weak bi-log-concavity of Definition 11 is actually a strengthening of Definition 9.

Proposition 12

Let μ be a symmetric probability measure on ℝ^d that is weakly bi-log-concave in the sense of Definition 11. Then μ is weakly bi-log-concave in the sense of Definition 9.

Proof. By parametrization of half-spaces, we have the following formula, for any p ∈ (0, 1),

I^H_μ(p) = inf_θ f_θ(F_θ⁻¹(p)),

where the infimum runs over unit vectors θ and f_θ, F_θ denote the density and the distribution function of the projection of μ onto the line containing zero and directed by θ. Then the conclusion follows by noticing that for any line L such that 0 ∈ L, the projection measure of μ onto L is symmetric and bi-log-concave. Hence, p ↦ f_θ(F_θ⁻¹(p))/p is non-increasing on (0, 1) and so is p ↦ I^H_μ(p)/p.

The following result states that the notion of weak bi-log-concavity of Definition 11 is stable through convolution by log-concave measures.

Proposition 13

The convolution of a log-concave measure with a weakly bi-log-concave one is weakly bi-log-concave.

Proof. The formula (P ∗ Q) ∘ π_θ⁻¹ = (P ∘ π_θ⁻¹) ∗ (Q ∘ π_θ⁻¹), where π_θ(x) = ⟨x, θ⟩ denotes the projection onto the direction θ, shows that the projection of the convolution of two measures on a line is the convolution of the projections of the measures on this line. This allows one to reduce the stability through convolution by a log-concave measure to dimension one and concludes the proof.
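
The projection identity used in this proof is just linearity of the map x ↦ ⟨x, θ⟩; a short sample-based sanity check (illustration only, with arbitrary sample distributions):

    # Projecting the sum equals summing the projections (linearity).
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((10**5, 2))
    Y = rng.laplace(size=(10**5, 2))
    theta = np.array([0.6, 0.8])             # unit vector

    lhs = (X + Y) @ theta                    # projection of the sum
    rhs = X @ theta + Y @ theta              # sum of the projections
    print(np.allclose(lhs, rhs))             # True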

As in the log-concave case, it is moreover directly seen that both notions of weak bi-log-concavity above are stable by affine transformations of the space.

Actually, in addition to containing log-concave measures and being stable through convolution by a log-concave measure, there are at least two other properties that one would naturally require for a convenient multidimensional concept of bi-log-concavity: existence of a density with respect to the Lebesgue measure on the convex hull of its support and existence of a finite exponential moment for the (Euclidean) norm. We can express this latter remark through the following open problem, that concludes this note.

Open Problem: Find a nice characterization of probability measures on ℝ^d that are weakly bi-log-concave, that admit a density with respect to the Lebesgue measure on the convex hull of their support and whose Euclidean norm has exponentially decreasing tails.

Acknowledgement 14

I express my deepest gratitude to Jon Wellner, who introduced me to the intriguing notion of bi-log-concavity and who provided several numerical computations during his visit at the Crest-Ensai, that helped to understand the convolution problem for bi-log-concave measures. Many thanks also to Jon for his comments on a previous version of this note.

References

  • [BB05] M. Bagnoli and T. Bergstrom, Log-concave probability and its applications, Econom. Theory 26 (2005), no. 2, 445–469. MR MR2213177
  • [BH97] S. G. Bobkov and C. Houdré, Isoperimetric constants for product probability measures, Ann. Probab. 25 (1997), no. 1, 184–205. MR 1428505
  • [Bob96] S. Bobkov, Extremal properties of half-spaces for log-concave distributions, Ann. Probab. 24 (1996), no. 1, 35–48. MR 1387625 (97e:60027)
  • [Bor75] C. Borell, The Brunn-Minkowski inequality in Gauss space, Invent. Math. 30 (1975), no. 2, 207–216. MR 0399402
  • [Col17] A. Colesanti, Log-concave functions, Convexity and concentration, IMA Vol. Math. Appl., vol. 161, Springer, New York, 2017, pp. 487–524. MR 3837280
  • [DKW17] L. Dümbgen, P. Kolesnyk, and R. A. Wilke, Bi-log-concave distribution functions, J. Statist. Plann. Inference 184 (2017), 1–17. MR 3600702
  • [Gué12] O. Guédon, Concentration phenomena in high dimensional geometry, Proceedings of the Journées MAS 2012 (Clermont-Ferrand, France), 2012.
  • [Led01] M. Ledoux, The concentration of measure phenomenon, Mathematical Surveys and Monographs, vol. 89, American Mathematical Society, Providence, RI, 2001. MR 1849347 (2003k:28019)
  • [LV17] Y. T. Lee and S. S. Vempala, Eldan’s stochastic localization and the KLS hyperplane conjecture: An improved lower bound for expansion, 58th IEEE Annual Symposium on Foundations of Computer Science, FOCS 2017, Berkeley, CA, USA, October 15-17, 2017, 2017, pp. 998–1007.
  • [LV18]  , The Kannan-Lovász-Simonovits conjecture, CoRR abs/1807.03465 (2018).
  • [LW17] N. Laha and J. A. Wellner, Bi-s^*-concave distributions, arXiv:1705.00252, preprint.
  • [Pré73] A. Prékopa, On logarithmic concave measures and functions, Acta Sci. Math. (Szeged) 34 (1973), 335–343. MR 0404557 (53 #8357)
  • [Rud87] W. Rudin, Real and complex analysis, third ed., McGraw-Hill Book Co., New York, 1987. MR 924157 (88k:00002)
  • [Sam18] R. J. Samworth, Recent progress in log-concave density estimation, Statist. Sci. 33 (2018), no. 4, 493–509. MR 3881205
  • [SW14] A. Saumard and J. A. Wellner, Log-concavity and strong log-concavity: A review, Statist. Surv. 8 (2014), 45–114.
  • [Wal09] G. Walther, Inference and modeling with log-concave distributions, Statist. Sci. 24 (2009), no. 3, 319–327. MR 2757433 (2011j:62110)