Explicit expressions for joint moments of n-dimensional elliptical distributions

Inspired by Stein's lemma, we derive two expressions for the joint moments of elliptical distributions. We use two different methods to derive E[X_1^2f(𝐗)] for any measurable function f satisfying some regularity conditions. Then, by applying this result, we obtain new formulae for expectations of products of normally distributed random variables, and also present simplified expressions of E[X_1^2f(𝐗)] for the multivariate Student-t, logistic and Laplace distributions.


1 Introduction and motivation

Stein’s (1981) lemma discusses the determination of Cov(f(X_1), X_2) for a bivariate normal random vector (X_1, X_2), where f is any differentiable function with finite expectation E[f'(X_1)]; the lemma states that Cov(f(X_1), X_2) = Cov(X_1, X_2) E[f'(X_1)]. Inspired by the original work of Stein, several extensions and generalizations have appeared in the literature. Liu (1994) generalized the lemma to the multivariate normal case, while Landsman (2006) showed that a Stein-type lemma also holds when (X_1, X_2) is distributed as bivariate elliptical. This result was extended to multivariate elliptical vectors by Landsman and Neslehova (2008); see also Landsman et al. (2013) for a simple proof. Recently, Shushi (2018) derived the multivariate Stein’s lemma for truncated elliptical random vectors.
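Stein's lemma in its bivariate normal form, Cov(f(X_1), X_2) = Cov(X_1, X_2) E[f'(X_1)], is easy to check numerically. The following is a minimal Monte Carlo sketch; the test function f(x) = x^2 and all parameter values are our own illustrative choices, not from the paper.

```python
import math
import random

# Monte Carlo check of Stein's lemma for a bivariate normal vector:
#   Cov(f(X1), X2) = Cov(X1, X2) * E[f'(X1)],
# illustrated with f(x) = x^2 (so f'(x) = 2x). All parameter values
# below are arbitrary choices for the illustration.
random.seed(42)

mu1, mu2 = 1.0, 0.5            # means of X1 and X2
s11, s22, s12 = 1.0, 1.0, 0.6  # variances and covariance

n = 400_000
samples = []
for _ in range(n):
    x1 = mu1 + math.sqrt(s11) * random.gauss(0.0, 1.0)
    # X2 | X1 is normal with the usual conditional mean and variance
    cond_mean = mu2 + (s12 / s11) * (x1 - mu1)
    cond_var = s22 - s12 ** 2 / s11
    x2 = cond_mean + math.sqrt(cond_var) * random.gauss(0.0, 1.0)
    samples.append((x1, x2))

# Empirical Cov(f(X1), X2) with f(x) = x^2
mean_f = sum(x1 ** 2 for x1, _ in samples) / n
mean_x2 = sum(x2 for _, x2 in samples) / n
lhs = sum((x1 ** 2 - mean_f) * (x2 - mean_x2) for x1, x2 in samples) / n

# Stein's lemma prediction: Cov(X1, X2) * E[2 X1] = 2 * s12 * mu1
rhs = 2 * s12 * mu1

print(lhs, rhs)
```

With f(x) = x^2, the right-hand side reduces to 2·Cov(X_1, X_2)·E[X_1] = 1.2 for the chosen parameters, and the Monte Carlo estimate agrees to within sampling error.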

In this work, we derive expressions of the joint moment E[X_1^2 f(𝐗)] for any measurable function f satisfying some regularity conditions. In particular, we obtain new formulae for the expectation of products of normally distributed random variables, and also present expressions of E[X_1^2 f(𝐗)] for the multivariate Student-t, logistic and Laplace distributions.

The rest of the paper is organized as follows. Section 2 reviews some definitions and properties of the family of elliptical distributions. Section 3 presents explicit expressions for joint moments of elliptical distributions by a direct method. Section 4 derives these joint moments by the use of Stein’s lemma. Section 5 shows the equivalence of the two expressions under the condition that the scale matrix is positive definite. Section 6 presents an expression for the expectation of products of correlated normal variables. Section 7 presents simplified results for the special cases of multivariate Student-t, logistic and Laplace distributions as illustrative examples of the general results established here. Finally, Section 8 gives some concluding remarks.

2 Family of elliptical distributions

Elliptical distributions are generalizations of the multivariate normal distribution and possess many tractable properties, in addition to allowing for fat tails with suitably chosen kernels. This class of distributions was first introduced by Kelker (1970), and has been discussed in detail by Fang et al. (1990) and Kotz et al. (2000). A random vector 𝐗 = (X_1, …, X_n)^⊤ is said to have an elliptically symmetric distribution if its characteristic function has the form

E[exp(i𝐭^⊤𝐗)] = exp(i𝐭^⊤μ) ψ(𝐭^⊤Σ𝐭)

for all 𝐭 ∈ ℝ^n, denoted by 𝐗 ~ E_n(μ, Σ, ψ), where ψ is called the characteristic generator with ψ(0) = 1, μ (an n-dimensional vector) is the location parameter, and Σ (an n × n matrix with Σ ≥ 0) is the dispersion matrix (or scale matrix). The mean vector E[𝐗] (if it exists) coincides with the location vector, and the covariance matrix Cov(𝐗) (if it exists) is −2ψ'(0)Σ. The generator of the multivariate normal distribution, for example, is given by ψ(u) = exp(−u/2).
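For the multivariate normal case, the characteristic-function representation with generator ψ(u) = exp(−u/2) can be illustrated directly: the closed form exp(i t^⊤μ) ψ(t^⊤Σt) should match a Monte Carlo estimate of E[exp(i t^⊤𝐗)]. A short sketch, where μ, Σ and t are arbitrary illustrative choices:

```python
import numpy as np

# For X ~ N(mu, Sigma), the characteristic function factors as
#   E[exp(i t'X)] = exp(i t'mu) * psi(t' Sigma t),
# with characteristic generator psi(u) = exp(-u/2). We compare a
# Monte Carlo estimate against this closed form; mu, Sigma and t
# are arbitrary illustrative choices.
rng = np.random.default_rng(0)

mu = np.array([1.0, -0.5])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.8]])
t = np.array([0.4, -0.2])

X = rng.multivariate_normal(mu, Sigma, size=200_000)
empirical = np.exp(1j * X @ t).mean()

def psi(u):
    return np.exp(-u / 2)  # normal characteristic generator

closed_form = np.exp(1j * t @ mu) * psi(t @ Sigma @ t)

print(abs(empirical - closed_form))  # small Monte Carlo error
```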

In general, the elliptical vector 𝐗 may not have a density function. However, if the density exists, then it is of the form

f_𝐗(𝐱) = (c_n / √|Σ|) g_n((1/2)(𝐱 − μ)^⊤ Σ^{−1} (𝐱 − μ)),   (1)

where μ is an n × 1 location vector, Σ is an n × n positive definite scale matrix, and g_n(x), x ≥ 0, is the density generator of 𝐗. This density generator satisfies the condition

∫_0^∞ x^{n/2−1} g_n(x) dx < ∞,

and the normalizing constant c_n is given by

c_n = (Γ(n/2) / (2π)^{n/2}) [∫_0^∞ x^{n/2−1} g_n(x) dx]^{−1}.   (2)

Two important special cases are the multivariate normal family with g_n(x) = exp(−x), and the multivariate generalized Student-t family with g_n(x) = (1 + x/k_p)^{−p}, where the parameter p > n/2 and k_p is some constant that may depend on n and p.
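As a quick numerical illustration of the normalizing constant in (2): with the normal generator g_n(x) = exp(−x) and n = 2, the integral ∫_0^∞ x^{n/2−1} g_n(x) dx equals Γ(1) = 1, so c_2 = 1/(2π), the familiar constant of the bivariate standard normal density. A sketch using crude quadrature (the grid size and truncation point are arbitrary choices):

```python
import math

# Normalizing constant of an elliptical density,
#   c_n = Gamma(n/2) / (2 pi)^(n/2) * [ integral_0^inf x^(n/2-1) g_n(x) dx ]^(-1),
# specialized to the normal generator g_n(x) = exp(-x) in dimension n = 2.
# The integral then equals Gamma(1) = 1, so c_2 = 1/(2 pi).
n = 2

def g(x):
    return math.exp(-x)  # normal density generator

# crude rectangle-rule quadrature of integral_0^inf x^(n/2-1) g(x) dx,
# truncated at x = 40 with step 1e-3 (both arbitrary choices)
h, upper = 1e-3, 40.0
xs = [h * k for k in range(1, int(upper / h))]
integral = h * sum(x ** (n / 2 - 1) * g(x) for x in xs)

c_n = math.gamma(n / 2) / (2 * math.pi) ** (n / 2) / integral
print(c_n, 1 / (2 * math.pi))  # both approximately 0.1592
```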

To derive the mixed moments of elliptical distributions, we use the cumulative generators Ḡ_n and G̿_n, which are given by

Ḡ_n(x) = ∫_x^∞ g_n(u) du   (3)

and

G̿_n(x) = ∫_x^∞ Ḡ_n(u) du,   (4)

respectively (see Landsman et al. (2018)), and the corresponding normalizing constants are

c_n* = (Γ(n/2) / (2π)^{n/2}) [∫_0^∞ x^{n/2−1} Ḡ_n(x) dx]^{−1}   (5)

and

c_n** = (Γ(n/2) / (2π)^{n/2}) [∫_0^∞ x^{n/2−1} G̿_n(x) dx]^{−1}.   (6)

Throughout this paper, 𝐱 = (x_1, …, x_n)^⊤ will denote an n-dimensional column vector and 𝐱^⊤ its transpose. For an n × n matrix A, |A| is the determinant of A. If A is positive definite, then its Cholesky decomposition is known to be unique.

3 Direct method of derivation

Consider a random vector 𝐗 with mean vector μ and positive definite matrix Σ. Partition 𝐗 into two parts, with m and n − m components, respectively. By the Cholesky decomposition (see Golub and Van Loan (2012), for example), there exists a unique lower triangular matrix B such that Σ = BB^⊤. In terms of components, we have

(7)
(8)
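The Cholesky step above can be sketched numerically: for a positive definite Σ, `numpy.linalg.cholesky` returns the unique lower triangular factor B with Σ = BB^⊤. The matrix below is an arbitrary illustrative choice.

```python
import numpy as np

# Cholesky factorization of a positive definite matrix: there is a
# unique lower triangular B (with positive diagonal) such that
# Sigma = B B^T. The matrix below is an arbitrary illustrative choice.
Sigma = np.array([[4.0, 2.0, 0.6],
                  [2.0, 5.0, 1.5],
                  [0.6, 1.5, 3.0]])

B = np.linalg.cholesky(Sigma)  # lower triangular factor

print(np.allclose(B @ B.T, Sigma))  # True: Sigma = B B^T
print(np.allclose(B, np.tril(B)))   # True: B is lower triangular
print(np.all(np.diag(B) > 0))       # True: positive diagonal (uniqueness)
```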

Let f be a twice continuously differentiable function, and let ∇²f denote the Hessian matrix of f. In addition, we denote

and

Let 𝐗* and 𝐗** be two elliptical random vectors with generators Ḡ_n and G̿_n, respectively.

The following theorem gives an expression for joint moments of elliptical distributions.

Theorem 1.

Let 𝐗 be an n-dimensional elliptical random vector with density generator g_n, positive definite matrix Σ, and finite expectation μ. Further, let f be a twice continuously differentiable function satisfying the required regularity conditions. Let, in addition,

(9)

and

(10)

Then,

(11)

where and .

Proof. By definition,

Now, setting , we obtain

where

and

We then obtain

Now, upon using (7) and (8), we obtain (11), completing the proof of the theorem.

Kan (2008) and Song and Lee (2015) have presented explicit formulae for product moments of multivariate Gaussian random variables. The following corollary presents an explicit expression for product moments of multivariate elliptical random variables, in general.

Corollary 1. Suppose 𝐗 is an n-dimensional elliptical random vector as in Theorem 1. Let k_1, …, k_n be nonnegative integers. Then, we have

(12)

From Corollary 1, we readily deduce the following relations in a special case, for example:
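Relations of this kind can be checked against the classical Gaussian special case (Isserlis' theorem): for a zero-mean normal vector, E[X_1X_2X_3X_4] = σ_{12}σ_{34} + σ_{13}σ_{24} + σ_{14}σ_{23}. A Monte Carlo sketch, with an arbitrarily chosen covariance matrix:

```python
import numpy as np

# Numerical illustration of a classical Gaussian product-moment formula
# (Isserlis' theorem): for a zero-mean normal vector,
#   E[X1 X2 X3 X4] = s12*s34 + s13*s24 + s14*s23.
# The equicorrelated covariance matrix below is an arbitrary choice.
rng = np.random.default_rng(1)

S = np.array([[1.0, 0.5, 0.5, 0.5],
              [0.5, 1.0, 0.5, 0.5],
              [0.5, 0.5, 1.0, 0.5],
              [0.5, 0.5, 0.5, 1.0]])

X = rng.multivariate_normal(np.zeros(4), S, size=1_000_000)
mc = np.prod(X, axis=1).mean()  # Monte Carlo estimate of E[X1 X2 X3 X4]

isserlis = S[0, 1] * S[2, 3] + S[0, 2] * S[1, 3] + S[0, 3] * S[1, 2]  # 0.75
print(mc, isserlis)
```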

4 Derivation by the use of Stein’s lemma

We now derive an expression for E[X_1^2 f(𝐗)] by using Stein’s lemma. It should be noted that the positive definiteness of Σ is not necessary in this case.

Theorem 2.

Suppose , and all the conditions of Theorem 1 hold. Then,

(13)

Proof. By Lemma 2 of Landsman et al. (2013), we have

and so

(14)

Upon replacing by in , we obtain

as required.

5 Equivalence of the two expressions

We shall now establish the equivalence of the two expressions in (11) and (13) under the condition that Σ is positive definite. For this purpose, we will use the following lemma; see, for example, Fang et al. (1990).

Lemma 1.

For any non-negative measurable function f, we have

Proposition 1.

Under the conditions in Theorem 1, we have

where ψ* and ψ** are the characteristic generators corresponding to the density generators Ḡ_n and G̿_n, respectively.

Proof. Let . Then, implies and . It then follows that

by Lemma 1. Thus,

(15)

Now, by using Eq.(9) of Landsman et al. (2013), we have

and hence

(16)

The result then readily follows from (15) and (16).

For , let . Then, and . By using the same arguments as above, we have

and hence

as required.

Remark 1.

From Proposition 1, we find that, when Σ is positive definite, the expressions in (11) and (13) are indeed equivalent. Moreover, for a positive semidefinite matrix Σ, we can rewrite (13) in terms of characteristic generators as follows:

(17)

where ψ* and ψ** are the characteristic generators corresponding to the density generators Ḡ_n and G̿_n, respectively.

6 Product moments of correlated normal random variables

We now derive expressions for E[X_1^2 f(𝐗)] for the multivariate normal distribution, and moments of products of correlated normal random variables.

Suppose 𝐗 ~ N_n(μ, Σ) with density generator g_n(x) = exp(−x) and characteristic generator ψ(u) = exp(−u/2). Note in this case that Ḡ_n(x) = G̿_n(x) = exp(−x), since ∫_x^∞ e^{−u} du = e^{−x}. Then, we have

In general, for any , we have the following recursive relation:
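One concrete recursion of this type, stated here as our own illustrative restatement obtained from Stein's lemma for a zero-mean normal vector (it need not be the exact relation intended here), is E[∏_j X_j^{k_j}] = σ_{ii}(k_i − 1) E[X_i^{k_i−2} ∏_{j≠i} X_j^{k_j}] + Σ_{j≠i} σ_{ij} k_j E[X_i^{k_i−1} X_j^{k_j−1} ∏_{l≠i,j} X_l^{k_l}], for any i with k_i > 0. A small Python implementation:

```python
from functools import lru_cache

# Recursive product moments of a zero-mean multivariate normal vector,
# obtained by applying Stein's lemma to E[X_i * X_i^{k_i-1} prod_j X_j^{k_j}]
# (our own restatement; the paper's recursion may differ in form).
# The covariance matrix S is an arbitrary illustrative choice.
S = ((1.0, 0.5),
     (0.5, 1.0))

@lru_cache(maxsize=None)
def moment(k):
    """E[prod_j X_j^{k[j]}] for a zero-mean normal vector with covariance S."""
    if all(kj == 0 for kj in k):
        return 1.0
    if sum(k) % 2 == 1:
        return 0.0  # odd moments of a zero-mean normal vanish
    i = next(j for j, kj in enumerate(k) if kj > 0)
    total = 0.0
    for j, kj in enumerate(k):
        if j == i:
            if k[i] >= 2:  # term from differentiating X_i^{k_i - 1}
                lower = list(k)
                lower[i] -= 2
                total += S[i][i] * (k[i] - 1) * moment(tuple(lower))
        elif kj > 0:       # term from differentiating X_j^{k_j}
            lower = list(k)
            lower[i] -= 1
            lower[j] -= 1
            total += S[i][j] * kj * moment(tuple(lower))
    return total

print(moment((4, 0)))  # 3 * s11^2 = 3.0
print(moment((2, 2)))  # s11*s22 + 2*s12^2 = 1.5
```

The two printed values match the known closed forms E[X_1^4] = 3σ_{11}^2 and E[X_1^2 X_2^2] = σ_{11}σ_{22} + 2σ_{12}^2.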