# Explicit expressions for joint moments of n-dimensional elliptical distributions

Inspired by Stein's lemma, we derive two expressions for the joint moment E[X_1^2 f(𝐗)] of elliptical distributions, valid for any measurable function f satisfying some regularity conditions, using two different methods. By applying this result, we obtain new formulae for expectations of products of normally distributed random variables, and we also present simplified expressions of E[X_1^2 f(𝐗)] for the multivariate Student-t, logistic and Laplace distributions.

## 1 Introduction and motivation

Stein’s (1981) lemma concerns the determination of $\mathrm{Cov}(X_1, h(X_2))$ for a bivariate normal random vector $(X_1, X_2)^T$, where $h$ is any differentiable function with finite expectation $E|h'(X_2)|$; the lemma states that $\mathrm{Cov}(X_1, h(X_2)) = \mathrm{Cov}(X_1, X_2)\, E[h'(X_2)]$. Inspired by the original work of Stein, several extensions and generalizations have appeared in the literature. Liu (1994) generalized the lemma to the multivariate normal case, while Landsman (2006) showed that a Stein-type lemma also holds when $(X_1, X_2)^T$ is distributed as bivariate elliptical. This result was extended to multivariate elliptical vectors by Landsman and Neslehova (2008); see also Landsman et al. (2013) for a simple proof. Recently, Shushi (2018) derived the multivariate Stein's lemma for truncated elliptical random vectors.

In this work, we derive expressions for the joint moment $E[X_1^2 f(X)]$ for any measurable function $f$ satisfying some regularity conditions. In particular, we obtain new formulae for the expectation of products of normally distributed random variables, and also present expressions of $E[X_1^2 f(X)]$ for the multivariate Student-t, logistic and Laplace distributions.

The rest of the paper is organized as follows. Section 2 reviews some definitions and properties of the family of elliptical distributions. Section 3 presents explicit expressions for joint moments of elliptical distributions by a direct method. Section 4 derives these joint moments by the use of Stein's lemma. Section 5 shows the equivalence of the two expressions under the condition that the scale matrix is positive definite. Section 6 presents an expression for the expectation of products of correlated normal random variables. Section 7 presents simplified results for the special cases of the multivariate Student-t, logistic and Laplace distributions as illustrative examples of the general result established here. Finally, Section 8 gives some concluding remarks.

## 2 Family of elliptical distributions

Elliptical distributions are generalizations of the multivariate normal distribution and possess many tractable properties, in addition to allowing fat tails with suitably chosen kernels. This class of distributions was first introduced by Kelker (1970), and has been discussed in detail by Fang et al. (1990) and Kotz et al. (2000). A random vector

$X = (X_1, \dots, X_n)^T$ is said to have an elliptically symmetric distribution if its characteristic function has the form

$$E[\exp(\mathrm{i}\, t^T X)] = e^{\mathrm{i}\, t^T \mu}\, \phi\!\left(\tfrac{1}{2} t^T \Sigma t\right)$$

for all $t \in \mathbb{R}^n$, denoted $X \sim E_n(\mu, \Sigma, \phi)$, where $\phi$ is called the characteristic generator with $\phi(0) = 1$, $\mu$ (an $n$-dimensional vector) is the location parameter, and $\Sigma$ (an $n \times n$ nonnegative definite matrix) is the dispersion matrix (or scale matrix). The mean vector $E[X]$ (if it exists) coincides with the location vector $\mu$, and the covariance matrix $\mathrm{Cov}(X)$ (if it exists) is $-\phi'(0)\Sigma$. The generator of the multivariate normal distribution, for example, is given by $\phi(u) = e^{-u}$.
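As an illustration (assumed, not from the paper), the elliptical form above with the normal generator $\phi(u) = e^{-u}$ must reproduce the classical normal characteristic function $\exp(\mathrm{i}\, t^T \mu - \tfrac{1}{2} t^T \Sigma t)$; a quick deterministic check:

```python
import numpy as np

def elliptical_cf(t, mu, Sigma, phi):
    """Characteristic function in elliptical form: e^{i t'mu} phi(0.5 t'Sigma t)."""
    return np.exp(1j * t @ mu) * phi(0.5 * t @ Sigma @ t)

mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
t = np.array([0.3, -0.7])

lhs = elliptical_cf(t, mu, Sigma, lambda u: np.exp(-u))  # elliptical form, normal generator
rhs = np.exp(1j * t @ mu - 0.5 * t @ Sigma @ t)          # classical N(mu, Sigma) cf
assert np.isclose(lhs, rhs)
```

The same helper works for any other characteristic generator, e.g. a logistic or Laplace one, by swapping the `phi` argument.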

In general, the elliptical vector $X$ may not have a density function. However, if the density exists, then it is of the form

$$f_X(x) = \frac{c_n}{\sqrt{|\Sigma|}}\, g_n\!\left(\tfrac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right), \qquad x \in \mathbb{R}^n, \tag{1}$$

where $\mu$ is an $n \times 1$ location vector, $\Sigma$ is an $n \times n$ positive definite scale matrix, and $g_n(u)$, $u \geq 0$, is the density generator of $X$. This density generator satisfies the condition

$$\int_0^\infty t^{(n/2)-1} g_n(t)\, dt < \infty,$$

and the normalizing constant is given by

$$c_n = \frac{\Gamma(n/2)}{(2\pi)^{n/2}} \left[\int_0^\infty t^{(n/2)-1} g_n(t)\, dt\right]^{-1}. \tag{2}$$

Two important special cases are the multivariate normal family with $g_n(u) = e^{-u}$, and the multivariate generalized Student-t family with $g_n(u) = \left(1 + u/k_p\right)^{-p}$, where the parameter $p > n/2$ and $k_p$ is some constant that may depend on $n$ and $p$.
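To make the generator representation concrete, here is a sketch (an assumed illustration, not from the paper) checking that density (1) with normalizing constant (2) and the normal generator $g_n(u) = e^{-u}$ recovers the usual $N(\mu, \Sigma)$ density:

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.special import gamma
from scipy.integrate import quad

n = 2
mu = np.array([0.5, -1.0])
Sigma = np.array([[1.5, 0.3],
                  [0.3, 0.8]])

g = lambda u: np.exp(-u)                                 # normal density generator
integral, _ = quad(lambda t: t**(n/2 - 1) * g(t), 0, np.inf)
c_n = gamma(n/2) / (2*np.pi)**(n/2) / integral           # normalizing constant (2)

x = np.array([0.2, 0.1])
d = x - mu
q = 0.5 * d @ np.linalg.solve(Sigma, d)                  # quadratic form in (1)
f_elliptical = c_n / np.sqrt(np.linalg.det(Sigma)) * g(q)

# Must agree with the standard multivariate normal pdf.
assert np.isclose(f_elliptical, multivariate_normal(mu, Sigma).pdf(x))
```

Replacing `g` by the Student-t-type generator (with its own `c_n` recomputed from (2)) gives the corresponding heavy-tailed density.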

To derive the mixed moments of elliptical distributions, we use the cumulative generators $\bar{G}_n$ and $\bar{\bar{G}}_n$, which are given by

$$\bar{G}_n(u) = \int_u^\infty g_n(v)\, dv \tag{3}$$

and

$$\bar{\bar{G}}_n(u) = \int_u^\infty \bar{G}_n(v)\, dv, \tag{4}$$

respectively (see Landsman et al. (2018)), and the corresponding normalizing constants are

$$c_n^* = \frac{\Gamma(n/2)}{(2\pi)^{n/2}} \left[\int_0^\infty t^{n/2-1} \bar{G}_n(t)\, dt\right]^{-1} \tag{5}$$

and

$$c_n^{**} = \frac{\Gamma(n/2)}{(2\pi)^{n/2}} \left[\int_0^\infty t^{n/2-1} \bar{\bar{G}}_n(t)\, dt\right]^{-1}. \tag{6}$$
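A small numerical sketch (assumed illustration) of definitions (3)-(6): for the normal generator $g_n(u) = e^{-u}$, both cumulative generators are again $e^{-u}$, so the constants (2), (5) and (6) all coincide.

```python
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

n = 3
g = lambda u: np.exp(-u)
G_bar = lambda u: quad(g, u, np.inf)[0]        # cumulative generator (3)
G_dbar = lambda u: quad(G_bar, u, np.inf)[0]   # cumulative generator (4)

# For the normal generator, both cumulative generators equal exp(-u).
assert np.isclose(G_bar(1.3), np.exp(-1.3))
assert np.isclose(G_dbar(0.7), np.exp(-0.7))

# Normalizing constants (2), (5), (6) share one formula with different generators.
norm_const = lambda h: gamma(n/2) / (2*np.pi)**(n/2) \
    / quad(lambda t: t**(n/2 - 1) * h(t), 0, np.inf)[0]
c_n, c_star, c_dstar = norm_const(g), norm_const(G_bar), norm_const(G_dbar)
assert np.isclose(c_n, c_star) and np.isclose(c_star, c_dstar)
```

For heavy-tailed generators such as the Student-t one, $\bar{G}_n$ and $\bar{\bar{G}}_n$ differ from $g_n$, which is exactly what drives the $X^*$, $X^{**}$ vectors below.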

Throughout this paper, $x$ will denote an $n$-dimensional column vector and $x^T$ its transpose. For an $n \times n$ matrix $A$, $|A|$ is the determinant of $A$. If $\Sigma$ is positive definite, then its Cholesky decomposition $\Sigma = AA^T$, with $A$ lower triangular, is known to be unique.

## 3 Direct method of derivation

Consider a random vector $X \sim E_n(\mu, \Sigma, g_n)$ with mean vector $\mu$ and positive definite matrix $\Sigma$. Partition $X$ into two parts as $X = (X_1, X_{(2)}^T)^T$ with $1$ and $n-1$ components, respectively. By the Cholesky decomposition (see Golub and Van Loan (2012), for example), there exists a unique lower triangular matrix $A = (a_{ik})$ such that $\Sigma = AA^T$. In terms of components, we have

$$a_{11} = \sqrt{\sigma_{11}}, \qquad a_{i1} = \frac{\sigma_{i1}}{a_{11}}, \quad i = 2, \dots, n, \qquad a_{ik} = 0, \quad i < k.$$
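The first-column formulas above can be checked directly against a numerical Cholesky factor (a sketch with an arbitrary test matrix):

```python
import numpy as np

Sigma = np.array([[4.0, 2.0, 1.0],
                  [2.0, 3.0, 0.5],
                  [1.0, 0.5, 2.0]])
A = np.linalg.cholesky(Sigma)              # unique lower-triangular A with Sigma = A A^T

a11 = np.sqrt(Sigma[0, 0])                 # a_11 = sqrt(sigma_11)
assert np.isclose(A[0, 0], a11)
for i in range(1, 3):
    assert np.isclose(A[i, 0], Sigma[i, 0] / a11)   # a_i1 = sigma_i1 / a_11
assert np.allclose(A, np.tril(A))          # a_ik = 0 for i < k
assert np.allclose(A @ A.T, Sigma)
```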

Let $f$ be a twice continuously differentiable function, and we shall use $\left(\nabla_{i,j} f(x)\right)_{i,j=1}^n$ to denote the Hessian matrix of $f$. In addition, we denote

$$\nabla_{i,j} f(x) = \frac{\partial^2 f(x)}{\partial x_i \partial x_j}, \qquad \nabla_i f(x) = \frac{\partial f(x)}{\partial x_i}, \qquad i, j = 1, 2, \dots, n,$$

and

$$\nabla f(x) = \left(\frac{\partial f(x)}{\partial x_1}, \frac{\partial f(x)}{\partial x_2}, \dots, \frac{\partial f(x)}{\partial x_n}\right)^T.$$

Let $X^* \sim E_n(\mu, \Sigma, \bar{G}_n)$ and $X^{**} \sim E_n(\mu, \Sigma, \bar{\bar{G}}_n)$ be two elliptical random vectors with generators $\bar{G}_n$ and $\bar{\bar{G}}_n$, respectively.

The following theorem gives an expression for joint moments of elliptical distributions.

###### Theorem 1.

Let $X \sim E_n(\mu, \Sigma, g_n)$ be an $n$-dimensional elliptical random vector with density generator $g_n$, positive definite matrix $\Sigma$, and finite mean vector $\mu$. Further, let $f$ be a twice continuously differentiable function satisfying $E|f(X^*)| < \infty$ and $E|\nabla_{i,j} f(X^{**})| < \infty$ for $i, j = 1, 2, \dots, n$. Let, in addition,

$$\lim_{|x_1| \to \infty} x_1 f(Ax + \mu)\, \bar{G}_n\!\left(\tfrac{1}{2} x^T x\right) = 0 \tag{9}$$

and

$$\lim_{|x_1| \to \infty} \left[\nabla_1 f(Ax + \mu)\right] \bar{\bar{G}}_n\!\left(\tfrac{1}{2} x^T x\right) = 0. \tag{10}$$

Then,

$$\begin{aligned} E[X_1^2 f(X)] &= \sigma_{11} b_n^* E[f(X^*)] + b_n^{**} \sum_{i=1}^n \sum_{j=1}^n \sigma_{i1} \sigma_{j1} E[\nabla_{i,j} f(X^{**})] \\ &\quad + 2\mu_1 b_n^* \sum_{i=1}^n \sigma_{i1} E[\nabla_i f(X^*)] + \mu_1^2 E[f(X)], \end{aligned} \tag{11}$$

where $b_n^* = c_n / c_n^*$ and $b_n^{**} = c_n / c_n^{**}$.

Proof. By definition,

$$\begin{aligned} E[X_1^2 f(X)] &= \frac{c_n}{\sqrt{|\Sigma|}} \int_{\mathbb{R}^n} x_1^2 f(x)\, g_n\!\left\{\tfrac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right\} dx \\ &= \frac{c_n}{\sqrt{|\Sigma|}} \int_{\mathbb{R}^n} x_1^2 f(x)\, g_n\!\left\{\tfrac{1}{2}(x-\mu)^T (AA^T)^{-1} (x-\mu)\right\} dx. \end{aligned}$$

Now, setting $y = A^{-1}(x - \mu)$, we obtain

$$\begin{aligned} E[X_1^2 f(X)] &= \frac{c_n |A|}{\sqrt{|\Sigma|}} \int_{\mathbb{R}^n} (a_{11} y_1 + \mu_1)^2 f(Ay + \mu)\, g_n\!\left\{\tfrac{1}{2} y^T y\right\} dy \\ &= c_n \int_{\mathbb{R}^n} (a_{11} y_1 + \mu_1)^2 f(Ay + \mu)\, g_n\!\left\{\tfrac{1}{2} y^T y\right\} dy \\ &= a_{11}^2 c_n \int_{\mathbb{R}^n} y_1^2 f(Ay + \mu)\, g_n\!\left\{\tfrac{1}{2} y^T y\right\} dy + 2 a_{11} \mu_1 c_n \int_{\mathbb{R}^n} y_1 f(Ay + \mu)\, g_n\!\left\{\tfrac{1}{2} y^T y\right\} dy + \mu_1^2 E[f(X)] \\ &= a_{11}^2 c_n \int_{\mathbb{R}^{n-1}} I_1\, dy_{(2)} + 2 a_{11} \mu_1 c_n \int_{\mathbb{R}^{n-1}} I_2\, dy_{(2)} + \mu_1^2 E[f(X)], \end{aligned}$$

where $y_{(2)} = (y_2, \dots, y_n)^T$,

$$\begin{aligned} I_1 &= \int_{\mathbb{R}} y_1^2 f(Ay + \mu)\, g_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1 \\ &= -\int_{\mathbb{R}} y_1 f(Ay + \mu)\, \frac{\partial}{\partial y_1} \bar{G}_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1 \\ &= \int_{\mathbb{R}} \left[f(Ay + \mu) + y_1 \nabla_1 f(Ay + \mu)\right] \bar{G}_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1 \\ &= \int_{\mathbb{R}} f(Ay + \mu)\, \bar{G}_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1 - \int_{\mathbb{R}} \nabla_1 f(Ay + \mu)\, \frac{\partial}{\partial y_1} \bar{\bar{G}}_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1 \\ &= \int_{\mathbb{R}} f(Ay + \mu)\, \bar{G}_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1 + \int_{\mathbb{R}} \nabla_{1,1} f(Ay + \mu)\, \bar{\bar{G}}_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1, \end{aligned}$$

where the third and last equalities follow by integration by parts, with the boundary terms vanishing due to conditions (9) and (10), respectively,

and

$$\begin{aligned} I_2 &= \int_{\mathbb{R}} y_1 f(Ay + \mu)\, g_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1 \\ &= -\int_{\mathbb{R}} f(Ay + \mu)\, \frac{\partial}{\partial y_1} \bar{G}_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1 \\ &= \int_{\mathbb{R}} \nabla_1 f(Ay + \mu)\, \bar{G}_n\!\left\{\tfrac{1}{2} y^T y\right\} dy_1. \end{aligned}$$

We then obtain

$$\begin{aligned} E[X_1^2 f(X)] &= a_{11}^2 \left\{ b_n^* E[f(X^*)] + b_n^{**} A_1^T \left(E[\nabla_{i,j} f(X^{**})]\right)_{i,j=1}^n A_1 \right\} \\ &\quad + 2 a_{11} \mu_1 b_n^* A_1^T E[\nabla f(X^*)] + \mu_1^2 E[f(X)], \end{aligned}$$

where $A_1 = (a_{11}, a_{21}, \dots, a_{n1})^T$ denotes the first column of $A$. Now, upon using (7) and (8), we obtain (11), completing the proof of the theorem.
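A quick sanity check of (11) in a special case I can evaluate in closed form (an assumed illustration, not part of the proof): for a zero-mean bivariate normal, $g_n(u) = e^{-u}$ gives $X^* \stackrel{d}{=} X^{**} \stackrel{d}{=} X$ and $b_n^* = b_n^{**} = 1$, and with $f(x) = x_2^2$ the right-hand side of (11) reduces to $\sigma_{11}\sigma_{22} + 2\sigma_{12}^2$, which by Isserlis' theorem equals $E[X_1^2 X_2^2]$.

```python
import numpy as np

Sigma = np.array([[2.0, 0.7],
                  [0.7, 1.5]])

# Right-hand side of (11) with mu = 0, f(x) = x_2^2, b* = b** = 1:
# sigma_11 E[f(X)] + sum_{i,j} sigma_i1 sigma_j1 E[Hess f]_{ij};
# the Hessian of x_2^2 is constant with a single nonzero entry (2,2) equal to 2.
rhs = Sigma[0, 0] * Sigma[1, 1] + Sigma[1, 0] ** 2 * 2

# Left-hand side E[X_1^2 X_2^2] estimated by Monte Carlo.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(2), Sigma, size=400_000)
lhs_mc = np.mean(X[:, 0] ** 2 * X[:, 1] ** 2)

assert np.isclose(rhs, Sigma[0, 0] * Sigma[1, 1] + 2 * Sigma[0, 1] ** 2)  # Isserlis
assert np.isclose(lhs_mc, rhs, rtol=0.05)
```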

Kan (2008) and Song and Lee (2015) have presented explicit formulae for product moments of multivariate Gaussian random variables. The following corollary presents an explicit expression for product moments of multivariate elliptical random variables, in general.

###### Corollary 1.

Suppose $X \sim E_n(\mu, \Sigma, g_n)$, and the conditions of Theorem 1 hold. Let $p_1, p_2, \dots, p_n$ be nonnegative integers with $p_1 \geq 2$. Then, we have

$$\begin{aligned} E\left[\prod_{i=1}^n X_i^{p_i}\right] &= b_n^* \sigma_{11} E\left[(X_1^*)^{p_1-2} \prod_{k=2}^n (X_k^*)^{p_k}\right] + \mu_1^2 E\left[X_1^{p_1-2} \prod_{k=2}^n X_k^{p_k}\right] \\ &\quad + b_n^{**} \Bigg\{ \sigma_{11}^2 (p_1-2)(p_1-3)\, E\left[(X_1^{**})^{p_1-4} \prod_{k=2}^n (X_k^{**})^{p_k}\right] \\ &\qquad + 2 \sum_{j=2}^n \sigma_{11} \sigma_{j1} (p_1-2) p_j\, E\left[(X_1^{**})^{p_1-3} (X_j^{**})^{p_j-1} \prod_{k=2,\, k \neq j}^n (X_k^{**})^{p_k}\right] \\ &\qquad + \sum_{j=2}^n \sigma_{j1}^2 p_j (p_j-1)\, E\left[(X_1^{**})^{p_1-2} (X_j^{**})^{p_j-2} \prod_{k=2,\, k \neq j}^n (X_k^{**})^{p_k}\right] \\ &\qquad + \sum_{j=2}^n \sum_{i=2,\, i \neq j}^n \sigma_{j1} \sigma_{i1} p_j p_i\, E\left[(X_i^{**})^{p_i-1} (X_j^{**})^{p_j-1} (X_1^{**})^{p_1-2} \prod_{k=2,\, k \neq i, j}^n (X_k^{**})^{p_k}\right] \Bigg\} \\ &\quad + 2 b_n^* \mu_1 \Bigg\{ \sigma_{11} (p_1-2)\, E\left[(X_1^*)^{p_1-3} \prod_{k=2}^n (X_k^*)^{p_k}\right] + \sum_{i=2}^n \sigma_{i1} p_i\, E\left[(X_1^*)^{p_1-2} (X_i^*)^{p_i-1} \prod_{k=2,\, k \neq i}^n (X_k^*)^{p_k}\right] \Bigg\}. \end{aligned} \tag{12}$$

From Corollary 1, we readily deduce the following relation for the special case of $p_1 = 3$, for example:

$$\begin{aligned} E\left[X_1^3 \prod_{i=2}^n X_i^{p_i}\right] &= b_n^* \sigma_{11} E\left[X_1^* \prod_{k=2}^n (X_k^*)^{p_k}\right] + \mu_1^2 E\left[X_1 \prod_{k=2}^n X_k^{p_k}\right] \\ &\quad + b_n^{**} \Bigg\{ 2 \sum_{j=2}^n \sigma_{11} \sigma_{j1} p_j\, E\left[(X_j^{**})^{p_j-1} \prod_{k=2,\, k \neq j}^n (X_k^{**})^{p_k}\right] \\ &\qquad + \sum_{j=2}^n \sigma_{j1}^2 p_j (p_j-1)\, E\left[X_1^{**} (X_j^{**})^{p_j-2} \prod_{k=2,\, k \neq j}^n (X_k^{**})^{p_k}\right] \\ &\qquad + \sum_{j=2}^n \sum_{i=2,\, i \neq j}^n \sigma_{j1} \sigma_{i1} p_j p_i\, E\left[(X_i^{**})^{p_i-1} (X_j^{**})^{p_j-1} X_1^{**} \prod_{k=2,\, k \neq i, j}^n (X_k^{**})^{p_k}\right] \Bigg\} \\ &\quad + 2 b_n^* \mu_1 \Bigg\{ \sigma_{11} E\left[\prod_{k=2}^n (X_k^*)^{p_k}\right] + \sum_{i=2}^n \sigma_{i1} p_i\, E\left[X_1^* (X_i^*)^{p_i-1} \prod_{k=2,\, k \neq i}^n (X_k^*)^{p_k}\right] \Bigg\}. \end{aligned}$$
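In the zero-mean bivariate normal case (an assumed illustration, where $b_n^* = b_n^{**} = 1$ and $X^*$, $X^{**}$ have the same distribution as $X$), the relation with $p_1 = 3$, $p_2 = 1$ reduces to $E[X_1^3 X_2] = \sigma_{11}\sigma_{12} + 2\sigma_{11}\sigma_{21} = 3\sigma_{11}\sigma_{12}$, which matches Isserlis' theorem; a Monte Carlo check:

```python
import numpy as np

Sigma = np.array([[1.2, 0.4],
                  [0.4, 0.9]])

# Corollary formula for mu = 0, n = 2, p_1 = 3, p_2 = 1, normal case:
# E[X_1^3 X_2] = sigma_11 E[X_1 X_2] + 2 sigma_11 sigma_21 * E[1].
formula = Sigma[0, 0] * Sigma[0, 1] + 2 * Sigma[0, 0] * Sigma[1, 0]
assert np.isclose(formula, 3 * Sigma[0, 0] * Sigma[0, 1])   # Isserlis' theorem

rng = np.random.default_rng(1)
X = rng.multivariate_normal(np.zeros(2), Sigma, size=500_000)
mc = np.mean(X[:, 0] ** 3 * X[:, 1])
assert np.isclose(mc, formula, rtol=0.05)
```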

## 4 Derivation by the use of Stein’s lemma

We now derive an expression for $E[X_1^2 f(X)]$ by using Stein's lemma. It should be noted that the positive definiteness of $\Sigma$ is not necessary in this case.

###### Theorem 2.

Suppose $X \sim E_n(\mu, \Sigma, g_n)$, and all the conditions of Theorem 1 hold. Then,

$$\begin{aligned} E[X_1^2 f(X)] &= \sum_{i=1}^n \sum_{j=1}^n \mathrm{Cov}(X_1, X_i)\, \mathrm{Cov}(X_1^*, X_j^*)\, E[\nabla_{i,j} f(X^{**})] \\ &\quad + 2\mu_1 \sum_{i=1}^n \mathrm{Cov}(X_1, X_i)\, E[\nabla_i f(X^*)] \\ &\quad + \mathrm{Cov}(X_1, X_1)\, E[f(X^*)] + \mu_1^2 E[f(X)]. \end{aligned} \tag{13}$$

Proof. By Lemma 2 of Landsman et al. (2013), we have

$$\mathrm{Cov}(X_1, f(X)) = \sum_{i=1}^n \mathrm{Cov}(X_1, X_i)\, E[\nabla_i f(X^*)],$$

and so

$$\begin{aligned} E[X_1 f(X)] &= \mathrm{Cov}(X_1, f(X)) + E[X_1]\, E[f(X)] \\ &= \sum_{i=1}^n \mathrm{Cov}(X_1, X_i)\, E[\nabla_i f(X^*)] + E[X_1]\, E[f(X)]. \end{aligned} \tag{14}$$

Upon replacing $f(x)$ by $x_1 f(x)$ in (14), we obtain

$$\begin{aligned} E[X_1^2 f(X)] &= \sum_{i=1}^n \mathrm{Cov}(X_1, X_i)\, E[\nabla_i (X_1^* f(X^*))] + E[X_1]\, E[X_1 f(X)] \\ &= \sum_{i=1}^n \mathrm{Cov}(X_1, X_i)\, E[X_1^* \nabla_i f(X^*)] + \mathrm{Cov}(X_1, X_1)\, E[f(X^*)] + E[X_1]\, E[X_1 f(X)] \\ &= \sum_{i=1}^n \mathrm{Cov}(X_1, X_i) \left\{ \sum_{j=1}^n \mathrm{Cov}(X_1^*, X_j^*)\, E[\nabla_{i,j} f(X^{**})] + E[X_1^*]\, E[\nabla_i f(X^*)] \right\} \\ &\quad + \mathrm{Cov}(X_1, X_1)\, E[f(X^*)] + E[X_1] \left\{ \sum_{i=1}^n \mathrm{Cov}(X_1, X_i)\, E[\nabla_i f(X^*)] + E[X_1]\, E[f(X)] \right\} \\ &= \sum_{i=1}^n \sum_{j=1}^n \mathrm{Cov}(X_1, X_i)\, \mathrm{Cov}(X_1^*, X_j^*)\, E[\nabla_{i,j} f(X^{**})] + (E[X_1])^2 E[f(X)] \\ &\quad + 2 E[X_1] \sum_{i=1}^n \mathrm{Cov}(X_1, X_i)\, E[\nabla_i f(X^*)] + \mathrm{Cov}(X_1, X_1)\, E[f(X^*)], \end{aligned}$$

as required.
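The Stein-type relation (14), the engine of the proof above, can be verified deterministically in the normal case (an assumed illustration, where $X^* \stackrel{d}{=} X$ and $\mathrm{Cov}(X_1, X_i) = \sigma_{1i}$) with $f(x) = \exp(a^T x)$: then $E[X_1 f(X)]$ is the $a_1$-derivative of the moment generating function $M(a) = \exp(a^T \mu + \tfrac{1}{2} a^T \Sigma a)$, computed here by central differences.

```python
import numpy as np

mu = np.array([0.5, -1.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
a = np.array([0.2, -0.1])

M = lambda a: np.exp(a @ mu + 0.5 * a @ Sigma @ a)   # normal mgf

h = 1e-6
e1 = np.array([1.0, 0.0])
lhs = (M(a + h * e1) - M(a - h * e1)) / (2 * h)      # E[X_1 exp(a^T X)]

# Right-hand side of (14): sum_i sigma_1i E[df/dx_i] + mu_1 E[f],
# with E[df/dx_i] = a_i M(a) and E[f] = M(a) for this exponential f.
rhs = (Sigma[0] @ a) * M(a) + mu[0] * M(a)
assert np.isclose(lhs, rhs, rtol=1e-6)
```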

## 5 Equivalence of the two expressions

We shall now establish the equivalence of the two expressions in (11) and (13) under the condition that $\Sigma$ is positive definite. For this purpose, we will use the following lemma; see, for example, Fang et al. (1990).

###### Lemma 1.

For any non-negative measurable function $f$, we have

$$\int_{\mathbb{R}^n} f\!\left(\tfrac{1}{2} y^T y\right) dy = \frac{(2\pi)^{n/2}}{\Gamma(n/2)} \int_0^\infty u^{n/2-1} f(u)\, du.$$
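A numerical sketch (assumed illustration) of Lemma 1 for $n = 2$ with $f(u) = e^{-u}$: the left side is the Gaussian integral over $\mathbb{R}^2$ and the right side a one-dimensional radial integral, both equal to $2\pi$.

```python
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad, dblquad

n = 2
f = lambda u: np.exp(-u)

# Left side: integral of f(0.5 * |y|^2) over R^2.
lhs, _ = dblquad(lambda y2, y1: f(0.5 * (y1**2 + y2**2)),
                 -np.inf, np.inf, -np.inf, np.inf)

# Right side: radial form from Lemma 1.
rhs = (2*np.pi)**(n/2) / gamma(n/2) * quad(lambda u: u**(n/2 - 1) * f(u), 0, np.inf)[0]

assert np.isclose(lhs, rhs, rtol=1e-6)
assert np.isclose(lhs, 2*np.pi, rtol=1e-6)
```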
###### Proposition 1.

Under the conditions in Theorem 1, we have

$$\frac{c_n}{c_n^*} = -\phi'(0), \qquad \frac{c_n^*}{c_n^{**}} = -\phi^{*\prime}(0),$$

where $\phi$ and $\phi^*$ are the characteristic generators corresponding to the density generators $g_n$ and $\bar{G}_n$, respectively.

Proof. Let $Y \sim E_n(0, I_n, g_n)$ with characteristic generator $\phi$. Then, $E[Y] = 0$ and $\mathrm{Cov}(Y) = -\phi'(0) I_n$. It then follows that

$$E(Y^T Y) = -n\phi'(0) = c_n \int_{\mathbb{R}^n} y^T y\, g_n\!\left(\tfrac{1}{2} y^T y\right) dy = \frac{2 c_n (2\pi)^{n/2}}{\Gamma(n/2)} \int_0^\infty t^{n/2} g_n(t)\, dt,$$

by Lemma 1. Thus,

$$c_n = \frac{-n\phi'(0)\, \Gamma(n/2)}{2 (2\pi)^{n/2} \int_0^\infty t^{n/2} g_n(t)\, dt}. \tag{15}$$

Now, by using Eq.(9) of Landsman et al. (2013), we have

$$1 = c_n^* \int_{\mathbb{R}^n} \bar{G}_n\!\left(\tfrac{1}{2} y^T y\right) dy = \frac{c_n^* (2\pi)^{n/2}}{\Gamma(n/2+1)} \int_0^\infty t^{n/2} g_n(t)\, dt,$$

and hence

$$c_n^* = \frac{\Gamma(n/2+1)}{(2\pi)^{n/2} \int_0^\infty t^{n/2} g_n(t)\, dt}. \tag{16}$$

The result that $c_n / c_n^* = -\phi'(0)$ then readily follows from (15) and (16).

For the second relation, let $Z \sim E_n(0, I_n, \bar{G}_n)$ with characteristic generator $\phi^*$. Then, $E[Z] = 0$ and $\mathrm{Cov}(Z) = -\phi^{*\prime}(0) I_n$. By using the same arguments as above, we have

$$E(Z^T Z) = -n\phi^{*\prime}(0) = c_n^* \int_{\mathbb{R}^n} z^T z\, \bar{G}_n\!\left(\tfrac{1}{2} z^T z\right) dz = \frac{2 c_n^* (2\pi)^{n/2}}{\Gamma(n/2)} \int_0^\infty t^{n/2} \bar{G}_n(t)\, dt = \frac{n c_n^* (2\pi)^{n/2}}{\Gamma(n/2)} \int_0^\infty t^{n/2-1} \bar{\bar{G}}_n(t)\, dt = \frac{n c_n^*}{c_n^{**}},$$

and hence

$$\frac{c_n^*}{c_n^{**}} = -\phi^{*\prime}(0),$$

as required.

###### Remark 1.

From Proposition 1, we find that, when $\Sigma$ is positive definite, the expressions in (11) and (13) are indeed equivalent. Moreover, for a positive semidefinite matrix $\Sigma$, we can rewrite (13) in terms of characteristic generators as follows:

$$\begin{aligned} E[X_1^2 f(X)] &= \phi'(0)\, \phi^{*\prime}(0) \sum_{i=1}^n \sum_{j=1}^n \sigma_{1i} \sigma_{1j} E[\nabla_{i,j} f(X^{**})] \\ &\quad - 2\mu_1 \phi'(0) \sum_{i=1}^n \sigma_{1i} E[\nabla_i f(X^*)] \\ &\quad - \phi'(0)\, \sigma_{11} E[f(X^*)] + \mu_1^2 E[f(X)], \end{aligned} \tag{17}$$

where $\phi$ and $\phi^*$ are the characteristic generators corresponding to the density generators $g_n$ and $\bar{G}_n$, respectively.

## 6 Product moments of correlated normal random variables

We now derive expressions for $E[X_1^2 f(X)]$ for the multivariate normal distribution, as well as moments of products of correlated normal random variables.

Suppose $X \sim N_n(\mu, \Sigma)$ with density generator $g_n(u) = e^{-u}$ and characteristic generator $\phi(u) = e^{-u}$. Note in this case that $\bar{G}_n(u) = \bar{\bar{G}}_n(u) = e^{-u}$, $b_n^* = b_n^{**} = 1$, and $X^*$ and $X^{**}$ have the same distribution as $X$. Then, we have

$$\begin{aligned} E[X_1^2 f(X)] &= \sum_{i=1}^n \sum_{j=1}^n \sigma_{1i} \sigma_{1j} E[\nabla_{i,j} f(X)] + 2\mu_1 \sum_{i=1}^n \sigma_{1i} E[\nabla_i f(X)] \\ &\quad + \sigma_{11} E[f(X)] + \mu_1^2 E[f(X)]. \end{aligned}$$
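This normal-case formula admits a fully deterministic check with the (assumed, illustrative) choice $f(x) = \exp(a^T x)$: then $E[X_1^2 f(X)]$ is the second $a_1$-derivative of the moment generating function $M(a) = \exp(a^T \mu + \tfrac{1}{2} a^T \Sigma a)$, while on the right-hand side $E[\nabla_{i,j} f(X)] = a_i a_j M(a)$ and $E[\nabla_i f(X)] = a_i M(a)$.

```python
import numpy as np

mu = np.array([0.4, -0.6])
Sigma = np.array([[1.1, 0.5],
                  [0.5, 0.9]])
a = np.array([0.3, 0.2])

M = lambda a: np.exp(a @ mu + 0.5 * a @ Sigma @ a)   # normal mgf

# Exact second a_1-derivative of M: ((mu + Sigma a)_1^2 + sigma_11) M(a).
lhs = ((mu + Sigma @ a)[0] ** 2 + Sigma[0, 0]) * M(a)

# Right-hand side of the normal-case formula term by term.
s1 = Sigma[0]                                        # (sigma_11, ..., sigma_1n)
rhs = (s1 @ a) ** 2 * M(a) + 2 * mu[0] * (s1 @ a) * M(a) \
      + Sigma[0, 0] * M(a) + mu[0] ** 2 * M(a)
assert np.isclose(lhs, rhs)
```

The agreement is exact here because for the exponential $f$ the double sum collapses to $(\sum_i \sigma_{1i} a_i)^2 M(a)$.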

In general, for any integer $p_1 \geq 2$, we have the following recursive relation:

$$\begin{aligned} E[X_1^{p_1} f(X)] &= \sum_{i=1}^n \sum_{j=1}^n \sigma_{1i} \sigma_{1j} E[\nabla_{i,j} (X_1^{p_1-2} f(X))] \\ &\quad + 2\mu_1 \sum_{i=1}^n \sigma_{1i} E[\nabla_i (X_1^{p_1-2} f(X))] + \sigma_{11} E[X_1^{p_1-2} f(X)] + \mu_1^2 E[X_1^{p_1-2} f(X)] \\ &= \sum_{i=1}^n \sum_{j=1}^n \sigma_{1i} \sigma_{1j} E[X_1^{p_1-2} \nabla_{i,j} f(X)] + \sigma_{11}^2 (p_1-2)(p_1-3)\, E[X_1^{p_1-4} \cdots \end{aligned}$$