1 Introduction and motivation
Stein’s (1981) lemma concerns the evaluation of Cov$(h(X), Y)$ for a bivariate normal random vector $(X, Y)^\top$, showing that Cov$(h(X), Y) = \mathrm{Cov}(X, Y)\, E[h'(X)]$, where $h$ is any differentiable function with $E|h'(X)| < \infty$. Inspired by the original work of Stein, several extensions and generalizations have appeared in the literature. Liu (1994) generalized the lemma to the multivariate normal case, while Landsman (2006) showed that a Stein-type lemma also holds when $(X, Y)^\top$ follows a bivariate elliptical distribution. This result was extended to multivariate elliptical vectors by Landsman and Neslehova (2008); see also Landsman et al. (2013) for a simple proof. Recently, Shushi (2018) derived the multivariate Stein’s lemma for truncated elliptical random vectors.
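As a quick numerical illustration of the bivariate lemma, the identity Cov$(h(X), Y) = \mathrm{Cov}(X, Y)\, E[h'(X)]$ can be checked by Monte Carlo simulation. The choice $h(x) = x^3$ and the parameter values below are illustrative only, not taken from the text:

```python
import numpy as np

# Monte Carlo check of Stein's lemma for a bivariate normal pair (X, Y):
#     Cov(h(X), Y) = Cov(X, Y) * E[h'(X)].
rng = np.random.default_rng(0)

mu = np.array([1.0, -0.5])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.5]])

n = 2_000_000
X, Y = rng.multivariate_normal(mu, Sigma, size=n).T

h = lambda x: x**3            # any differentiable h with E|h'(X)| finite
h_prime = lambda x: 3 * x**2

lhs = np.cov(h(X), Y)[0, 1]            # Cov(h(X), Y), estimated by simulation
rhs = Sigma[0, 1] * h_prime(X).mean()  # Cov(X, Y) * E[h'(X)]
print(lhs, rhs)                        # the two estimates agree up to MC error
```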
In this work, we derive expressions for the joint moments of elliptical distributions for any measurable function $h$ satisfying some regularity conditions. In particular, we obtain new formulae for the expectation of products of normally distributed random variables, and also present explicit expressions for the multivariate Student-$t$, logistic and Laplace distributions.
The rest of the paper is organized as follows. Section 2 reviews some definitions and properties of the family of elliptical distributions. Section 3 presents explicit expressions for the joint moments of elliptical distributions by a direct method. Section 4 derives these joint moments by the use of Stein’s lemma. Section 5 establishes the equivalence of the two expressions under the condition that the scale matrix is positive definite. Section 6 presents an expression for the expectation of products of correlated normal variables. Section 7 presents simplified results for the special cases of the multivariate Student-$t$, logistic and Laplace distributions as illustrative examples of the general results established here. Finally, Section 8 gives some concluding remarks.
2 Family of elliptical distributions
Elliptical distributions are generalizations of the multivariate normal distribution and possess many tractable properties, in addition to allowing fat tails with suitably chosen kernels. This class of distributions was first introduced by Kelker (1970), and has been discussed in detail by Fang et al. (1990) and Kotz et al. (2000). A random vector
$X = (X_1, \dots, X_n)^\top$ is said to have an elliptically symmetric distribution if its characteristic function has the form
$$ \varphi_X(t) = e^{i t^\top \mu}\, \psi\Big(\tfrac{1}{2} t^\top \Sigma t\Big) $$
for all $t \in \mathbb{R}^n$, denoted $X \sim E_n(\mu, \Sigma, \psi)$, where $\psi$ is called the characteristic generator with $\psi(0) = 1$, $\mu$ ($n$-dimensional vector) is the location parameter, and $\Sigma$ ($n \times n$ matrix with $\Sigma \ge 0$) is the dispersion matrix (or scale matrix). The mean vector $E[X]$ (if it exists) coincides with the location vector $\mu$, and the covariance matrix Cov$(X)$ (if it exists) is $-2\psi'(0)\Sigma$. The characteristic generator of the multivariate normal distribution, for example, is given by $\psi(u) = e^{-u}$.
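The normal case of the characteristic-function form above can be checked numerically: for $X \sim N_n(\mu, \Sigma)$, the empirical characteristic function should match $e^{i t^\top \mu} \psi(\tfrac{1}{2} t^\top \Sigma t)$ with $\psi(u) = e^{-u}$. The parameter values below are illustrative:

```python
import numpy as np

# Monte Carlo check that the normal characteristic function has the
# elliptical form exp(i t.mu) * psi(t' Sigma t / 2) with psi(u) = exp(-u).
rng = np.random.default_rng(2)

mu = np.array([0.5, -1.0])
Sigma = np.array([[1.0, 0.4],
                  [0.4, 2.0]])
t = np.array([0.3, -0.7])

X = rng.multivariate_normal(mu, Sigma, size=1_000_000)
cf_mc = np.exp(1j * X @ t).mean()             # empirical characteristic function

psi = lambda u: np.exp(-u)                    # normal characteristic generator
cf_exact = np.exp(1j * t @ mu) * psi(0.5 * t @ Sigma @ t)
print(cf_mc, cf_exact)                        # agree up to MC error
```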
In general, the elliptical vector $X$ may not have a density function. However, if the density exists, then it is of the form
$$ f_X(x) = \frac{c_n}{\sqrt{|\Sigma|}}\, g_n\Big(\tfrac{1}{2}(x - \mu)^\top \Sigma^{-1} (x - \mu)\Big), \qquad x \in \mathbb{R}^n, $$
where $\mu$ is an $n \times 1$ location vector, $\Sigma$ is an $n \times n$ positive definite scale matrix, and $g_n(u)$, $u \ge 0$, is the density generator of $X$. This density generator satisfies the condition
$$ \int_0^\infty u^{n/2 - 1} g_n(u)\, du < \infty, $$
and the normalizing constant $c_n$ is given by
$$ c_n = \frac{\Gamma(n/2)}{(2\pi)^{n/2}} \left[ \int_0^\infty u^{n/2 - 1} g_n(u)\, du \right]^{-1}. $$
Two important special cases are the multivariate normal family with $g_n(u) = e^{-u}$, and the multivariate generalized Student-$t$ family with $g_n(u) = (1 + u/k_p)^{-p}$, where the parameter $p > n/2$ and $k_p$ is some constant that may depend on $n$ and $p$.
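Assuming the normalizing constant takes the standard form $c_n = \Gamma(n/2)\,(2\pi)^{-n/2} \big[\int_0^\infty u^{n/2-1} g_n(u)\, du\big]^{-1}$ (the Landsman–Valdez convention), one can verify numerically that the normal generator $g_n(u) = e^{-u}$ recovers the familiar constant $(2\pi)^{-n/2}$:

```python
import math
from scipy.integrate import quad

def c_n(g, n):
    # Normalizing constant of an elliptical density (assumed convention):
    # c_n = Gamma(n/2) / (2*pi)^(n/2) * [ int_0^inf u^(n/2 - 1) g(u) du ]^(-1)
    integral, _ = quad(lambda u: u**(n / 2 - 1) * g(u), 0, math.inf)
    return math.gamma(n / 2) / ((2 * math.pi)**(n / 2) * integral)

# For g_n(u) = exp(-u) the integral equals Gamma(n/2), so c_n reduces to
# (2*pi)^(-n/2), the usual multivariate normal constant.
n = 3
print(c_n(lambda u: math.exp(-u), n))   # ~ (2*pi)**(-1.5)
```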
To derive the mixed moments of elliptical distributions, we use the cumulative generators $\overline{G}_n$ and $\overline{\overline{G}}_n$, which are given by
$$ \overline{G}_n(u) = \int_u^\infty g_n(v)\, dv \quad \text{and} \quad \overline{\overline{G}}_n(u) = \int_u^\infty \overline{G}_n(v)\, dv, $$
respectively (see Landsman et al. (2018)), and the corresponding normalizing constants $c_n^*$ and $c_n^{**}$ are obtained by replacing $g_n$ in the expression of $c_n$ with $\overline{G}_n$ and $\overline{\overline{G}}_n$, respectively.
Throughout this paper, $x = (x_1, \dots, x_n)^\top$ will denote an $n$-dimensional column vector, with $x^\top$ denoting its transpose. For an $n \times n$ matrix $A$, $|A|$ is the determinant of $A$. If $A$ is positive definite, then its Cholesky decomposition is known to be unique.
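The Cholesky factorization used throughout the derivations below is available directly in standard linear algebra libraries; a minimal sketch with an illustrative positive definite matrix:

```python
import numpy as np

# Cholesky factorization Sigma = B B^T with B lower triangular;
# the matrix below is an illustrative positive definite example.
Sigma = np.array([[4.0, 2.0, 0.6],
                  [2.0, 5.0, 1.0],
                  [0.6, 1.0, 3.0]])

B = np.linalg.cholesky(Sigma)        # unique lower triangular factor
assert np.allclose(B @ B.T, Sigma)   # reconstructs Sigma
assert np.allclose(B, np.tril(B))    # B is lower triangular
```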
3 Direct method of derivation
Consider a random vector $X \sim E_n(\mu, \Sigma, g_n)$ with mean vector $\mu$ and positive definite scale matrix $\Sigma$. Partition $X$ into two parts as $X = (X_1^\top, X_2^\top)^\top$, with $m$ and $n - m$ components, respectively. By the Cholesky decomposition (see Golub and Van Loan (2012), for example), there exists a unique lower triangular matrix $B$ such that $\Sigma = BB^\top$. In terms of components, we have
Let $h$ be a twice continuously differentiable function, and we shall use $H(x)$ to denote the Hessian matrix of $h$. In addition, we denote
Let $X^*$ and $X^{**}$ be two elliptical random vectors with generators $\overline{G}_n$ and $\overline{\overline{G}}_n$, respectively.
The following theorem gives an expression for joint moments of elliptical distributions.
Theorem 1. Let $X \sim E_n(\mu, \Sigma, g_n)$ be an $n$-dimensional elliptical random vector with density generator $g_n$, positive definite scale matrix $\Sigma$, and finite expectation $\mu$. Further, let $h$ be a twice continuously differentiable function satisfying suitable integrability conditions. Let, in addition,
where and .
Proof. By definition,
Now, setting , we obtain
We then obtain
Kan (2008) and Song and Lee (2015) have presented explicit formulae for product moments of multivariate Gaussian random variables. The following corollary presents an explicit expression for product moments of multivariate elliptical random variables, in general.
Corollary 1. Suppose $X \sim E_n(\mu, \Sigma, g_n)$ and $h(x) = \prod_{i=1}^n x_i^{s_i}$. Let $s_1, \dots, s_n$ be nonnegative integers with $s_1 + \cdots + s_n = s$. Then, we have
From Corollary 1, we readily deduce the following relations for the special case of , for example:
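In the normal special case, such product-moment relations reduce to Isserlis' (1918) theorem: for a zero-mean normal vector, for instance, $E[X_1 X_2 X_3 X_4] = \sigma_{12}\sigma_{34} + \sigma_{13}\sigma_{24} + \sigma_{14}\sigma_{23}$. A Monte Carlo check with an illustrative covariance matrix:

```python
import numpy as np

# Monte Carlo check of Isserlis' theorem for a fourth-order product moment
# of a zero-mean multivariate normal vector.
rng = np.random.default_rng(1)

Sigma = np.array([[1.0, 0.3, 0.2, 0.1],
                  [0.3, 1.0, 0.4, 0.2],
                  [0.2, 0.4, 1.0, 0.3],
                  [0.1, 0.2, 0.3, 1.0]])

X = rng.multivariate_normal(np.zeros(4), Sigma, size=4_000_000)
mc = (X[:, 0] * X[:, 1] * X[:, 2] * X[:, 3]).mean()

s = Sigma
exact = s[0, 1] * s[2, 3] + s[0, 2] * s[1, 3] + s[0, 3] * s[1, 2]
print(mc, exact)   # agree up to MC error
```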
4 Derivation by the use of Stein’s lemma
We now derive an expression for the joint moments by using Stein’s lemma. It should be noted that the positive definiteness of $\Sigma$ is not necessary in this case.
Theorem 2. Suppose $X \sim E_n(\mu, \Sigma, g_n)$, and all the conditions of Theorem 1 hold. Then,
Proof. By Lemma 2 of Landsman et al. (2013), we have
Upon replacing by in , we obtain
5 Equivalence of the two expressions
We shall now establish the equivalence of the two expressions in (1) and (2) under the condition that $\Sigma$ is positive definite. For this purpose, we will use the following lemma; see, for example, Fang et al. (1990).
Lemma 1. For any non-negative measurable function $f$, we have
Proposition 1. Under the conditions in Theorem 1, we have
where $\psi^*$ and $\psi^{**}$ are the characteristic generators corresponding to the density generators $\overline{G}_n$ and $\overline{\overline{G}}_n$, respectively.
Proof. Let . Then, implies and . It then follows that
by Lemma 1. Thus,
Now, by using Eq.(9) of Landsman et al. (2013), we have
For , let . Then, and . By using the same arguments as above, we have
From Proposition 1, we find that, when $\Sigma$ is positive definite, the expressions in (1) and (2) are indeed equivalent. Moreover, for a positive semidefinite matrix $\Sigma$, we can rewrite (2) in terms of characteristic generators as follows:
where $\psi^*$ and $\psi^{**}$ are the characteristic generators corresponding to the density generators $\overline{G}_n$ and $\overline{\overline{G}}_n$, respectively.
6 Product moments of correlated normal random variables
We now derive expressions for the joint moments of the multivariate normal distribution and for moments of products of correlated normal random variables.
Suppose $X \sim N_n(\mu, \Sigma)$ with density generator $g_n(u) = e^{-u}$ and characteristic generator $\psi(u) = e^{-u}$. Note in this case that $\overline{G}_n(u) = e^{-u}$ and $\overline{\overline{G}}_n(u) = e^{-u}$. Then, we have
In general, for any nonnegative integers $s_1, \dots, s_n$, we have the following recursive relation:
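A recursion of this kind follows from applying Stein’s lemma with $h(x) = \prod_j x_j^{s_j}$, which gives $E[X_i f(X)] = \mu_i E[f(X)] + \sum_j \sigma_{ij} E[\partial f/\partial x_j]$ (cf. Kan (2008)). The sketch below implements one standard form of this recursion; the helper name `normal_moment` is hypothetical, and the result is checked against two classical values:

```python
def normal_moment(mu, Sigma, s):
    """E[prod_i X_i^{s_i}] for X ~ N(mu, Sigma), via the Stein-type recursion
    E[X_i f(X)] = mu_i E[f(X)] + sum_j Sigma[i][j] E[df/dx_j]."""
    s = list(s)
    if all(v == 0 for v in s):
        return 1.0                                 # base case: E[1] = 1
    i = next(k for k, v in enumerate(s) if v > 0)  # peel off one factor of X_i
    s1 = s.copy()
    s1[i] -= 1
    total = mu[i] * normal_moment(mu, Sigma, s1)
    for j in range(len(s)):
        if s1[j] > 0:                              # differentiate the x_j factor
            s2 = s1.copy()
            s2[j] -= 1
            total += Sigma[i][j] * s1[j] * normal_moment(mu, Sigma, s2)
    return total

# Classical checks: E[X^4] = 3 for X ~ N(0, 1), and
# E[X^2 Y^2] = 1 + 2*rho^2 for a standardized pair with correlation rho.
print(normal_moment([0.0], [[1.0]], [4]))                            # -> 3.0
rho = 0.5
print(normal_moment([0.0, 0.0], [[1.0, rho], [rho, 1.0]], [2, 2]))   # -> 1.5
```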