# Inverse Gaussian quadrature and finite normal-mixture approximation of generalized hyperbolic distribution

In this study, a numerical quadrature for the generalized inverse Gaussian distribution is derived from the Gauss–Hermite quadrature by exploiting its relationship with the normal distribution. Unlike Gaussian quadrature, the proposed quadrature exactly evaluates both positive and negative moments, thereby improving evaluation accuracy. The generalized hyperbolic distribution is efficiently approximated as a finite normal variance-mean mixture with the quadrature. Therefore, expectations under the distribution, such as the cumulative distribution function and the option price, are accurately computed as weighted sums of those under normal distributions.


## 1. Introduction

The inverse Gaussian distribution (Folks and Chhikara, 1978), $\textsc{ig}(\gamma,\delta)$, has the density function

$$f_{\textsc{ig}}(x\,|\,\gamma,\delta)=\frac{\delta}{\sqrt{2\pi x^3}}\exp\!\left(-\frac{(\gamma x-\delta)^2}{2x}\right)\qquad\text{for }\gamma\ge 0,\ \delta>0.$$

The first passage time of a drifted Brownian motion, $\gamma t+B_t$, to a level, $\delta$, is distributed by $\textsc{ig}(\gamma,\delta)$. The term, inverse, refers to the time of the Brownian motion at a fixed location, whereas the Gaussian distribution refers to the location at a fixed time. It is further generalized to the generalized inverse Gaussian distribution, $\textsc{gig}(\gamma,\delta,p)$, with density

$$f_{\textsc{gig}}(x\,|\,\gamma,\delta,p)=\frac{(\gamma/\delta)^p\,x^{p-1}}{2K_p(\gamma\delta)}\exp\!\left(-\frac{\gamma^2x^2+\delta^2}{2x}\right),$$

where $K_p$ is the modified Bessel function of the second kind with index $p$. With $p=-\tfrac12$, it can be shown that $\textsc{gig}(\gamma,\delta,-\tfrac12)=\textsc{ig}(\gamma,\delta)$. The generalized inverse Gaussian distribution has the scaling property: if $X\sim\textsc{gig}(\gamma,\delta,p)$ and $c>0$, then $cX\sim\textsc{gig}(\gamma/\sqrt c,\,\sqrt c\,\delta,\,p)$. Therefore, any statement for $\textsc{gig}(\sigma,\sigma,p)$ can easily be generalized to $\textsc{gig}(\gamma,\delta,p)$.

When $X\sim\textsc{gig}(\gamma,\delta,p)$ is used as the mixing distribution of the normal variance-mean mixture,

$$Y=\mu+\beta X+\sqrt X\,Z\quad\text{for a standard normal variate }Z,\tag{1}$$

the generalized hyperbolic distribution, $\textsc{gh}(\mu,\beta,\gamma,\delta,p)$, is obtained with density

$$f_{\textsc{gh}}(y\,|\,\mu,\beta,\gamma,\delta,p)=\frac{\sqrt{\alpha}\,(\gamma/\alpha\delta)^p}{\sqrt{2\pi}\,K_p(\delta\gamma)}\,e^{\beta(y-\mu)}\,\frac{K_{p-1/2}\!\big(\alpha\sqrt{\delta^2+(y-\mu)^2}\big)}{\big(\delta^2+(y-\mu)^2\big)^{(1-2p)/4}},\qquad\alpha=\sqrt{\beta^2+\gamma^2}.$$

As the name suggests, it generalizes the hyperbolic distribution, the $p=1$ case, which was originally studied for the distribution of sand particle sizes (Barndorff-Nielsen, 1977). Later, the generalized hyperbolic distribution was applied to finance (Prause, 1999; Eberlein and Prause, 2002). In particular, the normal inverse Gaussian distribution, the $p=-\tfrac12$ case, has drawn attention as the most useful case of the distribution owing to its better probabilistic properties (Barndorff-Nielsen, 1997a, b) and superior fit to empirical stock return distributions (Prause, 1999; Kalemanova et al., 2007).

Despite its importance, the evaluation of the generalized hyperbolic distribution is not trivial. First, the cumulative distribution function has no closed-form expression, and one has to resort to ad-hoc numerical integration of the density function (Scott, 2018). Second, despite the success in financial applications, an efficient option pricing method has not been reported in previous studies (Rejman et al., 1997; Ivanov, 2013).

This study proposes a novel and efficient method to approximate the generalized hyperbolic distribution as a finite normal variance-mean mixture. Any expectation under the generalized hyperbolic distribution is thereby reduced to a weighted sum of expectations under normal distributions, for which analytic and numerical procedures are often available. The finite mixture is obtained by constructing a new numerical quadrature for the generalized inverse Gaussian distribution, the mixing distribution, by exploiting its relationship with the normal distribution. Unlike the Gauss–Hermite quadrature for the normal distribution, the proposed quadrature is capable of exactly evaluating both positive and negative moments. The accuracy and efficiency of this method are illustrated with numerical examples.

## 2. Numerical quadrature for mixing distribution

The Gaussian quadrature with respect to the weight function $w(x)$ and the interval $(a,b)$ is the set of abscissas, $\{x_k\}$, and weights, $\{w_k\}$, for $k=1,\dots,n$, that best approximate the integral of a given function $g(x)$ as

$$\int_a^b g(x)\,w(x)\,dx\approx\sum_{k=1}^n g(x_k)\,w_k.$$

The points and weights are optimal in that they exactly evaluate the integral whenever $g(x)$ is a polynomial of degree up to $2n-1$, including the desired condition, $\sum_{k=1}^n w_k=\int_a^b w(x)\,dx$, for $g(x)=1$. It is known that $\{x_k\}$ are the roots of the $n$th order orthogonal polynomial, $p_n(x)$, with respect to $w(x)$ and $(a,b)$, and $\{w_k\}$ are given as the integral of the Lagrange interpolation polynomial:

$$w_k=\frac{1}{p_n'(x_k)}\int_a^b\frac{p_n(x)}{x-x_k}\,w(x)\,dx.$$

Gaussian quadratures have been found for several well-known probability densities: the Gauss–Legendre quadrature for the uniform distribution, Gauss–Jacobi for the beta, and Gauss–Laguerre for the exponential. In particular, this study depends heavily on the Gauss–Hermite quadrature for the normal distribution. In the rest of the paper, the Gauss–Hermite quadrature is always defined with respect to the standard normal density, $n(z)=e^{-z^2/2}/\sqrt{2\pi}$, rather than the weight $e^{-z^2}$. Therefore, the orthogonal polynomials are the probabilists' Hermite polynomials, denoted by $He_n(z)$, not the physicists' Hermite polynomials, denoted by $H_n(z)$.
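For concreteness, the probabilists' convention can be obtained from standard library routines by rescaling. The sketch below (a minimal Python illustration; the function name is ours) uses NumPy's `hermegauss`, which works with the weight $e^{-z^2/2}$ but leaves the weights unnormalized:

```python
import numpy as np

def gauss_hermite_normal(n):
    """Gauss-Hermite rule (z_k, h_k) with respect to the standard
    normal density n(z) = exp(-z^2/2)/sqrt(2*pi).

    numpy's hermegauss uses the probabilists' weight exp(-z^2/2);
    dividing its weights by sqrt(2*pi) makes them sum to 1.
    """
    z, h = np.polynomial.hermite_e.hermegauss(n)
    return z, h / np.sqrt(2 * np.pi)

z, h = gauss_hermite_normal(7)
print(h.sum())           # zeroth moment of n(z): 1
print((h * z**2).sum())  # variance of the standard normal: 1
```

Both printed moments are exact because the rule integrates polynomials of degree up to $2n-1=13$.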

If an accurate quadrature, $\{x_k\}$ and $\{w_k\}$, is known for the mixing distribution $X$ of the normal variance-mean mixture in Eq. (1), any expectation regarding $Y$ can be efficiently computed because $Y$ can be approximated as a finite mixture of normal distributions with mean $\mu+\beta x_k$ and variance $x_k$. For example, the cumulative distribution function of $Y$ can be approximated as the weighted sum of those of the normal distribution,

$$F_{\textsc{gh}}(y)=\operatorname{pr}(Y\le y)\approx\sum_{k=1}^n w_k\,N\!\left(\frac{y-\mu-\beta x_k}{\sqrt{x_k}}\right),\tag{2}$$

where $N$ is the cumulative distribution function of the standard normal. This approximation is particularly well suited as a cumulative distribution function because the value monotonically increases from 0 to 1, since $w_k>0$ and $\sum_{k=1}^n w_k=1$. If a stock price follows the log-generalized hyperbolic distribution, the call option price can be approximated as a weighted sum of the Black–Scholes formulas with varying spot prices and volatilities,

$$C_{\textsc{gh}}(K)=E\big(\max(e^Y-K,0)\big)\approx\sum_{k=1}^n w_k\Big(F_k\,N(d_k+\sqrt{x_k})-K\,N(d_k)\Big),\quad\text{where }F_k=e^{\mu+(\beta+\frac12)x_k},\ d_k=\frac{\log(F_k/K)}{\sqrt{x_k}}-\frac{\sqrt{x_k}}{2},\tag{3}$$

which is computationally efficient.
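Equations (2) and (3) translate directly into code. The sketch below (Python; the helper names are ours) takes a quadrature $(x_k,w_k)$ for the mixing distribution as given, which Section 3 constructs:

```python
import math

def norm_cdf(t):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def gh_cdf(y, mu, beta, xs, ws):
    """Eq. (2): CDF of the normal variance-mean mixture as a weighted
    sum of normal CDFs with mean mu + beta*x_k and variance x_k."""
    return sum(w * norm_cdf((y - mu - beta * x) / math.sqrt(x))
               for x, w in zip(xs, ws))

def gh_call(K, mu, beta, xs, ws):
    """Eq. (3): call price as a weighted sum of Black-Scholes formulas."""
    total = 0.0
    for x, w in zip(xs, ws):
        Fk = math.exp(mu + (beta + 0.5) * x)
        dk = math.log(Fk / K) / math.sqrt(x) - math.sqrt(x) / 2.0
        total += w * (Fk * norm_cdf(dk + math.sqrt(x)) - K * norm_cdf(dk))
    return total

# Degenerate one-point mixture: Y ~ N(0, 1), so the CDF at 0 is 0.5
print(gh_cdf(0.0, 0.0, 0.0, [1.0], [1.0]))  # 0.5
```

With a single mixing point the price reduces to a single Black–Scholes formula, which provides a convenient sanity check.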

## 3. Quadrature for (Generalized) Inverse Gaussian

With the change of variable, $z=\sigma(\sqrt x-1/\sqrt x)$, the exponential term of $f_{\textsc{ig}}(x\,|\,\sigma,\sigma)$ becomes the standard normal density in $z$. This mapping plays an important role in understanding this study as well as the previously known properties of the inverse Gaussian distribution. We define the mapping properly and derive a useful lemma.

###### Definition 1.

Let $\phi_\sigma$ be a monotonically increasing one-to-one mapping from $(0,\infty)$ to $(-\infty,\infty)$, and $\phi_\sigma^{-1}$ be the inverse mapping, respectively, defined as

$$z=\phi_\sigma(x)=\sigma\!\left(\sqrt x-\frac{1}{\sqrt x}\right)\quad\text{and}\quad x=\phi_\sigma^{-1}(z)=1+\frac{z^2}{2\sigma^2}+\frac{z}{\sigma}\sqrt{1+\frac{z^2}{4\sigma^2}}.$$
###### Lemma 1.

The mapping, $z=\phi_\sigma(x)$, relates the density, $f_{\textsc{ig}}(x\,|\,\sigma,\sigma)$, and the standard normal density, $n(z)$, as follows:

$$f_{\textsc{ig}}(x\,|\,\sigma,\sigma)\,\frac{1+x}{2}\,dx=n(z)\,dz.\tag{4}$$
###### Proof.

The proof is trivial from the differentiation, $\phi_\sigma'(x)=\sigma(1+x)/(2x\sqrt x)$. ∎
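Lemma 1 can be checked numerically: under $z=\phi_\sigma(x)$, the quantity $f_{\textsc{ig}}(x\,|\,\sigma,\sigma)\,\frac{1+x}{2}\,\frac{dx}{dz}$ should coincide with $n(z)$ at every $x$. A small sketch (function names are ours):

```python
import math

def f_ig(x, sigma):
    """Inverse Gaussian density f_IG(x | sigma, sigma)."""
    return sigma / math.sqrt(2 * math.pi * x**3) * \
        math.exp(-(sigma * x - sigma)**2 / (2 * x))

def phi(x, sigma):    # z = phi_sigma(x)
    return sigma * (math.sqrt(x) - 1 / math.sqrt(x))

def dphi(x, sigma):   # phi_sigma'(x) = sigma*(1+x)/(2*x*sqrt(x))
    return sigma * (1 + x) / (2 * x * math.sqrt(x))

def n_pdf(z):
    """Standard normal density."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

sigma = 0.8
for x in (0.3, 1.0, 2.5):
    lhs = f_ig(x, sigma) * (1 + x) / 2 / dphi(x, sigma)  # change of variable
    rhs = n_pdf(phi(x, sigma))
    print(abs(lhs - rhs))  # ~0 (machine precision)
```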

With this lemma, two important results about the inverse Gaussian distribution can be proved. For $z\ge 0$, let $x_+=\phi_\sigma^{-1}(z)$ and $x_-=\phi_\sigma^{-1}(-z)$, so that $x_+x_-=1$ and $x_-\le 1\le x_+$. For a standard normal variable, $Z$, and $X\sim\textsc{ig}(\sigma,\sigma)$, Lemma 1 implies

$$\operatorname{pr}(X\in dx_+)+\operatorname{pr}(X\in dx_-)=\frac{2\operatorname{pr}(Z\in dz)}{1+x_+}+\frac{2\operatorname{pr}(Z\in dz)}{1+x_-}=2\operatorname{pr}(Z\in dz).\tag{5}$$

It follows that

$$\operatorname{pr}\big(\phi_\sigma(X)^2\le z^2\big)=\operatorname{pr}(Z^2\le z^2).$$

Thus, $\phi_\sigma(X)^2$ is distributed as $\chi^2$ with one degree of freedom (Shuster, 1968). Equation (5) also implies that choosing between the two values, $x_\pm$, with probabilities, $1/(1+x_\pm)$, respectively, is an exact sampling method of $\textsc{ig}(\sigma,\sigma)$ (Michael et al., 1976). The lemma also provides an intuition for constructing the numerical quadrature for the inverse Gaussian distribution.
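The exact sampling method reads, in code (a sketch; the function names and the Monte Carlo check against the known moments $E(X)=1$ and $E(1/X)=1+1/\sigma^2$ are ours):

```python
import math
import random

def phi_inv(z, sigma):
    """x = phi_sigma^{-1}(z)."""
    return 1 + z * z / (2 * sigma**2) + (z / sigma) * math.sqrt(1 + z * z / (4 * sigma**2))

def sample_ig(sigma, rng=random):
    """Draw X ~ IG(sigma, sigma) by the two-root method of Michael
    et al. (1976): choose between x_+ and x_- = 1/x_+ with
    probabilities 1/(1+x_+) and 1/(1+x_-), respectively."""
    z = abs(rng.gauss(0.0, 1.0))
    x_plus = phi_inv(z, sigma)       # root >= 1; the other root is 1/x_plus
    if rng.random() < 1.0 / (1.0 + x_plus):
        return x_plus
    return 1.0 / x_plus

random.seed(7)
sigma = 1.5
xs = [sample_ig(sigma) for _ in range(200_000)]
mean = sum(xs) / len(xs)                    # E(X) = 1 for IG(sigma, sigma)
inv_mean = sum(1 / x for x in xs) / len(xs) # E(1/X) = 1 + 1/sigma^2
print(mean, inv_mean)
```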

###### Theorem 1 (Inverse Gaussian Quadrature).

Let $\{z_k\}$ and $\{h_k\}$ be the points and the weights, respectively, of the Gauss–Hermite quadrature from the $n$th order Hermite polynomial $He_n(z)$. Then, the points $\{x_k\}$ and the weights $\{w_k\}$, defined by

$$x_k=\phi_\sigma^{-1}(z_k)\quad\text{and}\quad w_k=\frac{2h_k}{1+x_k},$$

serve as a numerical quadrature with respect to $f_{\textsc{ig}}(x\,|\,\sigma,\sigma)$ over the domain $(0,\infty)$. The corresponding orthogonal functions are $G_n(x)=He_n(\phi_\sigma(x))$. The quadrature exactly evaluates the $r$th moments for integers $-n+1\le r\le n$.

###### Proof.

We prove the theorem in two steps. In the first step, we show that $\{x_k\}$ and $\{h_k\}$ serve as a quadrature with respect to $f_{\textsc{ig}}(x\,|\,\sigma,\sigma)\,(1+x)/2$, which is a straightforward result from Lemma 1. First, the functions $G_n(x)$ are orthogonal because

$$\int_0^\infty G_n(x)\,G_{n'}(x)\,f_{\textsc{ig}}(x\,|\,\sigma,\sigma)\,\frac{1+x}{2}\,dx=\int_{-\infty}^\infty He_n(z)\,He_{n'}(z)\,n(z)\,dz=0\quad\text{if }n\neq n'.$$

Second, $\{x_k\}$ are the roots of $G_n(x)$ as $G_n(x_k)=He_n(z_k)=0$. Finally, the weight is invariant under the mapping $z=\phi_\sigma(x)$:

$$\frac{\phi_\sigma'(x_k)}{G_n'(x_k)}\int_0^\infty\frac{G_n(x)}{\phi_\sigma(x)-z_k}\,f_{\textsc{ig}}(x\,|\,\sigma,\sigma)\,\frac{1+x}{2}\,dx=\frac{1}{He_n'(z_k)}\int_{-\infty}^\infty\frac{He_n(z)}{z-z_k}\,n(z)\,dz=h_k.$$

The quadrature for $f_{\textsc{ig}}(x\,|\,\sigma,\sigma)$ itself follows by absorbing the factor $(1+x)/2$ into the weights, $w_k=2h_k/(1+x_k)$.

In the second step, we prove the statement about the moments. The symmetry of Lemma 1 under $z\to -z$, equivalently $x\to 1/x$, yields $E(X^r)=E(X^{1-r})$ for any $r$. Therefore, using the previous step, the $r$th moment for integer $r$ is cast into an integration with the Gauss–Hermite quadrature:

$$E(X^r)=\frac12E\big(X^r+X^{1-r}\big)=E\!\left(\frac{1+X}{2}\,\theta_r(X)\right)\approx\sum_{k=1}^n\frac{1+x_k}{2}\,\theta_r(x_k)\,w_k=\sum_{k=1}^n\theta_r\big(\phi_\sigma^{-1}(z_k)\big)\,h_k,$$

where $\theta_r(x)=(x^r+x^{1-r})/(1+x)$ for $r\ge 1$ and $\theta_r(x)=\theta_{1-r}(x)$ for $r\le 0$. It can be shown that $\theta_r(x)$ is an order $2(r-1)$ polynomial of $z$. Let $T_j$ be the $j$th order Chebyshev polynomial of the first kind, satisfying the property, $T_j(\cosh y)=\cosh(jy)$. With $x=e^y$ and $z=\phi_\sigma(x)=2\sigma\sinh(y/2)$,

$$x^j+x^{-j}=2\cosh(jy)=2T_j(\cosh y)=2T_j\!\left(\frac{z^2}{2\sigma^2}+1\right).$$

Therefore, $\theta_r(x)$ is a linear combination of $T_j\big(z^2/(2\sigma^2)+1\big)$ for $j=0,\dots,r-1$, thereby an order $2(r-1)$ polynomial of $z$. It follows that the quadrature integration of the $r$th moment is exact for $1\le r\le n$ as well as $-n+1\le r\le 0$ from the symmetry $\theta_r=\theta_{1-r}$. ∎

The following comments can be made on the new quadrature. First, the orthogonal functions, $G_n(x)=He_n(\phi_\sigma(x))$, are not polynomials of $x$; therefore, the quadrature is not a Gaussian quadrature. Given below are the first few orders of $G_n(x)$ for $\sigma=1$ obtained from $He_n(z)$:

$$\begin{aligned}
G_0(x)&=1, & He_0(z)&=1\\
G_1(x)&=\frac{x-1}{\sqrt x}, & He_1(z)&=z\\
G_2(x)&=\frac{x^2-3x+1}{x}, & He_2(z)&=z^2-1\\
G_3(x)&=\frac{(x-1)(x^2-5x+1)}{x\sqrt x}, & He_3(z)&=z^3-3z
\end{aligned}$$

Nevertheless, the quadrature can evaluate both positive and negative moments, which is useful for the applications in the following section. Second, we name the quadrature the inverse Gaussian quadrature after the distribution. Here, the term inverse additionally conveys the meaning that it is not a Gaussian quadrature and that it can accurately evaluate the inverse moments. Lastly, the construction of the quadrature can be intuitively understood as Michael et al. (1976)'s method applied to the discretized normal random variable, $\{z_k\}$ with probabilities $\{h_k\}$, instead of the true normal random variable. This study was originally motivated by this observation.
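Theorem 1 translates directly into code. The sketch below (using NumPy's probabilists' Hermite rule; the function name is ours) builds the quadrature and checks the exact moment identities $E(X^0)=1$, $E(X)=1$, and $E(X^{-1})=1+1/\sigma^2$ for $X\sim\textsc{ig}(\sigma,\sigma)$:

```python
import numpy as np

def ig_quadrature(n, sigma):
    """Inverse Gaussian quadrature (Theorem 1): map the Gauss-Hermite
    points through phi_sigma^{-1} and reweight by 2/(1+x_k)."""
    z, h = np.polynomial.hermite_e.hermegauss(n)
    h = h / np.sqrt(2 * np.pi)  # normalize to the standard normal density
    x = 1 + z**2 / (2 * sigma**2) + (z / sigma) * np.sqrt(1 + z**2 / (4 * sigma**2))
    w = 2 * h / (1 + x)
    return x, w

sigma, n = 1.2, 10
x, w = ig_quadrature(n, sigma)
print(w.sum())        # E(X^0) = 1, exact
print((w * x).sum())  # E(X)   = 1 for IG(sigma, sigma), exact
print((w / x).sum())  # E(1/X) = 1 + 1/sigma^2, exact
```

All three sums are exact up to floating-point error since $r=-1,0,1$ lie inside the range $-n+1\le r\le n$.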

From the observation of the density functions, $f_{\textsc{gig}}(x\,|\,\sigma,\sigma,p)$ and $f_{\textsc{ig}}(x\,|\,\sigma,\sigma)$, which are related by

$$f_{\textsc{gig}}(x\,|\,\sigma,\sigma,p)=c(\sigma,p)\,x^{p+\frac12}\,f_{\textsc{ig}}(x\,|\,\sigma,\sigma)\quad\text{where}\quad c(\sigma,p)=\frac{K_{-1/2}(\sigma^2)}{K_p(\sigma^2)},$$

we further generalize the quadrature to the generalized inverse Gaussian distribution.

###### Corollary 1 (Generalized Inverse Gaussian Quadrature).

Let $\{x_k\}$ and $\{w_k\}$ be the inverse Gaussian quadrature with respect to $f_{\textsc{ig}}(x\,|\,\sigma,\sigma)$. Then, $\{x_k\}$ and $\{\bar w_k\}$ defined by $\bar w_k=c(\sigma,p)\,x_k^{p+\frac12}\,w_k$ serve as a quadrature with respect to $f_{\textsc{gig}}(x\,|\,\sigma,\sigma,p)$. The quadrature exactly evaluates the $r$th moment whenever $r+p+\tfrac12$ is an integer with $-n+1\le r+p+\tfrac12\le n$.

###### Proof.

The modified weights are obtained from $E_{\textsc{gig}}\big(g(X)\big)=c(\sigma,p)\,E_{\textsc{ig}}\big(X^{p+\frac12}g(X)\big)\approx\sum_{k=1}^n g(x_k)\,c(\sigma,p)\,x_k^{p+\frac12}w_k$ for a function $g(x)$. The statement about the moments is also a direct consequence of the relation, $E_{\textsc{gig}}(X^r)=c(\sigma,p)\,E_{\textsc{ig}}\big(X^{r+p+\frac12}\big)$. ∎

Note that if $p+\tfrac12$ is not an integer, $\sum_{k}\bar w_k=1$ is not guaranteed; therefore, it is recommended to scale $\{\bar w_k\}$ by the factor of $1/\sum_k\bar w_k$ to ensure that the weights sum to 1. However, the numerical experiments in the next section show that the amount of adjustment is very small for reasonably large $n$.
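Corollary 1 amounts to reweighting the inverse Gaussian rule. The sketch below (names are ours) takes the Bessel ratio $c(\sigma,p)$ as an input, and checks the half-integer case $p=\tfrac12$, for which $c(\sigma,\tfrac12)=K_{-1/2}(\sigma^2)/K_{1/2}(\sigma^2)=1$ and the first moment $K_{3/2}(\sigma^2)/K_{1/2}(\sigma^2)=1+1/\sigma^2$ is available in closed form:

```python
import numpy as np

def ig_quadrature(n, sigma):
    """Inverse Gaussian quadrature (Theorem 1)."""
    z, h = np.polynomial.hermite_e.hermegauss(n)
    h = h / np.sqrt(2 * np.pi)
    x = 1 + z**2 / (2 * sigma**2) + (z / sigma) * np.sqrt(1 + z**2 / (4 * sigma**2))
    return x, 2 * h / (1 + x)

def gig_quadrature(n, sigma, p, c):
    """Corollary 1: reweight by c(sigma, p) * x_k^(p + 1/2), then
    rescale so the weights sum to 1 (the recommended adjustment).
    c = K_{-1/2}(sigma^2) / K_p(sigma^2) must be supplied (e.g.,
    via scipy.special.kv for general p)."""
    x, w = ig_quadrature(n, sigma)
    wbar = c * x**(p + 0.5) * w
    return x, wbar / wbar.sum()

sigma, n = 1.2, 10
# p = 1/2: c(sigma, 1/2) = 1 since K_{-nu} = K_nu
x, wbar = gig_quadrature(n, sigma, p=0.5, c=1.0)
print((wbar * x).sum())  # E(X) = K_{3/2}(sigma^2)/K_{1/2}(sigma^2) = 1 + 1/sigma^2
```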

## 4. Numerical examples

We test the proposed quadrature numerically. First, we evaluate the moments of the generalized inverse Gaussian distribution as a function of the order $r$. The moment of $\bar X\sim\textsc{gig}(\sigma,\sigma,p)$ has a closed-form expression,

$$E(\bar X^r)=\frac{K_{r+p}(\sigma^2)}{K_p(\sigma^2)},$$

against which the error of the quadrature evaluation can be measured. Figure 1 shows the $\log_{10}$ relative error of the quadrature evaluation of $E(\bar X^r)$. As Theorem 1 predicts, the quadrature exactly evaluates the moments for integer $r$ within the admissible range. The error for non-integer $r$ is also reasonably small. The relative error at $r=0$ can be interpreted as that of $\sum_k\bar w_k$; thus, the sum of the weights is similarly close to 1.

Next, we evaluate the cumulative distribution function of the generalized hyperbolic distribution using Eq. (2). We test the normal inverse Gaussian ($p=-\tfrac12$) and the hyperbolic ($p=1$) distributions. Figure 2 depicts the errors as functions of the quadrature size $n$. The exact values are obtained with the GeneralizedHyperbolic R package, which is based on adaptive quadrature integration (Scott, 2018). Although the expectation involves ill-behaved terms, the quadrature evaluation shows good accuracy; the error quickly converges to a small magnitude for both cases, although the convergence rate is slower than exponential. The good accuracy is attributed to the accurate handling of both positive and negative moments by the generalized inverse Gaussian quadrature. As previously confirmed, the correction on the weights has a negligible effect on the error.

## 5. Discussion

We believe that this study facilitates subsequent research in a few directions. First, with the efficient option pricing method in Eq. (3), it is of interest to examine how well the generalized hyperbolic distribution fits the return distribution implied from the observed volatility smile, which usually differs from the distribution of historical returns. Second, the expectation-maximization algorithm for the generalized hyperbolic distribution (Karlis, 2002) can be improved, as it can be performed in the context of the finite normal mixture with constraints. Third, the generalized inverse Gaussian quadrature, combined with spline interpolation, can be used as a method for generating random numbers for the generalized inverse Gaussian distribution, as an alternative to the method of Dagpunar (1989).

## References

• Barndorff-Nielsen (1977) Ole Barndorff-Nielsen. Exponentially decreasing distributions for the logarithm of particle size. Proc. R. Soc. Lond. A, 353(1674):401–419, 1977.
• Barndorff-Nielsen (1997a) Ole E Barndorff-Nielsen. Processes of normal inverse Gaussian type. Finance and Stochastics, 2(1):41–68, 1997a.
• Barndorff-Nielsen (1997b) Ole E Barndorff-Nielsen. Normal inverse Gaussian distributions and stochastic volatility modelling. Scandinavian Journal of Statistics, 24(1):1–13, 1997b.
• Dagpunar (1989) JS Dagpunar. An easily implemented generalised inverse Gaussian generator. Communications in Statistics-Simulation and Computation, 18(2):703–710, 1989.
• Eberlein and Prause (2002) Ernst Eberlein and Karsten Prause. The generalized hyperbolic model: financial derivatives and risk measures. In Mathematical Finance—Bachelier Congress 2000, pages 245–267. Springer, 2002.
• Folks and Chhikara (1978) J Leroy Folks and Raj S Chhikara. The inverse Gaussian distribution and its statistical application–a review. Journal of the Royal Statistical Society. Series B (methodological), pages 263–289, 1978.
• Ivanov (2013) Roman V Ivanov. Closed form pricing of European options for a family of normal-inverse Gaussian processes. Stochastic Models, 29(4):435–450, 2013.
• Kalemanova et al. (2007) Anna Kalemanova, Bernd Schmid, Ralf Werner, et al. The normal inverse Gaussian distribution for synthetic CDO pricing. Journal of Derivatives, 14(3):80, 2007.
• Karlis (2002) Dimitris Karlis. An EM type algorithm for maximum likelihood estimation of the normal-inverse Gaussian distribution. Statistics & Probability Letters, 57(1):43–52, 2002.
• Michael et al. (1976) John R Michael, William R Schucany, and Roy W Haas. Generating random variates using transformations with multiple roots. The American Statistician, 30(2):88–90, 1976.
• Prause (1999) Karsten Prause. The generalized hyperbolic model: Estimation, financial derivatives and risk measures. PhD thesis, University of Freiburg, 1999.
• Rejman et al. (1997) Aleksander Rejman, Weron Aleksander, and Rafal Weron. Option pricing proposals under the generalized hyperbolic model. Communications in Statistics. Stochastic Models, 13(4):867–885, 1997.
• Scott (2018) David Scott. GeneralizedHyperbolic: The Generalized Hyperbolic Distribution, 2018. R package version 0.8-4.
• Shuster (1968) Jonathan Shuster. On the inverse Gaussian distribution function. Journal of the American Statistical Association, 63(324):1514–1516, 1968.