An efficient algorithm for sampling from sin^k(x) for generating random correlation matrices

09/14/2018 · by Enes Makalic, et al.

In this note, we develop a novel algorithm for generating random numbers from a distribution with probability density function proportional to sin^k(x), x ∈ (0,π) and k ≥ 1. Our algorithm is highly efficient and is based on rejection sampling where the envelope distribution is an appropriately chosen beta distribution. An example application illustrating how the new algorithm can be used to generate random correlation matrices is discussed.


1 Introduction

Consider a positive random variable $X \in (0, \pi)$ with probability density function

$$p(x) = \frac{\sin^k(x)}{c_k}, \quad x \in (0, \pi), \quad k \ge 1, \qquad (1)$$

where the normalization constant is given by

$$c_k = \int_0^\pi \sin^k(t)\,dt = \frac{\sqrt{\pi}\,\Gamma\left(\frac{k+1}{2}\right)}{\Gamma\left(\frac{k}{2}+1\right)}.$$

This is a symmetric continuous probability density function for which the mean (mode and median) is $\pi/2$ for all $k \ge 1$. Generating random numbers from this distribution is not straightforward, as the corresponding cumulative distribution function, although available in closed form, is defined recursively and requires $O(k)$ operations to evaluate. The nature of the cumulative distribution function makes any procedure based on inverse transform sampling computationally inefficient, especially for large $k$. Furthermore, an alternative approach based on numerical integration is not recommended due to numerical issues with the probability density function when $k$ is large.
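The $O(k)$ cost can be made concrete with the standard reduction formula for powers of sine (a textbook identity, shown here for illustration): for $k \ge 2$,

$$\int_0^x \sin^k(t)\,dt = -\frac{\cos(x)\,\sin^{k-1}(x)}{k} + \frac{k-1}{k} \int_0^x \sin^{k-2}(t)\,dt,$$

so evaluating the cumulative distribution function at a single point unwinds roughly $k/2$ levels of this recursion.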

In this note, we develop a novel algorithm (see Section 2) for generating random numbers from the distribution defined in (1). Our algorithm is based on rejection sampling [1] (pp. 50–53), where the envelope distribution is a particular beta distribution. We compute the exact unconditional acceptance probability of the proposed algorithm and show that the maximum expected average number of iterations per sample is approximately $1.11$ for all $k \ge 1$, which highlights the efficiency of the proposed algorithm. Lastly, we use our novel rejection sampler to implement the algorithm for generating high dimensional correlation matrices based on the hyperspherical parameterization (see Section 3) proposed by Pourahmadi and Wang [2]. Our implementation of the correlation matrix sampling algorithm for the Matlab and R numerical computing platforms is available for download from Matlab Central (File ID: 68810) and CRAN (the randcorr package).

2 Rejection sampling algorithm

The proposed algorithm for sampling from density (1) utilises rejection sampling where the envelope distribution is an appropriately scaled symmetric beta distribution with probability density function

$$g(x) = \frac{x^k\,(\pi - x)^k}{\pi^{2k+1}\,B(k+1, k+1)}, \quad x \in (0, \pi), \qquad (2)$$

where $B(\cdot, \cdot)$ is the Euler beta function; that is, $X = \pi Y$ where $Y \sim \mathrm{Beta}(k+1, k+1)$. Theorem 1 proves that $g(x)$ is an envelope density for this problem and gives the expected average sampling efficiency for all $k \ge 1$. The proof of the theorem is given in Appendix A.

Theorem 1.

Let $p(x)$ denote the probability density function given in (1) and let $X$ denote a random variable with probability density function $g(x)$ given in (2). Then,

$$p(x) \le M\,g(x), \quad \text{for all } x \in (0, \pi), \qquad (3)$$

where $M$ is the upper bound of the likelihood ratio $p(x)/g(x)$ and is given by

$$M = \frac{4^k\,\pi\,B(k+1, k+1)}{c_k}. \qquad (4)$$
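Although not needed for the statement of the theorem, it is worth noting that the normalization constants cancel in the acceptance ratio used in Step 3 below (a one-line consequence of (1), (2) and (4)):

$$\frac{p(x)}{M\,g(x)} = \frac{\sin^k(x)}{c_k} \cdot \frac{c_k}{4^k\,\pi\,B(k+1, k+1)} \cdot \frac{\pi^{2k+1}\,B(k+1, k+1)}{x^k\,(\pi - x)^k} = \left(\frac{\pi^2\,\sin(x)}{4\,x\,(\pi - x)}\right)^k.$$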

Based on Theorem 1, our algorithm to generate a random variable with probability density function (1) is given below:

  1. Generate $Y \sim \mathrm{Beta}(k+1, k+1)$ and set $X = \pi Y$.

  2. Generate $U$ from the uniform distribution $\mathrm{U}(0, 1)$.

  3. Accept $X$ if $U \le \left(\dfrac{\pi^2 \sin(X)}{4\,X\,(\pi - X)}\right)^k$.

  4. Otherwise, return to Step 1.
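For illustration, here is a minimal R sketch of Steps 1–4 (ours, not the authors' randcorr implementation; the name sample.sink is hypothetical). The acceptance test is applied on the log scale to avoid underflow of the $k$-th power when $k$ is large:

sample.sink <- function(n, k) {
  # Draw n samples from p(x) proportional to sin^k(x), x in (0, pi)
  stopifnot(n >= 1, k >= 1)
  x <- numeric(n)
  for (i in seq_len(n)) {
    repeat {
      y <- pi * rbeta(1, k + 1, k + 1)   # Step 1: scaled Beta(k+1, k+1) proposal
      u <- runif(1)                      # Step 2: U ~ U(0, 1)
      # Step 3: accept if U <= (pi^2 sin(y) / (4 y (pi - y)))^k
      if (log(u) <= k * log(pi^2 * sin(y) / (4 * y * (pi - y)))) break
      # Step 4: otherwise, return to Step 1
    }
    x[i] <- y
  }
  x
}

# Sanity check: the sample mean should be close to the theoretical mean pi/2
mean(sample.sink(1e4, k = 50))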

The average number of iterations required to obtain a sample from the distribution (1) is given by (4). This is a strictly increasing function of $k$ whose first few values are

$$M_1 = \frac{\pi}{3} \approx 1.0472, \quad M_2 = \frac{16}{15} \approx 1.0667, \quad M_3 = \frac{12\pi}{35} \approx 1.0766.$$

The maximum of $M$ is given by

$$\lim_{k \to \infty} M = \frac{\pi}{2\sqrt{2}} \approx 1.1107,$$

which emphasises the efficiency of the proposed algorithm. In particular, for any value of $k \ge 1$, the maximum expected average number of iterations per sample is approximately $1.11$.
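A sketch of where this limit comes from (our derivation, using $B(k+1, k+1) = 1/\big((2k+1)\binom{2k}{k}\big)$ together with the asymptotics $\binom{2k}{k} \sim 4^k/\sqrt{\pi k}$ and $c_k \sim \sqrt{2\pi/k}$):

$$M = \frac{4^k\,\pi\,B(k+1, k+1)}{c_k} \sim \pi \cdot \frac{\sqrt{\pi k}}{2k+1} \cdot \sqrt{\frac{k}{2\pi}} \longrightarrow \frac{\pi}{2\sqrt{2}}, \quad k \to \infty.$$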

3 Application

The proposed rejection sampling algorithm is suitable for applications requiring repeated sampling from density (1), especially for large $k$. An example application is the problem of random sampling of a correlation matrix $\mathbf{C}$, which may be high dimensional. For this problem, we implement the procedure outlined in [2], which consists of two steps: (i) generate $p(p-1)/2$ angles $\theta_{ij}$ ($1 \le j < i \le p$) from the distribution (1) with $k = p - j$; and (ii) compute a lower triangular matrix $\mathbf{B}$ (equation (1) in [2]) from the angles and then the corresponding correlation matrix $\mathbf{C} = \mathbf{B}\mathbf{B}^\top$ (see [2] for details).

The procedure of Pourahmadi and Wang requires a computationally efficient method for sampling the angles $\theta_{ij}$. This is especially true for high dimensional correlation matrices, where $p$ (and therefore $k$) is large and the number of random variates required is of the order $p(p-1)/2 = O(p^2)$. Using our rejection sampling algorithm, we have implemented the Pourahmadi and Wang sampler in the Matlab and R numerical computing environments. Our program can generate a random $100 \times 100$ correlation matrix (i.e., 4,950 samples of $\theta_{ij}$) in 0.02s, and a random $1000 \times 1000$ correlation matrix (i.e., 499,500 samples of $\theta_{ij}$) in 0.5s on a commodity laptop. In contrast, the inverse transform sampling approach implemented in [2], running on comparable computing hardware, requires approximately … and … to generate a $100 \times 100$ and a $1000 \times 1000$ correlation matrix, respectively.
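To make steps (i) and (ii) concrete, here is a short R sketch that reuses sample.sink() from Section 2 (again ours, not the internals of the randcorr package; the name rand.corr is hypothetical):

rand.corr <- function(p) {
  # Build the lower triangular B of [2] from the sampled angles, then C = B B^T
  B <- diag(p)
  for (i in 2:p) {
    # Step (i): theta_{ij} has density proportional to sin^(p - j), j = 1, ..., i - 1
    theta <- sapply(seq_len(i - 1), function(j) sample.sink(1, p - j))
    s <- cumprod(sin(theta))             # partial products sin(theta_i1) ... sin(theta_ij)
    B[i, 1] <- cos(theta[1])
    if (i > 2) B[i, 2:(i - 1)] <- cos(theta[-1]) * s[-(i - 1)]
    B[i, i] <- s[i - 1]
  }
  B %*% t(B)                             # Step (ii): the correlation matrix C
}

C <- rand.corr(100)   # requires p(p - 1)/2 = 4,950 angle samples
range(diag(C))        # diagonal entries are 1 up to floating point error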

Appendix A Proof of Theorem 1

Proof.

We wish to show that $p(x) \le M\,g(x)$ for all $x \in (0, \pi)$ and $k \ge 1$. First, note that

$$\frac{p(x)}{M\,g(x)} = \left(\frac{\pi^2\,\sin(x)}{4\,x\,(\pi - x)}\right)^k,$$

so it suffices to show that $\sin(x) \le q(x)$ for all $x \in [0, \pi]$. The function

$$q(x) = \frac{4}{\pi^2}\,x\,(\pi - x)$$

is an interpolation polynomial at the nodes $\{0, \pi/2, \pi/2, \pi\}$ (a double node at $\pi/2$, since $q(\pi/2) = 1$ and $q'(\pi/2) = 0$) for the function $f(x) = \sin(x)$ [3]. The interpolation error for the polynomial $q(x)$ and some $\xi \in (0, \pi)$ is

$$\sin(x) - q(x) = \frac{f^{(4)}(\xi)}{4!}\,x \left(x - \frac{\pi}{2}\right)^2 (x - \pi) = \frac{\sin(\xi)}{24}\,x \left(x - \frac{\pi}{2}\right)^2 (x - \pi) \le 0,$$

as the only term that is negative is $(x - \pi)$. The inequality is strict everywhere except the boundary points ($x = 0$ and $x = \pi$) and the shared maximum of $\sin(x)$ and $q(x)$ ($x = \pi/2$). ∎

References

  • [1] C. P. Robert, G. Casella, Monte Carlo Statistical Methods, Springer, 2004.
  • [2] M. Pourahmadi, X. Wang, Distribution of random correlation matrices: Hyperspherical parameterization of the Cholesky factor, Statistics & Probability Letters 106 (2015) 5–12.
  • [3] M. R. (user:42969), How to prove $\sin(x) \le \frac{4}{\pi^2}\,x\,(\pi - x)$ for all $x \in [0, \pi]$?, Mathematics Stack Exchange, https://math.stackexchange.com/q/2030069 (2016).