A tool set for random number generation on GPUs in R

01/17/2022
by Ruoyong Xu, et al.

We introduce the R package clrng, which leverages the gpuR package to generate random numbers in parallel on a Graphics Processing Unit (GPU) using the clRNG (OpenCL) library. Parallel processing on GPUs can speed up computationally intensive tasks and, when combined with R, helps offset R's limitations in speed, memory usage, and its single-threaded computation model. clrng enables reproducible research by setting initial seeds for random number streams on the GPU and CPU, and can thus accelerate several types of statistical simulation and modelling. The random number generator in clrng guarantees independent parallel samples even when R is used interactively in an ad-hoc manner, with sessions being interrupted and restored. The package is portable and flexible: developers can reuse its random number generation kernels for a variety of other purposes and applications.

