A quasi-Monte Carlo data compression algorithm for machine learning

04/06/2020
by Josef Dick et al.

We introduce an algorithm to reduce large data sets using so-called digital nets, which are well-distributed point sets in the unit cube. These point sets, together with weights that depend on the data set, are used to represent the data. We show that this representation reduces the computational effort needed to find good parameters in machine learning algorithms. To illustrate the method, we provide numerical examples for neural networks.
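The idea of representing a data set by well-distributed points plus data-dependent weights can be sketched in a few lines. The following is a minimal illustration, not the paper's algorithm: it uses a one-dimensional base-2 van der Corput sequence (a simple example of a digital net) and assigns each datum to its nearest net point, so that the weighted net points approximate empirical averages over the full data set. The helper names `van_der_corput` and `compress` are hypothetical.

```python
def van_der_corput(n, base=2):
    """First n points of the base-b van der Corput sequence in [0, 1)."""
    points = []
    for i in range(n):
        x, denom = 0.0, 1.0
        while i > 0:
            denom *= base
            i, digit = divmod(i, base)
            x += digit / denom
        points.append(x)
    return points


def compress(data, net):
    """Assign each datum to its nearest net point; weight = count / len(data).

    The weighted net points then stand in for the full data set:
    sum(w_j * f(p_j)) approximates the empirical mean of f over the data.
    """
    weights = [0.0] * len(net)
    for x in data:
        j = min(range(len(net)), key=lambda k: abs(net[k] - x))
        weights[j] += 1.0 / len(data)
    return weights


net = van_der_corput(8)                 # 8 well-distributed points in [0, 1)
data = [0.11, 0.12, 0.13, 0.74, 0.76]   # toy data set
w = compress(data, net)

# The weighted mean over the 8 net points approximates the data mean (0.372):
approx_mean = sum(wi * pi for wi, pi in zip(w, net))
```

In higher dimensions one would replace the van der Corput sequence with a genuine digital net (e.g. a Sobol' point set), and choose the weights more carefully than by nearest-neighbour counting, but the compression principle is the same: training then runs over a small weighted point set instead of the full data.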

