Fast post-hoc method for updating moments of large datasets
Moments of large datasets depend on the dataset mean; consequently, appending data traditionally requires updating the mean and then recalculating the moment over the full dataset. This means that metrics such as the standard deviation, the R^2 correlation, and other statistics must be `refreshed' whenever the dataset is updated, which demands that the full data be stored and can take a long time to process. Here, a method is shown for updating moments that requires only the previous moments (which are far cheaper to store) and the newly appended data. This leads to a dramatic decrease in data storage requirements and a significant computational speed-up for large datasets or low-order moments (n ≲ 10).
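To illustrate the general idea, the sketch below merges stored summary statistics with a new batch of data using the well-known pairwise-update formulas for the mean and second central moment (in the style of Chan et al. / Pébay); it is not the paper's exact formulation, which extends to higher-order moments, and the function and variable names (`combine_moments`, `m2`, etc.) are illustrative assumptions.

```python
import numpy as np


def combine_moments(n_a, mean_a, m2_a, new_data):
    """Update stored count, mean, and second central moment (M2) with a new batch.

    Only the previous summaries are needed; the original data never has to be re-read.
    """
    new_data = np.asarray(new_data, dtype=float)
    n_b = new_data.size
    mean_b = new_data.mean()
    m2_b = ((new_data - mean_b) ** 2).sum()

    n = n_a + n_b
    delta = mean_b - mean_a

    mean = mean_a + delta * n_b / n              # updated mean
    m2 = m2_a + m2_b + delta**2 * n_a * n_b / n  # updated sum of squared deviations

    return n, mean, m2


# Usage: stream a large dataset in chunks, keeping only the running summaries.
rng = np.random.default_rng(0)
full = rng.normal(size=100_000)

n, mean, m2 = 0, 0.0, 0.0
for chunk in np.array_split(full, 10):
    n, mean, m2 = combine_moments(n, mean, m2, chunk)

print(mean, np.sqrt(m2 / n))   # matches full.mean(), full.std()
print(full.mean(), full.std())
```

The same pattern extends to higher central moments, with the storage cost per dataset fixed at one number per retained moment rather than the full data.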