Representative period selection for power system planning using autoencoder-based dimensionality reduction

04/28/2022
by Marc Barbar et al.

Power-sector capacity expansion models (CEMs) used to study future low-carbon grid scenarios must incorporate a detailed representation of grid operations. CEMs are often formulated to model grid operations over representative periods sampled from the original input data with clustering algorithms. However, such representative period selection (RPS) methods are limited by the declining efficacy of clustering algorithms as the dimensionality of the input data grows, and they do not consider the relative importance of input-data variations for CEM outcomes. Here, we propose an RPS method that addresses these limitations by applying dimensionality reduction, via neural-network-based autoencoders, prior to clustering. This dimensionality reduction not only improves the performance of the clustering algorithm but also makes it feasible to include additional features, such as estimated outputs produced by solving simplified versions of the CEM in parallel for each disjoint period in the input data (e.g. one week). The impact of incorporating dimensionality reduction into RPS methods is quantified through the error in outcomes of the corresponding reduced-space CEM versus the full-space CEM. Extensive numerical experiments across various networks and a range of technology and policy scenarios establish the superiority of dimensionality-reduction-based RPS methods.
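The abstract does not specify the authors' exact network architecture or clustering setup, but the pipeline it describes (encode each disjoint period into a low-dimensional latent space with an autoencoder, then cluster the latent codes and pick a representative period per cluster) can be sketched as follows. Everything here is an illustrative assumption: the synthetic weekly demand data, the 168-to-4 single-hidden-layer autoencoder, and the plain k-means step are stand-ins, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for hourly demand data: 52 disjoint weeks x 168 hours,
# with an annual trend, a diurnal shape, and measurement noise.
hours = np.arange(168)
seasonal = np.sin(2 * np.pi * np.arange(52) / 52)
daily = np.sin(2 * np.pi * hours / 24)
X = 1.0 + 0.3 * seasonal[:, None] + 0.2 * daily[None, :]
X += 0.02 * rng.standard_normal(X.shape)
X = (X - X.mean()) / X.std()            # normalise before training

# Single-hidden-layer autoencoder (168 -> 4 -> 168): tanh encoder, linear
# decoder, trained by full-batch gradient descent on reconstruction MSE.
d_in, d_lat, lr = X.shape[1], 4, 0.05
W1 = 0.1 * rng.standard_normal((d_in, d_lat)); b1 = np.zeros(d_lat)
W2 = 0.1 * rng.standard_normal((d_lat, d_in)); b2 = np.zeros(d_in)
for _ in range(500):
    H = np.tanh(X @ W1 + b1)            # latent codes
    E = (H @ W2 + b2) - X               # reconstruction error
    gW2 = H.T @ E / len(X); gb2 = E.mean(0)
    dH = (E @ W2.T) * (1 - H**2)        # back-propagate through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

Z = np.tanh(X @ W1 + b1)                # 4-dimensional embedding of each week

# Plain k-means in the latent space; the week nearest each centroid becomes
# a representative period, weighted by how many weeks its cluster contains.
k = 4
centroids = Z[rng.choice(len(Z), k, replace=False)]
for _ in range(50):
    labels = np.argmin(((Z[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
    centroids = np.array([Z[labels == j].mean(0) if np.any(labels == j)
                          else centroids[j] for j in range(k)])

rep_weeks = [int(np.argmin(((Z - c) ** 2).sum(-1))) for c in centroids]
weights = [int((labels == j).sum()) for j in range(k)]
print("representative weeks:", rep_weeks, "cluster weights:", weights)
```

A reduced-space CEM would then be solved over only the `rep_weeks` periods, with each period's objective contribution scaled by its cluster weight; the abstract's proposed extension additionally appends CEM-derived features (outputs of simplified per-period model solves) to each week's input vector before encoding.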


Related research

03/05/2018  Deep Continuous Clustering
Clustering high-dimensional datasets is hard because interpoint distance...

05/06/2018  Branching embedding: A heuristic dimensionality reduction algorithm based on hierarchical clustering
This paper proposes a new dimensionality reduction algorithm named branc...

06/10/2022  Hierarchical mixtures of Gaussians for combined dimensionality reduction and clustering
To avoid the curse of dimensionality, a common approach to clustering hi...

03/06/2020  BasisVAE: Translation-invariant feature-level clustering with Variational Autoencoders
Variational Autoencoders (VAEs) provide a flexible and scalable framewor...

12/22/2019  Interpretable Embeddings From Molecular Simulations Using Gaussian Mixture Variational Autoencoders
Extracting insight from the enormous quantity of data generated from mol...

02/09/2022  Reducing Redundancy in the Bottleneck Representation of the Autoencoders
Autoencoders are a type of unsupervised neural networks, which can be us...

11/14/2018  Unsupervised learning with contrastive latent variable models
In unsupervised learning, dimensionality reduction is an important tool ...
