On the contribution of pre-trained models to accuracy and utility in modeling distributed energy resources

02/22/2023
by Hussain Kazmi, et al.

Despite their growing popularity, data-driven models of real-world dynamical systems require large amounts of data. Due to sensing limitations as well as privacy concerns, however, such data is not always available, especially in domains such as energy. Pre-trained models, built using data gathered in similar contexts, have shown enormous potential in addressing these concerns: they can improve predictive accuracy at a much lower cost in observational data. In theory, however, this improvement is neither uniform across agents nor guaranteed, owing to the risk of negative transfer. In this paper, using data from several distributed energy resources, we investigate and report preliminary findings on several key questions in this regard. First, we evaluate the improvement in predictive accuracy due to pre-trained models, both with and without fine-tuning. Second, we consider the question of fairness: do pre-trained models create equal improvements for heterogeneous agents, and how does this translate to downstream utility? Answering these questions can help improve the creation, fine-tuning, and adoption of such pre-trained models.

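To make the comparison described in the abstract concrete, the sketch below contrasts three strategies on a data-scarce "target" agent: training from scratch on local data only, applying a pre-trained model without adaptation, and fine-tuning the pre-trained model on the local data. This is a minimal illustrative sketch, not the authors' models or data: the synthetic `synth_agent` generator, the MLPRegressor architecture, and all hyperparameters are placeholder assumptions.

```python
# Hedged sketch (assumed setup, not the paper's code): compare from-scratch,
# pre-trained, and fine-tuned models on a small target dataset. Synthetic data
# stands in for distributed-energy-resource (e.g. household load) observations.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

def synth_agent(n, shift):
    """Toy load model: target depends on two features plus an agent-specific shift."""
    X = rng.normal(size=(n, 2))
    y = 1.5 * X[:, 0] - 0.5 * X[:, 1] + shift + 0.1 * rng.normal(size=n)
    return X, y

# "Source" agents with abundant data (used only for pre-training).
X_src, y_src = synth_agent(5000, shift=0.0)
# "Target" agent with scarce local observations, plus a held-out test set.
X_tgt, y_tgt = synth_agent(100, shift=0.3)
X_test, y_test = synth_agent(1000, shift=0.3)

# 1) From-scratch model: only sees the scarce target data.
scratch = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
scratch.fit(X_tgt, y_tgt)

# 2) Pre-trained model: trained on pooled source data, applied without adaptation.
pretrained = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
pretrained.fit(X_src, y_src)

# 3) Fine-tuned model: continue training the pre-trained weights on target data.
finetuned = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0,
                         warm_start=True)
finetuned.fit(X_src, y_src)          # pre-training phase
finetuned.set_params(max_iter=100, learning_rate_init=1e-4)
finetuned.fit(X_tgt, y_tgt)          # fine-tuning phase (warm_start keeps weights)

for name, model in [("scratch", scratch), ("pre-trained", pretrained),
                    ("fine-tuned", finetuned)]:
    print(f"{name:12s} MAE: {mean_absolute_error(y_test, model.predict(X_test)):.3f}")
```

Running the comparison per agent (rather than in aggregate) is what exposes the fairness question raised in the abstract: the accuracy gain from pre-training or fine-tuning can differ across heterogeneous agents, and can even be negative when the source and target distributions diverge (negative transfer).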