Transferring model structure in Bayesian transfer learning for Gaussian process regression

01/18/2021
by Milan Papež, et al.

Bayesian transfer learning (BTL) is defined in this paper as the task of conditioning a target probability distribution on a transferred source distribution. The target globally models the interaction between the source and target, and conditions on a probabilistic data predictor made available by an independent local source modeller. Fully probabilistic design is adopted to solve this optimal decision-making problem in the target. By successfully transferring higher moments of the source, the target can reject unreliable source knowledge (i.e. it achieves robust transfer). This dual-modeller framework means that the source's local processing of raw data into a transferred predictive distribution – with compressive possibilities – is enriched by (the possible expertise of) the local source model. In addition, the introduction of the global target modeller allows correlation between the source and target tasks – if known to the target – to be accounted for. Important consequences emerge. Firstly, the new scheme attains the performance of fully modelled (i.e. conventional) multitask learning schemes in (those rare) cases where target model misspecification is avoided. Secondly, and more importantly, the new dual-modeller framework is robust to the model misspecification that undermines conventional multitask learning. We thoroughly explore these issues in the key context of interacting Gaussian process regression tasks. Experimental evidence from both synthetic and real data settings validates our technical findings: that the proposed BTL framework enjoys robustness in transfer while also being robust to model misspecification.
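The mechanism sketched in the abstract — a source modeller publishing only a predictive distribution (not raw data), which the target then conditions on, downweighting it by the source's own predictive uncertainty — can be illustrated with a minimal Gaussian process example. This is an illustrative sketch, not the paper's fully probabilistic design solution: it assumes perfectly correlated tasks sharing one latent function, and it approximates "conditioning on the transferred distribution" by treating the source's predictive moments as pseudo-observations whose noise variance equals the source's predictive variance. All function names and settings below are invented for the example.

```python
import numpy as np

def rbf(a, b, ls=1.0, amp=1.0):
    """Squared-exponential kernel matrix between 1-D input arrays a and b."""
    return amp * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_predict(X, y, Xq, noise_var):
    """Exact GP posterior mean and variance at query points Xq.

    noise_var is a per-observation noise-variance vector; allowing it to
    vary per point is what lets the target weight transferred source
    predictions by the source's own predictive uncertainty."""
    K = rbf(X, X) + np.diag(noise_var)
    L = np.linalg.cholesky(K)
    Kq = rbf(X, Xq)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Kq)
    mean = Kq.T @ alpha
    var = np.diag(rbf(Xq, Xq) - v.T @ v)
    return mean, var

f = np.sin  # shared latent function (tasks assumed perfectly correlated)

# Source task: densely observed.  Target task: only three observations.
Xs = np.linspace(-3.0, 3.0, 15); ys = f(Xs)
Xt = np.array([-2.0, 0.5, 2.5]); yt = f(Xt)
Xq = np.linspace(-3.0, 3.0, 50)  # target's query grid

# Target alone (no transfer).
mu_alone, _ = gp_predict(Xt, yt, Xq, 1e-4 * np.ones(len(Xt)))

# Step 1: the source modeller publishes only a predictive distribution
# (first two moments) at the target's query points -- no raw data crosses.
mu_src, var_src = gp_predict(Xs, ys, Xq, 1e-4 * np.ones(len(Xs)))

# Step 2: the target conditions on the transferred moments as
# pseudo-observations whose noise is the source's predictive variance,
# so unreliable (high-variance) source knowledge is downweighted.
X_aug = np.concatenate([Xt, Xq])
y_aug = np.concatenate([yt, mu_src])
n_aug = np.concatenate([1e-4 * np.ones(len(Xt)),
                        np.maximum(var_src, 0.0) + 1e-5])
mu_btl, _ = gp_predict(X_aug, y_aug, Xq, n_aug)

rmse_alone = np.sqrt(np.mean((mu_alone - f(Xq)) ** 2))
rmse_btl = np.sqrt(np.mean((mu_btl - f(Xq)) ** 2))
```

Because the transferred quantity carries its second moment (variance), the target automatically trusts the source less wherever the source is itself uncertain — the simplest version of the robust-transfer behaviour the abstract describes.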


