
Transferring model structure in Bayesian transfer learning for Gaussian process regression

by Milan Papež, et al. (Trinity College Dublin; Akademie věd ČR)

Bayesian transfer learning (BTL) is defined in this paper as the task of conditioning a target probability distribution on a transferred source distribution. The target globally models the interaction between the source and target, and conditions on a probabilistic data predictor made available by an independent local source modeller. Fully probabilistic design is adopted to solve this optimal decision-making problem in the target. By successfully transferring higher moments of the source, the target can reject unreliable source knowledge (i.e. it achieves robust transfer). This dual-modeller framework means that the source's local processing of raw data into a transferred predictive distribution – with compressive possibilities – is enriched by (the possible expertise of) the local source model. In addition, the introduction of the global target modeller allows correlation between the source and target tasks – if known to the target – to be accounted for. Important consequences emerge. Firstly, the new scheme attains the performance of fully modelled (i.e. conventional) multitask learning schemes in (those rare) cases where target model misspecification is avoided. Secondly, and more importantly, the new dual-modeller framework is robust to the model misspecification that undermines conventional multitask learning. We thoroughly explore these issues in the key context of interacting Gaussian process regression tasks. Experimental evidence from both synthetic and real data settings validates our technical findings: that the proposed BTL framework enjoys robustness in transfer while also being robust to model misspecification.
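To make the transfer mechanism concrete, the following is a minimal sketch (not the paper's exact fully-probabilistic-design algorithm) of moment-based transfer between two Gaussian process regression tasks. A local source modeller fits its own data and publishes only a predictive distribution (mean and variance) at query inputs; the target then folds those moments in as pseudo-observations whose noise equals the source's predictive variance, so that high-variance (unreliable) source knowledge is automatically down-weighted. All function names, hyperparameters, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between 1-D input vectors a and b.
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    # Standard GP regression: predictive mean and marginal variance.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(x_test, x_test)) - np.sum(v ** 2, axis=0) + noise
    return mean, var

rng = np.random.default_rng(0)
x_query = np.linspace(0, 5, 50)

# Source task: richly observed; its modeller shares only predictive moments.
x_src = np.linspace(0, 5, 30)
y_src = np.sin(x_src) + 0.1 * rng.standard_normal(30)
src_mean, src_var = gp_predict(x_src, y_src, x_query)

# Target task: sparsely observed.
x_tgt = np.linspace(0, 5, 8)
y_tgt = np.sin(x_tgt) + 0.1 * rng.standard_normal(8)

# Target conditions on the transferred predictor: the source's predictive
# variance (a higher moment than the mean alone) acts as per-point noise,
# so unreliable source knowledge carries little weight -- robust transfer.
x_aug = np.concatenate([x_tgt, x_query])
y_aug = np.concatenate([y_tgt, src_mean])
noise_aug = np.concatenate([np.full(len(x_tgt), 1e-2), src_var])

K = rbf_kernel(x_aug, x_aug) + np.diag(noise_aug)
Ks = rbf_kernel(x_aug, x_query)
mean_btl = Ks.T @ np.linalg.solve(K, y_aug)

rmse_btl = np.sqrt(np.mean((mean_btl - np.sin(x_query)) ** 2))
print(round(float(rmse_btl), 3))
```

Note that transferring the full predictive distribution, rather than only point predictions, is what gives the target the information it needs to discount a poorly informed source; the paper's FPD-based scheme formalises this conditioning optimally, whereas the pseudo-observation construction above is only a common approximation.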



