Flexible model composition in machine learning and its implementation in MLJ

12/31/2020
by Anthony D. Blaom, et al.

A graph-based protocol called "learning networks", which combines assorted machine learning models into meta-models, is described. Learning networks are shown to overcome several limitations of model composition as implemented in the dominant machine learning platforms. After illustrating the protocol in simple examples, a concise syntax for specifying a learning network, implemented in the MLJ framework, is presented. Using the syntax, it is shown that learning networks are sufficiently flexible to include Wolpert's model stacking, with out-of-sample predictions for the base learners.
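The abstract refers to MLJ's learning-network syntax without showing it. As a rough illustration only, here is a minimal sketch of a two-stage learning network, assuming MLJ's documented primitives (`source`, `machine`, `transform`, `predict`, `fit!`) and that the MLJLinearModels package is installed; the data from `make_regression` is synthetic:

```julia
using MLJ

# Synthetic regression data, for illustration only
X, y = make_regression(100, 3)

# Source nodes: the entry points of the computational graph
Xs = source(X)
ys = source(y)

# First stage: learn a column-wise standardization of the features
stand = machine(Standardizer(), Xs)
W = transform(stand, Xs)           # node delivering standardized features

# Second stage: ridge regression on the standardized features
# (assumes MLJLinearModels is available in the environment)
Ridge = @load RidgeRegressor pkg=MLJLinearModels
rgs = machine(Ridge(lambda=0.1), W, ys)
yhat = predict(rgs, W)             # terminal node delivering predictions

# Fitting the terminal node trains every machine in the network,
# in order of dependency
fit!(yhat)

# Calling the terminal node returns the network's predictions
yhat()
```

MLJ additionally provides a way to export such a network as a stand-alone composite model type, so the meta-model can be fit, tuned, and evaluated like any atomic model; the details of that export step are beyond this sketch.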
