Flexible model composition in machine learning and its implementation in MLJ

12/31/2020
by Anthony D. Blaom, et al.

A graph-based protocol called "learning networks", which combines assorted machine learning models into meta-models, is described. Learning networks are shown to overcome several limitations of model composition as implemented in the dominant machine learning platforms. After illustrating the protocol in simple examples, a concise syntax for specifying a learning network, implemented in the MLJ framework, is presented. Using this syntax, it is shown that learning networks are sufficiently flexible to include Wolpert's model stacking, with out-of-sample predictions for the base learners.
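To give a flavour of the protocol, below is a minimal sketch of a two-layer learning network using MLJ's network-building primitives (source, machine, transform, predict, fit!). The synthetic dataset and the choice of Standardizer followed by a ridge regressor are illustrative assumptions, not an example taken from the paper.

    using MLJ

    # Illustrative synthetic regression data (100 rows, 3 features)
    X, y = make_regression(100, 3)

    # Wrap the data in source nodes; these are the roots of the network graph
    Xs = source(X)
    ys = source(y)

    # First layer: standardize the features
    stand = machine(Standardizer(), Xs)
    W = transform(stand, Xs)

    # Second layer: a regressor trained on the transformed features
    Ridge = @load RidgeRegressor pkg=MLJLinearModels verbosity=0
    ridge = machine(Ridge(), W, ys)
    yhat = predict(ridge, W)

    # Fitting the terminal node trains every machine in the network,
    # respecting the dependency order implied by the graph
    fit!(yhat)

    yhat()   # call the node to obtain predictions on the training data

Because each node caches its learned state, changing a hyper-parameter and calling fit! again retrains only the affected part of the graph, which is one of the composition advantages the paper discusses.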
