Mixed Formal Learning: A Path to Transparent Machine Learning

01/20/2019
by Sandra Carrico, et al.

This paper presents Mixed Formal Learning, a new architecture whose first element learns models based on formal mathematical representations of the domain of interest and exposes latent variables. The second element learns a particular skill, typically via traditional prediction or classification mechanisms. Our key findings are that this architecture: (1) facilitates transparency by exposing key latent variables based on a learned mathematical model; (2) enables low-shot and zero-shot training of machine learning models without sacrificing accuracy or recall.
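The two-element architecture described above can be illustrated with a minimal sketch. Everything here is hypothetical and not from the paper: element one fits an assumed formal model (a linear law y = a·x) to each example and exposes the fitted coefficient as an interpretable latent variable; element two is a trivial one-parameter threshold classifier trained on that latent.

```python
# Hedged sketch of a two-element "mixed formal learning" pipeline.
# All model choices below (linear law, threshold classifier) are
# illustrative assumptions, not the paper's actual method.

def fit_formal_model(xs, ys):
    """Element 1: least-squares fit of y = a*x; exposes the latent `a`."""
    num = sum(x * y for x, y in zip(xs, ys))
    den = sum(x * x for x in xs)
    return num / den

def train_threshold(latents, labels):
    """Element 2: midpoint between class means as a one-parameter classifier."""
    pos = [v for v, lab in zip(latents, labels) if lab == 1]
    neg = [v for v, lab in zip(latents, labels) if lab == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

# Toy data: two regimes distinguished only by slope.
series = [([1, 2, 3], [2, 4, 6], 1),        # slope ~2.0 -> class 1
          ([1, 2, 3], [0.5, 1.0, 1.5], 0)]  # slope ~0.5 -> class 0
latents = [fit_formal_model(xs, ys) for xs, ys, _ in series]
thresh = train_threshold(latents, [lab for _, _, lab in series])
pred = [1 if v > thresh else 0 for v in latents]
```

Because the downstream classifier sees only the exposed latent, its decisions are transparent (a single interpretable threshold on a physically meaningful coefficient), which mirrors the transparency claim in the abstract.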


Related research

- Rethinking Zero-shot Neural Machine Translation: From a Perspective of Latent Variables (09/10/2021)
- Parameter Space Factorization for Zero-Shot Learning across Tasks and Languages (01/30/2020)
- Learning Aligned Cross-Modal Representation for Generalized Zero-Shot Classification (12/24/2021)
- Latent Embeddings for Zero-shot Classification (03/29/2016)
- Bayesian task embedding for few-shot Bayesian optimization (01/02/2020)
- Neural Ideal Point Estimation Network (04/26/2019)
