
A Brief Study of In-Domain Transfer and Learning from Fewer Samples Using a Few Simple Priors
Domain knowledge can often be encoded in the structure of a network, such as convolutional layers for vision, which has been shown to increase generalization and decrease sample complexity (the number of samples required for successful learning). In this study, we ask whether sample complexity can be reduced for systems where the structure of the domain is unknown beforehand, so that both the structure and the parameters must be learned from the data. We show that sample complexity reduction through learning structure is possible for at least two simple cases. In studying these cases, we also gain insight into how this might be done for more complex domains.
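One standard intuition behind the convolutional example above, sketched here for illustration (this counting exercise is not taken from the paper itself): a convolutional layer bakes in a translation-equivariance prior by sharing weights across spatial positions, so it has far fewer free parameters than a fully connected layer over the same input, which is one common explanation for its lower sample complexity.

```python
# Parameter-count comparison for a dense layer vs. a convolutional layer
# producing a comparable output. Fewer free parameters is a rough proxy for
# lower sample complexity when the structural prior matches the domain.

def dense_params(in_size: int, out_size: int) -> int:
    """Weights plus biases of a fully connected layer."""
    return in_size * out_size + out_size

def conv_params(kernel: int, in_ch: int, out_ch: int) -> int:
    """Weights plus biases of a 2-D convolutional layer.

    The kernel is shared across all spatial positions, so the count is
    independent of the input's height and width.
    """
    return kernel * kernel * in_ch * out_ch + out_ch

# Hypothetical setting: a 28x28 grayscale image mapped to 32 feature maps
# of the same spatial size.
dense = dense_params(28 * 28, 28 * 28 * 32)  # one weight per (input, output) pair
conv = conv_params(3, 1, 32)                 # 3x3 kernel, 1 input channel, 32 output channels

print(dense, conv)  # prints "19694080 320"
```

When the structural assumption (here, translation equivariance) actually holds for the domain, the constrained model searches a vastly smaller hypothesis space; the study's question is whether a comparable reduction is achievable when such structure must itself be discovered from data.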