Information-theoretic approaches to active learning have traditionally f...
The paper 'Deep Learning on a Data Diet' by Paul et al. (2021) introduce...
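Among the scores that paper proposes is EL2N, which ranks training examples by the L2 norm of the error vector (predicted class probabilities minus the one-hot label), typically averaged over several models early in training. A minimal single-checkpoint sketch (the function name `el2n_scores` is illustrative, not from the paper's code):

```python
import numpy as np

def el2n_scores(probs, labels_onehot):
    """EL2N score per example: L2 norm of the error vector
    (predicted class probabilities minus one-hot labels).
    Paul et al. average this over several models early in
    training; a single checkpoint is assumed here."""
    return np.linalg.norm(np.asarray(probs) - np.asarray(labels_onehot), axis=-1)
```

Examples the model already fits well score near zero and are candidates for pruning from the training set.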
Batch active learning is a popular approach for efficiently training mac...
Active learning is a powerful method for training machine learning model...
The mutual information between predictions and model parameters – also r...
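This quantity (the BALD score) can be estimated from K stochastic forward passes, e.g. MC-dropout samples. A minimal sketch, assuming `probs` holds sampled class probabilities:

```python
import numpy as np

def bald_scores(probs):
    """Mutual information between predictions y and parameters θ,
    estimated from samples: I(y; θ) = H[y|x, D] − E_θ H[y|x, θ].
    probs: array (K, N, C) of class probabilities from K sampled
    models (e.g. MC-dropout passes) over N pool points."""
    eps = 1e-12
    mean_p = probs.mean(axis=0)                                    # predictive distribution
    pred_entropy = -(mean_p * np.log(mean_p + eps)).sum(-1)        # H[y | x, D]
    cond_entropy = -(probs * np.log(probs + eps)).sum(-1).mean(0)  # E_θ H[y | x, θ]
    return pred_entropy - cond_entropy
```

Points on which the sampled models disagree confidently get high scores: two models that each predict a different class with certainty yield a score of log 2, while models that agree (even if uncertain) yield a score near zero.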
Training on web-scale data can take months. But most computation and tim...
Principled Bayesian deep learning (BDL) does not live up to its potentia...
Jiang et al. (2021) give empirical evidence that the average test error ...
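The quantity in question is simple to measure: train two models from independent random seeds and count how often their predictions differ on held-out inputs. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def disagreement_rate(preds_a, preds_b):
    """Fraction of inputs on which two independently trained models
    predict different labels. Jiang et al. observe empirically that
    this tracks the pair's average test error, giving an error
    estimate that needs no test labels."""
    preds_a, preds_b = np.asarray(preds_a), np.asarray(preds_b)
    return float(np.mean(preds_a != preds_b))
```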
Estimating personalized treatment effects from high-dimensional observat...
We introduce Goldilocks Selection, a technique for faster model training...
Information theory is of importance to machine learning, but the notatio...
In active learning, new labels are commonly acquired in batches. However...
Active Learning is essential for more label-efficient deep learning. Bay...
We show that a single softmax neural net with minimal changes can beat t...
We develop BatchEvaluationBALD, a new acquisition function for deep Baye...
The information bottleneck (IB) principle offers both a mechanism to exp...
We develop BatchBALD, a tractable approximation to the mutual informatio...
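BatchBALD scores a candidate batch by the joint mutual information between the batch's predictions and the model parameters, and builds the batch greedily. The sketch below, with illustrative names, estimates the joint predictive distribution by enumerating class configurations exactly, which is feasible only for small batch sizes and class counts; the actual method uses a sampling approximation to scale further:

```python
import numpy as np

def batchbald_greedy(probs, batch_size):
    """Greedy batch selection maximizing joint mutual information
    I(y_1..y_b; θ) = H[y_1..y_b] − Σ_i E_θ H[y_i | θ].
    probs: array (K, N, C) of class probabilities from K sampled
    models over N pool points. Enumerates all C**b class
    configurations, so only suitable for small b and C."""
    K, N, C = probs.shape
    eps = 1e-12
    # Per-point conditional entropy term E_θ H[y_i | x_i, θ].
    cond_entropy = -(probs * np.log(probs + eps)).sum(-1).mean(0)
    chosen, total_cond = [], 0.0
    # P_joint[k, m]: probability under model k of the m-th class
    # configuration of the points chosen so far.
    P_joint = np.ones((K, 1))
    for _ in range(batch_size):
        best, best_score, best_ext = None, -np.inf, None
        for n in range(N):
            if n in chosen:
                continue
            # Extend each configuration with candidate n's classes.
            ext = (P_joint[:, :, None] * probs[:, n, None, :]).reshape(K, -1)
            marg = ext.mean(0)                                # joint predictive
            joint_entropy = -(marg * np.log(marg + eps)).sum()
            score = joint_entropy - (total_cond + cond_entropy[n])
            if score > best_score:
                best, best_score, best_ext = n, score, ext
        chosen.append(best)
        P_joint = best_ext
        total_cond += cond_entropy[best]
    return chosen
```

Unlike taking the top-b BALD scores, the joint objective penalizes redundancy: a duplicate of an already-chosen point adds no joint entropy, so the greedy step prefers informative points whose disagreement pattern differs from the batch so far.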