Privacy-preserving machine learning aims to train models on private data...
In this note (work in progress towards a full-length paper) we show that...
Recurrent Neural Networks (RNNs) offer fast inference on long sequences ...
The ability to generate privacy-preserving synthetic versions of sensiti...
Skip connections and normalisation layers form two standard architectura...
Differential Privacy (DP) provides a formal privacy guarantee preventing...
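For context, the formal guarantee referred to above is the standard (ε, δ)-differential-privacy definition, stated here from general knowledge rather than quoted from the abstract: a randomised mechanism M is (ε, δ)-DP if, for every pair of adjacent datasets D, D' and every set of outcomes S,

\[ \Pr[M(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[M(D') \in S] + \delta. \]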
In computer vision, it is standard practice to draw a single sample from...
Batch normalization is a key component of most image classification mode...
For infinitesimal learning rates, stochastic gradient descent (SGD) foll...
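For context, the gradient-flow limit mentioned here is the standard ordinary differential equation (stated from general knowledge, not quoted from the abstract)

\[ \frac{d\theta}{dt} = -\nabla_\theta L(\theta), \]

i.e. in the limit of infinitesimal learning rates the SGD iterates trace the continuous-time descent path on the full-batch loss L.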
Batch Normalization is a key component in almost all state-of-the-art im...
Recent work has observed that one can outperform exact inference in Baye...
It has long been argued that minibatch stochastic gradient descent can g...
Batch normalization has multiple benefits. It improves the conditioning ...
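As background for the batch-normalization abstracts above, here is a minimal NumPy sketch of the standard batch-norm transform. It is purely illustrative: the function and variable names are mine, not the papers', and the running statistics used at inference time are omitted.

    import numpy as np

    def batch_norm(x, gamma, beta, eps=1e-5):
        # x: activations of shape (batch, features)
        # gamma, beta: learnable per-feature scale and shift
        mean = x.mean(axis=0)                    # per-feature batch mean
        var = x.var(axis=0)                      # per-feature batch variance
        x_hat = (x - mean) / np.sqrt(var + eps)  # normalise each feature
        return gamma * x_hat + beta              # restore scale and shift

    # Example: normalise a batch of 32 examples with 8 features.
    x = np.random.randn(32, 8)
    y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))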
We investigate how the final parameters found by stochastic gradient des...
Natural gradient descent (NGD) minimises the cost function on a Riemanni...
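For context, the natural-gradient update referred to here takes the standard form (stated from general knowledge, not quoted from the abstract)

\[ \theta_{t+1} = \theta_t - \eta \, F(\theta_t)^{-1} \nabla_\theta L(\theta_t), \]

where F is the Fisher information matrix defining the Riemannian metric and η is the learning rate.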
Experimental evidence indicates that simple models outperform complex de...
It is common practice to decay the learning rate. Here we show one can u...
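The common practice named in this (truncated) abstract is the usual step-decay learning-rate schedule. A minimal illustrative sketch in Python follows; the base rate, decay factor and decay interval are hypothetical values, not taken from the paper.

    def step_decay_lr(step, base_lr=0.1, decay_factor=0.1, decay_every=30_000):
        # Divide the learning rate by 10 every `decay_every` training steps.
        return base_lr * (decay_factor ** (step // decay_every))

    # Learning rate seen at a few points in training
    for step in (0, 30_000, 60_000, 90_000):
        print(step, step_decay_lr(step))  # ~0.1, ~0.01, ~0.001, ~0.0001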
This paper tackles two related questions at the heart of machine learnin...
Usually bilingual word vectors are trained "online". Mikolov et al. show...
Algorithms which sort lists of real numbers into ascending order have be...