Privacy-preserving machine learning aims to train models on private data...
In this note (work in progress towards a full-length paper) we show that...
Recurrent Neural Networks (RNNs) offer fast inference on long sequences ...
The ability to generate privacy-preserving synthetic versions of sensiti...
Differential Privacy (DP) provides a formal privacy guarantee preventing...
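The abstract above is truncated, so the specific mechanism it studies is unknown; as illustration only, a standard way to satisfy epsilon-DP for a numeric query is the Laplace mechanism, which adds noise calibrated to the query's sensitivity. The function name and parameters below are illustrative, not from the source.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value plus Laplace noise with scale sensitivity/epsilon.

    This satisfies epsilon-DP for a query whose output changes by at most
    `sensitivity` when a single individual's record is added or removed.
    """
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

rng = np.random.default_rng(0)
# Counting query: one person changes the count by at most 1, so sensitivity = 1.
noisy_count = laplace_mechanism(true_value=1000, sensitivity=1.0, epsilon=0.5, rng=rng)
```

Smaller epsilon means a stronger privacy guarantee and proportionally larger noise (scale = sensitivity/epsilon).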
Data augmentation is used in machine learning to make the classifier inv...
One aim shared by multiple settings, such as continual learning or trans...
In computer vision, it is standard practice to draw a single sample from...
Batch normalization is a key component of most image classification mode...
For infinitesimal learning rates, stochastic gradient descent (SGD) foll...
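The claim above, that SGD with an infinitesimal learning rate follows a gradient flow, can be checked numerically on a toy objective: gradient descent is the Euler discretization of the ODE dw/dt = -∇f(w), so its iterates approach the flow's solution as the step size shrinks. The example below (full-batch descent on f(w) = ½w², my choice, not the source's setting) compares against the closed-form flow w(t) = w₀e^(−t).

```python
import numpy as np

# For f(w) = 0.5 * w**2, the gradient flow dw/dt = -w has solution
# w(t) = w0 * exp(-t). Gradient descent with step size eta is the
# Euler discretization of this ODE, so smaller eta tracks it better.
def gd_final(w0, eta, t_total):
    w = w0
    for _ in range(int(t_total / eta)):
        w -= eta * w          # w <- w - eta * grad f(w)
    return w

w0, t = 1.0, 1.0
flow = w0 * np.exp(-t)
err_big = abs(gd_final(w0, 0.1, t) - flow)
err_small = abs(gd_final(w0, 0.001, t) - flow)
# err_small is far smaller: the discrete iterates converge to the flow
```

The stochastic case adds gradient noise on top of this drift, which is exactly what motivates the diffusion-style analyses these abstracts allude to.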
Batch Normalization is a key component in almost all state-of-the-art im...
Bootstrap Your Own Latent (BYOL) is a self-supervised learning approach ...
It has long been argued that minibatch stochastic gradient descent can g...
Batch normalization has multiple benefits. It improves the conditioning ...
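Several abstracts above turn on what batch normalization actually computes; as a reference point, the standard operation normalizes each feature to zero mean and unit variance over the minibatch, then applies a learnable scale and shift. A minimal NumPy sketch of the training-time forward pass (inference uses running statistics instead, omitted here):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Training-time batch norm for activations x of shape (batch, features).

    Each feature is standardized over the batch, then scaled by gamma
    and shifted by beta (both learnable, shape (features,)).
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 8))   # poorly-centered activations
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
# each output column now has ~zero mean and ~unit variance
```

The conditioning benefit mentioned above comes from this standardization: downstream layers see inputs on a consistent scale regardless of upstream drift.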
Adversarial training is an effective methodology for training deep neura...
The goal of this paper is to study why stochastic gradient descent (SGD)...
RMSProp and ADAM continue to be extremely popular algorithms for trainin...
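For reference, the Adam update the abstract above names maintains exponential moving averages of the gradient and its square, applies bias correction, and divides the step by the square root of the second moment (RMSProp is the same without the first-moment average and bias correction). The toy objective below is my choice for illustration.

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update with bias-corrected first and second moment estimates."""
    m = b1 * m + (1 - b1) * g            # moving average of gradients
    v = b2 * v + (1 - b2) * g**2         # moving average of squared gradients
    m_hat = m / (1 - b1**t)              # bias correction (t starts at 1)
    v_hat = v / (1 - b2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
w, m, v = 0.0, 0.0, 0.0
for t in range(1, 5001):
    g = 2 * (w - 3.0)
    w, m, v = adam_step(w, g, m, v, t, lr=0.05)
# w is driven toward the minimizer 3.0
```

The per-coordinate division by sqrt(v_hat) is what makes these methods adaptive, and also what the convergence analyses alluded to above have to grapple with.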
Humans interact with each other on a daily basis by developing and maint...
Currently, deep neural networks are deployed on low-power portable devic...
Human societies around the world interact with each other by developing ...
Style transfer is an important task in which the style of a source image...
Classical stochastic gradient methods for optimization rely on noisy gra...
Stochastic Gradient Descent (SGD) has become one of the most popular opt...
Variance reduction (VR) methods boost the performance of stochastic grad...
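The variance-reduction idea above can be made concrete with the SVRG-style control variate: replace the plain stochastic gradient gᵢ(w) with gᵢ(w) − gᵢ(w_ref) + ∇f(w_ref), where w_ref is a snapshot iterate. Both estimators are unbiased for the full gradient, but near the snapshot the corrected one has far lower variance. The least-squares setup below is an illustrative choice, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_i(w, i):
    """Gradient of the i-th term 0.5 * (a_i @ w - b_i)**2."""
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    return A.T @ (A @ w - b) / n

w = rng.normal(size=d)                  # current iterate
w_ref = w + 0.01 * rng.normal(size=d)   # nearby snapshot
mu = full_grad(w_ref)                   # full gradient at the snapshot

# Both estimators average to full_grad(w); compare their spread.
plain = np.array([grad_i(w, i) for i in range(n)])
svrg = np.array([grad_i(w, i) - grad_i(w_ref, i) + mu for i in range(n)])
var_plain = plain.var(axis=0).sum()
var_svrg = svrg.var(axis=0).sum()
# var_svrg is orders of magnitude smaller when w is close to w_ref
```

This shrinking variance near snapshots is what lets VR methods use constant step sizes and achieve linear convergence on finite sums, which is the performance boost the abstract refers to.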
The increasing complexity of deep learning architectures is resulting in...
Given the large number of new musical tracks released each year, automat...