We introduce a library, Dataset Grouper, to create large-scale group-str...
We study stochastic optimization with linearly correlated noise. Our stu...
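One minimal way to formalize this setting, offered here only as a sketch (the mixing coefficients and notation below are assumptions, not taken from the excerpt above): a gradient iteration whose additive noise is a linear mixture of independent innovations, so the noise is correlated across iterations.

```latex
% Sketch: gradient iteration with linearly correlated noise.
% The coefficients b_{t,s} are assumed notation; choosing
% b_{t,s} = 1[s = t] recovers the usual independent-noise setting.
\[
  x_{t+1} = x_t - \eta \bigl( \nabla f(x_t) + \xi_t \bigr),
  \qquad
  \xi_t = \sum_{s \le t} b_{t,s}\, w_s,
  \quad w_s \ \text{i.i.d.}
\]
```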
Federated learning (FL) is a general framework for learning across heter...
Federated learning (FL) is a framework for machine learning across heter...
Personalized federated learning considers learning models unique to each...
A significant bottleneck in federated learning is the network communicat...
We study iterated vector fields and investigate whether they are conserv...
Federated learning methods typically learn a model by iteratively sampli...
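For concreteness, here is a minimal FedAvg-style round in Python, illustrating the "sample clients, train locally, average" pattern the entry above refers to; the toy least-squares objective, cohort size, and step counts are illustrative assumptions, not details of any specific method in this list.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.1, steps=10, batch=8):
    """A few mini-batch SGD steps on one client's least-squares loss."""
    w = w.copy()
    for _ in range(steps):
        idx = rng.choice(len(X), size=min(batch, len(X)), replace=False)
        Xb, yb = X[idx], y[idx]
        grad = Xb.T @ (Xb @ w - yb) / len(idx)  # gradient of 0.5*||Xw - y||^2
        w -= lr * grad
    return w

def fedavg_round(w, clients, num_sampled=4):
    """One round: sample a client cohort, train locally, average the results."""
    cohort = rng.choice(len(clients), size=num_sampled, replace=False)
    local_models = [local_sgd(w, *clients[i]) for i in cohort]
    return np.mean(local_models, axis=0)  # uniform average of client models

# Toy heterogeneous clients: each holds its own least-squares problem.
clients = [(rng.normal(size=(32, 5)), rng.normal(size=32)) for _ in range(10)]
w = np.zeros(5)
for _ in range(20):
    w = fedavg_round(w, clients)
```

Uniform averaging is the simplest aggregation choice; weighted variants (e.g., by client dataset size) are also common in practice.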
The federated learning (FL) framework trains a machine learning model us...
We study a family of algorithms, which we refer to as local update metho...
We study a family of algorithms, which we refer to as local update metho...
Federated learning is a distributed machine learning paradigm in which a...
Federated learning (FL) is a machine learning setting where many clients...
Mini-batch stochastic gradient descent (SGD) approximates the gradient o...
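A short sketch of the idea named in the entry above, assuming a toy least-squares objective: the mini-batch gradient is an unbiased estimator of the full-dataset gradient, so averaging many such estimates concentrates around the exact gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = rng.normal(size=1000)
w = np.zeros(5)

def full_gradient(w):
    """Exact gradient of the average loss 0.5 * mean((Xw - y)^2)."""
    return X.T @ (X @ w - y) / len(X)

def minibatch_gradient(w, batch=32):
    """Unbiased estimate of full_gradient from a random mini-batch."""
    idx = rng.choice(len(X), size=batch, replace=False)
    return X[idx].T @ (X[idx] @ w - y[idx]) / batch

# Averaging many mini-batch estimates recovers the exact gradient.
est = np.mean([minibatch_gradient(w) for _ in range(1000)], axis=0)
print(np.linalg.norm(est - full_gradient(w)))  # small
```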
To improve the resilience of distributed training to worst-case, or Byza...
Adversarial training is a technique for training robust machine learning...
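The adversarial training literature commonly frames this as a min-max problem; the general formulation of Madry et al. (2018) is stated below for orientation, and is not necessarily the exact variant studied in this paper.

```latex
% Robust-optimization view of adversarial training:
% epsilon bounds the perturbation budget of the inner adversary.
\[
  \min_{\theta} \; \mathbb{E}_{(x, y) \sim \mathcal{D}}
  \Big[ \max_{\|\delta\| \le \epsilon} \ell\big(f_\theta(x + \delta),\, y\big) \Big]
\]
```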
Data augmentation (DA) is commonly used during model training, as it sig...
We present ErasureHead, a new approach for distributed gradient descent ...
State-of-the-art machine learning models frequently misclassify inputs t...
Distributed model training suffers from communication overheads due to f...
Gradient descent and its many variants, including mini-batch stochastic ...
Distributed model training is vulnerable to worst-case system failures a...
Distributed algorithms are often beset by the straggler effect, where th...
We establish novel generalization bounds for learning algorithms that co...
Subspace clustering is the process of identifying a union of subspaces m...
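In symbols, the union-of-subspaces model behind this task is as follows (standard notation, supplied here for orientation): the data lie near a union of low-dimensional subspaces, and the goal is to recover the subspaces and the assignment of points to them.

```latex
% Union-of-subspaces model: N points in R^d lying on k subspaces,
% each of dimension much smaller than the ambient dimension.
\[
  x_1, \dots, x_N \in \bigcup_{i=1}^{k} S_i,
  \qquad S_i \subseteq \mathbb{R}^d,
  \quad \dim(S_i) = d_i \ll d.
\]
```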