Contrastive learning has gained significant attention as a method for se...
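As a concrete illustration of the contrastive objective family named above, an InfoNCE-style loss for a single anchor can be sketched as follows; this is a minimal sketch, and the function name, temperature value, and NumPy formulation are illustrative assumptions rather than details taken from the abstract:

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    """InfoNCE-style loss for one anchor: pull the positive embedding
    close, push negatives away. Inputs are L2-normalized vectors;
    `negatives` has shape (num_negatives, dim)."""
    pos = anchor @ positive / tau            # similarity to the positive
    negs = negatives @ anchor / tau          # similarities to negatives
    logits = np.concatenate([[pos], negs])
    # cross-entropy with the positive at index 0
    return -pos + np.log(np.exp(logits).sum())
```

The loss is smaller when the anchor is closer to its positive than to the negatives, which is the behavior contrastive training exploits.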
We present a framework for using transformer networks as universal compu...
Devising a fair classifier that does not discriminate against different ...
Fine-tuning pretrained language models (LMs) without making any architec...
Word translation without parallel corpora has become feasible, rivaling ...
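The alignment step behind such parallel-corpus-free word translation is commonly posed as an orthogonal Procrustes problem over matched embedding rows; a minimal sketch, where the function name and the use of row-aligned anchor pairs are assumptions for illustration:

```python
import numpy as np

def procrustes_align(X, Y):
    """Orthogonal map W minimizing ||X @ W - Y||_F over orthogonal W,
    given row-aligned source embeddings X and target embeddings Y.
    Closed form: W = U @ Vt, where X.T @ Y = U @ S @ Vt (SVD)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt
```

When the target space really is an orthogonal transform of the source space, this closed form recovers that transform exactly.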
Minimizing risk with fairness constraints is one of the popular approach...
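One common way to operationalize a fairness constraint in risk minimization is to relax it into a penalty, for example on the demographic-parity gap; a minimal sketch, noting that demographic parity is only one of several fairness criteria and that the names and the multiplier λ here are illustrative:

```python
import numpy as np

def dp_gap(scores, groups):
    """Demographic-parity gap: absolute difference in mean predicted
    positives between the two groups (coded 0 and 1)."""
    return abs(scores[groups == 0].mean() - scores[groups == 1].mean())

def fair_objective(risk, scores, groups, lam=1.0):
    """Empirical risk plus a Lagrangian-style relaxation of the
    fairness constraint."""
    return risk + lam * dp_gap(scores, groups)
```

A classifier whose positive rates are balanced across groups pays no penalty, so the objective reduces to the unconstrained risk.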
It has been widely observed that large neural networks can be pruned to ...
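The pruning referred to above is most often done by weight magnitude; a minimal one-shot sketch, where the threshold rule and names are illustrative and real pipelines typically retrain after pruning:

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Zero out (at least) the smallest-magnitude fraction `sparsity`
    of the weights in `w`; ties at the threshold are also zeroed."""
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    # k-th smallest absolute value serves as the pruning threshold
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    return np.where(np.abs(w) <= thresh, 0.0, w)
```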
Mixup is a data augmentation method that generates new data points by mi...
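The mixing rule itself fits in a few lines: mixup as published samples a weight λ ~ Beta(α, α) and interpolates both the inputs and their one-hot labels. A minimal sketch, with the function signature and default α being illustrative choices:

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Mix two examples and their one-hot labels with a Beta-sampled
    weight, producing a new synthetic training point."""
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)     # lam lies in [0, 1]
    x = lam * x1 + (1 - lam) * x2
    y = lam * y1 + (1 - lam) * y2
    return x, y
```

Because λ ∈ [0, 1], the mixed input lies on the segment between the two originals and the mixed label remains a valid probability vector.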
A recent work by Ramanujan et al. (2020) provides significant empirical ...
Federated learning has been spotlighted as a way to train neural network...
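The server-side aggregation at the heart of such training, the FedAvg rule of McMahan et al., can be sketched as follows; representing each client's model as a list of NumPy arrays is an assumption for illustration:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Average per-layer parameters across clients, weighted by local
    dataset size. This is only the aggregation step of FedAvg; the
    local training rounds happen on each client before this call."""
    total = float(sum(client_sizes))
    num_layers = len(client_weights[0])
    return [
        sum((n / total) * w[k] for w, n in zip(client_weights, client_sizes))
        for k in range(num_layers)
    ]
```

Clients with more local data pull the global model further toward their parameters, which is exactly the size-weighted average.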
Due to its decentralized nature, Federated Learning (FL) lends itself to...
Recent advances in large-scale distributed learning algorithms have enab...
Coded distributed computing has been considered as a promising technique...
Coded computation is a framework which provides redundancy in distribute...
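A toy instance of this redundancy idea is a (3, 2) MDS-coded matrix-vector multiply, where results from any two of three workers suffice to finish the job; the block split and worker assignment below are illustrative:

```python
import numpy as np

# Split A row-wise; a third worker holds the "parity" block A1 + A2.
# Any 2 of the 3 worker results recover A @ x, so one straggler can
# simply be ignored: a (3, 2) MDS-coded computation.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)
A1, A2 = A[:2], A[2:]

r1 = A1 @ x            # worker 1
r2 = A2 @ x            # worker 2 (suppose this one straggles)
r3 = (A1 + A2) @ x     # worker 3 (parity)

# Decode without worker 2: its block is the parity minus worker 1's result.
recovered = np.concatenate([r1, r3 - r1])
```

The decode step is a single subtraction, which is why such schemes trade a little extra computation for robustness to the slowest worker.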
Coding for distributed computing supports low-latency computation by rel...
The capacity of the distributed storage system (DSS) is often discussed in t...
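For reference, the min-cut capacity usually discussed in this setting is that of the regenerating-code model of Dimakis et al., with per-node storage \(\alpha\), repair bandwidth \(\beta\) per helper, \(d\) helpers contacted per repair, and any \(k\) nodes sufficing to recover the file; it is stated here as background, not as a result of the work above:

```latex
C \;=\; \sum_{i=0}^{k-1} \min\{\alpha,\; (d-i)\beta\}
```

Each term is the contribution of one of the \(k\) contacted nodes, capped either by its storage or by the repair bandwidth it could have received.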
Clustered distributed storage models real data centers where intra- and ...