One of the goals in Federated Learning (FL) is to create personalized mo...
Generalization to unseen tasks is an important ability for few-shot lear...
Many approaches to Natural Language Processing (NLP) tasks often treat t...
Large-scale Pre-Trained Language Models (PTLMs) capture knowledge from m...
Standard fine-tuning of large pre-trained language models (PLMs) for dow...
Recent work has shown that language models (LMs) trained with multi-task...
Neural architecture search (NAS) has demonstrated promising results on i...
Fine-tuning large-scale pre-trained language models to downstream tasks ...
Although adapting pre-trained language models with few examples has show...
Pre-trained language models (LMs) have been shown to memorize a substant...
Knowledge distillation (KD) methods compress large models into smaller s...
This paper presents Okapi, a new dataset for Natural Language to executa...
Large-scale pre-trained language models have achieved tremendous success...
Most recent progress in natural language understanding (NLU) has been dr...
Gigantic pre-trained models have become central to natural language proc...
Recent works have focused on compressing pre-trained language models (PL...
We present a new method LiST for efficient fine-tuning of large pre-trai...
We study the problem of multilingual automated reply suggestions (RS) mo...
Despite the universal representational power of pre-trained language mode...
Recent advances in summarization provide models that can generate summar...
Building quality machine learning models for natural language understand...
Existing bias mitigation methods for DNN models primarily work on learni...
While deep and large pre-trained models are the state-of-the-art for var...
Reply suggestion models help users process emails and chats faster. Prev...
The combination of multilingual pre-trained representations and cross-li...
State-of-the-art deep neural networks require large-scale labeled traini...
We study semantic parsing in an interactive setting in which users corre...
Learning to capture text-table alignment is essential for table related ...
Neural sequence labeling is an important technique employed for many Nat...
Recent success of large-scale pre-trained language models crucially hing...
Software engineers spend a substantial amount of time using Web search t...
Email remains one of the most frequently used means of online communicat...
We study the task of semantic parse correction with natural language fee...
Intelligent features in email service applications aim to increase produ...
Multilingual representations embed words from many languages into a sing...
The Internet plays a key role in accomplishing many tasks. For many such tas...
Leveraging weak or noisy supervision for building effective machine lear...
Limited labeled data is becoming the largest bottleneck for supervised l...
Recent advances in pre-training huge models on large amounts of text thr...
Email triage involves going through unhandled emails and deciding what t...
Modern natural language processing and understanding applications have e...
Emails in the workplace are often intentional calls to action for their re...