Soft prompts have been recently proposed as a tool for adapting large fr...
In this paper, we explore the challenging problem of performing a genera...
As pre-trained language models have gotten larger, there has been growin...
Language-agnostic and semantic-language information isolation is an emer...
We provide the first exploration of text-to-text transformers (T5) sente...
Numerical reasoning over text (NRoT) presents unique challenges that are...
This paper presents a novel training method, Conditional Masked Language...
We explore using T5 (Raffel et al., 2019) to directly translate natural...
Neural models that independently project questions and answers into a sh...
We adapt multilingual BERT to produce language-agnostic sentence embeddi...
Retrieval question answering (ReQA) is the task of retrieving a sentence...
Image captioning datasets have proven useful for multimodal representati...
Popular QA benchmarks like SQuAD have driven progress on the task of ide...
We introduce two pre-trained, retrieval-focused multilingual sentence enc...
We explore using multilingual document embeddings for nearest neighbor m...
In this paper, we present an approach to learn multilingual sentence emb...
Neural language models have been shown to achieve an impressive level of...
This paper presents an effective approach for parallel corpus mining usi...
We present a novel approach to learn representations for sentence-level ...
We present models for encoding sentences into embedding vectors that spe...
Semantic Textual Similarity (STS) measures the meaning similarity of sen...