The pre-training and fine-tuning paradigm has contributed to a number of...
Knowledge Distillation (KD) is a prominent neural model compression tech...
Knowledge Distillation (KD) is a model compression algorithm that helps...
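The two snippets above describe Knowledge Distillation only in passing. As a minimal sketch of the standard soft-target formulation (temperature-scaled softmax plus a KL term, following the common Hinton-style recipe; the function names and example logits here are illustrative, not from any of the listed papers):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields a
    # softer distribution that exposes the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Illustrative call: the loss is zero when student and teacher agree,
# and grows as the student's logits drift from the teacher's.
loss = distillation_loss([2.0, 0.5, -1.0], [2.5, 0.3, -1.2])
```

In practice this KL term is usually mixed with the ordinary cross-entropy on hard labels via a weighting coefficient, but the listed abstracts are truncated before any specific objective is stated.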
While recent research on natural language inference has considerably ben...
Task-oriented conversational modeling with unstructured knowledge access...
Disentanglement is a problem in which multiple conversations occur in th...
In this paper, we study the problem of employing pre-trained language mo...
The NOESIS II challenge, Track 2 of the 8th Dialogue System Techn...
Natural language inference (NLI) is among the most challenging tasks in...