Transformers have recently become very popular for sequence-to-sequence ...
Recently, self-supervised pre-training has shown significant improvement...
In this work, we propose a new pooling strategy for language identificat...