Autoregressive language models (LMs) map token sequences to probabilitie...
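As a standard identity (not part of the truncated abstract above), an autoregressive LM factorizes the probability of a sequence token by token, conditioning each token on its prefix:

\[
p_\theta(x_1, \dots, x_T) \;=\; \prod_{t=1}^{T} p_\theta(x_t \mid x_{1:t-1})
\]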
Pre-trained language models and other generative models have revolutioni...
Aligning language models with preferences can be posed as approximating ...
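One common way to make this concrete, sketched here under the assumption that the preferences are encoded as a target distribution p* (not a claim about this particular paper's method), is to fit the model policy \pi_\theta by minimizing a divergence to that target:

\[
\min_\theta \; D\!\left(p^{*} \,\|\, \pi_\theta\right), \qquad \text{e.g. } D = \mathrm{KL}
\]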
The availability of large pre-trained models is changing the landscape o...
Energy-Based Models (EBMs) allow for extremely flexible specifications o...
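For reference, the standard EBM form (assumed here, since the abstract is cut off): an arbitrary energy function E_\theta induces an unnormalized density, with normalization deferred to a typically intractable partition function Z_\theta:

\[
p_\theta(x) \;=\; \frac{e^{-E_\theta(x)}}{Z_\theta}, \qquad Z_\theta = \sum_{x'} e^{-E_\theta(x')}
\]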
Machine learning is shifting towards general-purpose pretrained generati...
The power of natural language generation models has provoked a flurry of...
Neural language models can be successfully trained on source code, leadi...
Researching the conditions for the emergence of life – not necessarily a...
Continual Learning has often been framed as the problem of training a mo...
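Assuming the sentence continues along the usual lines (a model trained on a sequence of tasks, with only the current task's data available), the standard objective asks for parameters that remain good on all tasks seen so far:

\[
\min_\theta \; \sum_{k=1}^{t} \mathbb{E}_{(x,y) \sim D_k}\!\left[\ell\left(f_\theta(x),\, y\right)\right]
\]

where D_k is the data of task k and only D_t is accessible while learning task t.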
Recent work has shown that LSTMs trained on a generic language modeling...
There has been considerable attention devoted to models that learn to jo...
Learning to follow human instructions is a challenging task because whil...
In this paper, we introduce Attentive Guidance (AG), a new mechanism to...
Although much effort has recently been devoted to training high-quality...
Neural networks are very powerful learning systems, but they do not read...
With machine learning successfully applied to new daunting problems almo...
We introduce LAMBADA, a dataset to evaluate the capabilities of computat...