Large language models produce human-like text that drives a growing numbe...
We investigate the optimal model size and number of tokens for training ...
Programming is a powerful and ubiquitous problem-solving tool. Developin...
Large language models (LMs) generate remarkably fluent text and can be ef...
The Apperception Engine is an unsupervised learning system. Given a sequ...
Current reading comprehension models generalise well to in-distribution ...
Innovations in annotation methodology have been a propellant for Reading...
Recent improvements in large-scale language models have driven progress ...
This paper attempts to answer a central question in unsupervised learnin...
Neural networks are part of many contemporary NLP systems, yet their emp...
Many Machine Reading and Natural Language Understanding tasks require re...
Most Reading Comprehension methods limit themselves to queries which can...
We present a novel method for obtaining high-quality, domain-targeted mu...
In statistical relational learning, knowledge graph completion deals wit...
Neural language models predict the next token using a latent representat...
In statistical relational learning, the link prediction problem is key t...
Given an ensemble of randomized regression trees, it is possible to rest...
Embedding-based Knowledge Base Completion models have so far mostly comb...