Large transformer-based language models, e.g. BERT and GPT-3, outperform...
Pre-trained large-scale language models such as BERT have gained a lot o...
Designers increasingly rely on procedural generation for automatic gener...
Recent advances in neural symbolic learning, such as DeepProbLog, extend...
Detecting whether a text is humorous is a hard computational task, as...
Natural language generation provides designers with methods for automati...
Pre-trained language models have been dominating the field of natural la...
Automatically imitating input text is a common task in natural language...
Talented public speakers have thousands of hours of practice. One means ...