The unprecedented performance of large language models (LLMs) necessitat...
Event extraction (EE) is a crucial task that aims to extract events from...
This work examines the presence of modularity in pre-trained Transformer...
The robustness to distribution changes ensures that NLP models can be su...
While there is abundant research on evaluating ChatGPT on natural ...
Despite the recent emergence of video captioning models, how to generate...
How humans infer discrete emotions is a fundamental research question in...
For many real-world applications, the user-generated inputs usually cont...
Transformer-based pre-trained language models have demonstrated superior...
The diverse relationships among real-world events, including coreference...
Recognizing facts is the most fundamental step in making judgments, henc...
We propose a Doppler velocity-based cluster and velocity estimation algo...
Prompt tuning (PT) is a promising parameter-efficient method to utilize ...
How can pre-trained language models (PLMs) learn universal representatio...
Conventional tokenization methods for Chinese pretrained language models...
Event extraction (EE) has considerably benefited from pre-trained langua...
Pre-trained Language Models (PLMs) have proven to be beneficial for vari...
Event detection (ED), which identifies event trigger words and classifie...
Recently, pre-trained language models mostly follow the pre-training-the...
Pre-trained language representation models (PLMs) learn effective langua...
While adversarial games have been well studied in various board games an...