Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution

04/15/2021
by   Ryuto Konno, et al.

The use of pretrained masked language models (MLMs) has drastically improved the performance of zero anaphora resolution (ZAR). We further expand this approach with a novel pretraining task and fine-tuning method for Japanese ZAR. Our pretraining task aims to acquire the anaphoric relational knowledge necessary for ZAR from a large-scale raw corpus. The ZAR model is then fine-tuned in the same manner as the pretraining task. Our experiments show that combining the proposed methods surpasses previous state-of-the-art performance by large margins, and they provide insight into the remaining challenges.
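To make the idea of a "pseudo zero pronoun resolution" pretraining task concrete, below is a minimal sketch of how cloze-style instances could be built from raw text: an argument that corefers with an earlier mention is masked, and the model must recover it from context. The specific heuristic (masking the later occurrence of a repeated noun) and all names in the code are illustrative assumptions, not the authors' exact construction.

```python
# Illustrative sketch: build pseudo zero-pronoun-resolution pretraining
# instances from raw text. The heuristic here (mask the second occurrence of
# a repeated noun and treat the first occurrence as the gold antecedent) is
# an assumption for exposition, not necessarily the paper's actual method.
import re
from dataclasses import dataclass

MASK = "[MASK]"


@dataclass
class PseudoInstance:
    masked_text: str   # context with one argument replaced by [MASK]
    antecedent: str    # the earlier mention the MLM should recover


def build_pseudo_instances(sentences, nouns):
    """Create cloze-style instances: if a noun from `nouns` occurs at least
    twice, mask its later occurrence so the model must resolve it from the
    surrounding context, mimicking a dropped (zero) argument."""
    instances = []
    text = " ".join(sentences)
    for noun in nouns:
        positions = [m.start() for m in re.finditer(re.escape(noun), text)]
        if len(positions) >= 2:
            start = positions[1]
            masked = text[:start] + MASK + text[start + len(noun):]
            instances.append(PseudoInstance(masked_text=masked, antecedent=noun))
    return instances


if __name__ == "__main__":
    sents = ["Ken bought a book.", "Ken read it on the train."]
    for inst in build_pseudo_instances(sents, nouns=["Ken"]):
        print(inst.masked_text, "->", inst.antecedent)
```

Instances of this form can be fed to a masked language model exactly like ordinary MLM training examples, which is consistent with the abstract's point that fine-tuning for ZAR proceeds in the same manner as the pretraining task.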
