Complex Structure Leads to Overfitting: A Structure Regularization Decoding Method for Natural Language Processing

11/25/2017
by   Xu Sun, et al.

Recent systems for structured prediction focus on increasing the level of structural dependencies within the model. However, our study suggests that complex structures entail high overfitting risks. To control this structure-based overfitting, we propose structure regularization decoding (SR decoding), in which the decoding of the complex structure model is regularized by an additionally trained simple structure model. We theoretically analyze the quantitative relation between structural complexity and overfitting risk. The analysis shows that complex structure models are prone to structure-based overfitting. Empirical evaluations show that the proposed method improves the performance of complex structure models by reducing structure-based overfitting. On sequence labeling tasks, the proposed method substantially improves the performance of complex neural network models; the maximum F1 error rate reduction is 36.4% for the third-order model. The proposed method also works for the parsing task, where the maximum UAS improvement is 5.5%, with results competitive with or better than the state-of-the-art.
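To make the idea concrete, here is a minimal sketch of decoding a complex structure model under regularization from a simple model. The interpolation scheme, function names, and weight `alpha` are illustrative assumptions, not the paper's exact formulation: the complex model's scores are blended with a simple (weakly structured) model's scores before Viterbi decoding.

```python
import numpy as np

def sr_decode(complex_emit, complex_trans, simple_emit, alpha=0.3):
    """Viterbi decoding with structure-regularized scores (illustrative sketch).

    complex_emit : (T, K) per-position label scores from the complex model
    complex_trans: (K, K) transition scores from the complex model
    simple_emit  : (T, K) per-position scores from a simple, structure-free model
    alpha        : interpolation weight toward the simple model (assumed scheme)
    """
    T, K = complex_emit.shape
    # Regularize the complex model's scores with the simple model's scores.
    emit = (1.0 - alpha) * complex_emit + alpha * simple_emit

    delta = np.zeros((T, K))          # best score ending in each label
    back = np.zeros((T, K), dtype=int)  # backpointers for path recovery
    delta[0] = emit[0]
    for t in range(1, T):
        # scores[i, j] = best path ending at label i, transitioning to j
        scores = delta[t - 1][:, None] + complex_trans + emit[t][None, :]
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0)

    # Backtrack the highest-scoring label sequence.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]
```

With `alpha = 0` this reduces to ordinary Viterbi decoding of the complex model alone; increasing `alpha` pulls the decoded sequence toward the simple model's preferences.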
