An attentive neural architecture for joint segmentation and parsing and its application to real estate ads

09/27/2017
by   Giannis Bekoulis, et al.

In this paper we develop a relatively simple yet effective neural joint model that performs both segmentation and dependency parsing. While the model arose in the context of a particular application, we believe the general idea can be translated to other settings where (1) entities have to be identified in text and (2) dependency relationships between those entities have to be predicted. The application we focus on here is the recently introduced problem of extracting a structured description of a real estate property, which we name the property tree, from its textual advertisement: converting an ad into a tree structure indicating the buildings, floors, rooms, etc., and how one is part of another. Previous work on this problem used a hand-crafted, feature-based pipeline comprising two separate modules for the two subtasks of the structured prediction problem: (1) identifying the important entities of a property (e.g., rooms) in classifieds and (2) structuring them into a tree format. In this work, we propose a new joint model that tackles the two tasks simultaneously and constructs the property tree by (i) avoiding error propagation between the subtasks and (ii) exploiting the interactions between them. To this end, we perform an extensive comparative study of the pipeline methods and the proposed joint model, reporting an improvement of over three percentage points in the overall edge F_1 score of the property tree. We also consider several attention mechanisms that encourage the model to focus on salient tokens during the construction of the property tree, and experimentally demonstrate the usefulness of attentive neural architectures for the proposed joint model, with a further improvement of two percentage points in edge F_1 score for our application.
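Since the reported gains are measured in edge F_1 over the predicted property tree, the evaluation can be illustrated concretely. The sketch below is a minimal, assumed formulation: it represents a property tree as a set of directed part-of edges between entity mentions and scores a prediction against the gold tree; the entity names and toy ad are illustrative, not drawn from the paper's dataset.

```python
def edge_f1(gold_edges, pred_edges):
    """Precision/recall/F1 over directed (parent, child) part-of edges.

    This is an assumed, standard formulation of edge-level F1, not code
    from the paper itself.
    """
    gold, pred = set(gold_edges), set(pred_edges)
    tp = len(gold & pred)  # edges present in both trees
    if tp == 0:
        return 0.0
    precision = tp / len(pred)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)


# Toy ad: "house with a ground floor containing a kitchen and a living room"
gold = {("house", "floor"), ("floor", "kitchen"), ("floor", "living room")}
# A prediction that attaches "living room" to the wrong parent:
pred = {("house", "floor"), ("floor", "kitchen"), ("house", "living room")}

print(round(edge_f1(gold, pred), 3))  # 2 of 3 edges correct -> 0.667
```

Under this metric, a percentage-point improvement corresponds directly to a larger fraction of correctly attached parent-child pairs in the recovered property trees.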
