Hungarian Layer: Logics Empowered Neural Architecture

12/07/2017
by Han Xiao, et al.

A neural architecture is a purely numeric framework that fits the data as a continuous function. However, because it lacks logic flow (e.g., if, for, while), traditional algorithms (e.g., the Hungarian algorithm, A* search, decision tree algorithms) cannot be embedded into this paradigm, which limits both theory and applications. In this paper, we reformulate the computational graph as a dynamic process guided by logic flow. Within our methodology, traditional algorithms can empower numerical neural networks. Specifically, for the task of sentence matching, we reformulate the problem as task assignment, which is solved by the Hungarian algorithm. First, our model applies a BiLSTM to parse the sentences. Then a Hungarian layer aligns the matching positions. Last, we transform the matching results for softmax regression with another BiLSTM. Extensive experiments show that our model substantially outperforms state-of-the-art baselines.
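The abstract describes a three-step pipeline: a BiLSTM encoder, a Hungarian-algorithm alignment step, and a second BiLSTM feeding a softmax classifier. Below is a minimal sketch of that pipeline, not the authors' implementation: it assumes PyTorch for the BiLSTMs and scipy's linear_sum_assignment as the Hungarian solver, and the class name HungarianMatcher, the dimensions, and the pooling choices are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of: BiLSTM -> Hungarian alignment -> BiLSTM -> softmax.
# The cost matrix, dimensions, and pooling are assumptions for illustration.
import torch
import torch.nn as nn
from scipy.optimize import linear_sum_assignment


class HungarianMatcher(nn.Module):
    """Encode two sentences with a BiLSTM, align token positions with the
    Hungarian algorithm, then re-encode the aligned pairs for classification."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.post_encoder = nn.LSTM(4 * hidden_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, sent_a, sent_b):
        # Step 1: BiLSTM parses both sentences into contextual token states.
        h_a, _ = self.encoder(self.embed(sent_a))   # (B, La, 2H)
        h_b, _ = self.encoder(self.embed(sent_b))   # (B, Lb, 2H)

        # Step 2: "Hungarian layer" -- build a pairwise cost matrix (negative
        # dot-product similarity) and solve the assignment problem per example.
        # The assignment is discrete, so gradients flow through the selected
        # token states but not through the alignment decision itself.
        aligned = []
        for a, b in zip(h_a, h_b):
            cost = -(a @ b.t()).detach().cpu().numpy()             # (La, Lb)
            rows, cols = linear_sum_assignment(cost)               # optimal 1-to-1 alignment
            aligned.append(torch.cat([a[rows], b[cols]], dim=-1))  # (min(La, Lb), 4H)
        aligned = torch.stack(aligned)

        # Step 3: a second BiLSTM summarizes the aligned pairs; the final
        # hidden states feed a softmax (cross-entropy) classifier.
        _, (h_n, _) = self.post_encoder(aligned)
        summary = torch.cat([h_n[-2], h_n[-1]], dim=-1)  # concat both directions
        return self.classifier(summary)                  # logits for softmax


if __name__ == "__main__":
    model = HungarianMatcher(vocab_size=1000)
    a = torch.randint(0, 1000, (2, 7))   # batch of 2 sentences, length 7
    b = torch.randint(0, 1000, (2, 9))   # batch of 2 sentences, length 9
    print(model(a, b).shape)             # torch.Size([2, 2])
```

One design note: because the Hungarian assignment is a hard, non-differentiable decision, this sketch only backpropagates through the selected hidden states; how the paper handles gradient flow through the alignment is part of its contribution and is not reproduced here.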
