Fine-tuning Tree-LSTM for phrase-level sentiment classification on a Polish dependency treebank. Submission to PolEval task 2

11/03/2017
by Tomasz Korbak, et al.

We describe a variant of the Child-Sum Tree-LSTM deep neural network (Tai et al., 2015) fine-tuned for working with dependency trees and morphologically rich languages, using the example of Polish. Fine-tuning included applying a custom regularization technique (zoneout, described by Krueger et al., 2016, and further adapted for Tree-LSTMs) as well as using pre-trained word embeddings enhanced with sub-word information (Bojanowski et al., 2016). The system was implemented in PyTorch and evaluated on a phrase-level sentiment labeling task as part of the PolEval competition.
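The combination of the two ideas above can be sketched in PyTorch. Below is a minimal Child-Sum Tree-LSTM cell following Tai et al. (2015), with a zoneout-style regularizer added for illustration: with some probability, a cell-state unit keeps the summed cell state of the node's children instead of the freshly computed one. The class name, dimensions, and the exact way zoneout is adapted to trees are assumptions for this sketch; the paper's own adaptation may differ in detail.

```python
import torch
import torch.nn as nn


class ChildSumTreeLSTMCell(nn.Module):
    """Child-Sum Tree-LSTM cell (Tai et al., 2015) with a zoneout-style
    regularizer (Krueger et al., 2016). The tree adaptation of zoneout
    shown here (falling back to the summed child cell state) is an
    illustrative assumption, not necessarily the paper's exact variant."""

    def __init__(self, in_dim: int, mem_dim: int, zoneout: float = 0.1):
        super().__init__()
        self.mem_dim = mem_dim
        self.zoneout = zoneout
        # Input, output, and update gates computed jointly from the word
        # embedding and the sum of the children's hidden states.
        self.iou = nn.Linear(in_dim + mem_dim, 3 * mem_dim)
        # One forget gate per child, conditioned on that child's hidden state.
        self.fx = nn.Linear(in_dim, mem_dim)
        self.fh = nn.Linear(mem_dim, mem_dim)

    def forward(self, x, child_c, child_h):
        # x: (in_dim,); child_c, child_h: (num_children, mem_dim).
        # For a leaf node, pass empty (0, mem_dim) tensors.
        h_tilde = child_h.sum(dim=0)  # summed child hidden states
        i, o, u = self.iou(torch.cat([x, h_tilde])).chunk(3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.fx(x) + self.fh(child_h))  # per-child forget gates
        c = i * u + (f * child_c).sum(dim=0)
        if self.training and self.zoneout > 0:
            # Zoneout: randomly retain the children's summed cell state.
            mask = (torch.rand(self.mem_dim) < self.zoneout).float()
            c = mask * child_c.sum(dim=0) + (1.0 - mask) * c
        h = o * torch.tanh(c)
        return c, h
```

A sentiment classifier would apply this cell bottom-up along the dependency tree, feeding each node's (sub-word-enhanced) word embedding as `x` and its children's states as `child_c` / `child_h`, then attach a softmax layer on each node's `h` for phrase-level labels.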


