Differentiable lower bound for expected BLEU score

12/13/2017
by Vlad Zhukov, et al.

In natural language processing, model performance is often measured with a non-differentiable metric such as the BLEU score. To use efficient gradient-based optimization methods, a common workaround is to optimize a surrogate loss function instead. This approach is effective only if optimizing the surrogate loss also improves the target metric; the gap between the two is referred to as the loss-evaluation mismatch. In the present work we propose a method for computing a differentiable lower bound on the expected BLEU score that avoids computationally expensive sampling procedures, such as the one required by the REINFORCE rule from the reinforcement learning (RL) framework.

Related research

- 04/03/2021: A surrogate loss function for optimization of F_β score in binary classification with imbalanced data
  The F_β score is a commonly used measure of classification performance, ...
- 04/14/2017: Optimizing Differentiable Relaxations of Coreference Evaluation Metrics
  Coreference evaluation metrics are hard to optimize directly as they are...
- 10/06/2021: Mismatched No More: Joint Model-Policy Optimization for Model-Based RL
  Many model-based reinforcement learning (RL) methods follow a similar te...
- 05/15/2019: Addressing the Loss-Metric Mismatch with Adaptive Loss Alignment
  In most machine learning training paradigms a fixed, often handcrafted, ...
- 12/10/2020: Automatic Standardization of Colloquial Persian
  The Iranian Persian language has two varieties: standard and colloquial....
- 09/05/2023: Efficient Bayesian Computational Imaging with a Surrogate Score-Based Prior
  We propose a surrogate function for efficient use of score-based priors ...
- 12/03/2021: Differentiable Scripting
  In Computational Science, Engineering and Finance (CSEF) scripts typical...
