Kernelized Bayesian Softmax for Text Generation

11/01/2019
by Ning Miao, et al.

Neural models for text generation require a softmax layer with proper token embeddings during decoding. Most existing approaches adopt a single point embedding for each token. However, a word may have multiple senses depending on its context, some of which can be quite distinct. In this paper, we propose KerBS, a novel approach to learning better embeddings for text generation. KerBS offers two advantages: (a) it employs a Bayesian composition of embeddings for words with multiple senses; (b) it adapts to the semantic variance of words and is robust to rare sentence contexts by imposing learned kernels that capture the closeness of words (senses) in the embedding space. Empirical studies show that KerBS significantly boosts the performance of several text generation tasks.
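To make the idea concrete, here is a minimal NumPy sketch of a multi-sense, kernel-based output layer in the spirit of the abstract. This is an illustrative stand-in, not the paper's method: `kerbs_scores`, the fixed Gaussian-style kernel, and the `sense_of_word` mapping are all assumptions made for the example (KerBS learns its kernels, whereas the bandwidth here is fixed).

```python
import numpy as np

def kerbs_scores(h, sense_emb, sense_of_word, bandwidth=1.0):
    """Toy multi-sense softmax distribution.

    h             : (d,) decoder hidden state
    sense_emb     : (S, d) one embedding row per word sense
    sense_of_word : list mapping each word to its sense indices
    bandwidth     : kernel width; a fixed Gaussian-like kernel is an
                    illustrative stand-in for the paper's learned kernels
    """
    # kernel similarity between h and every sense embedding
    sim = np.exp(-bandwidth * np.sum((sense_emb - h) ** 2, axis=1))
    # a word's score aggregates the kernel mass of all its senses
    word_scores = np.array([sim[idx].sum() for idx in sense_of_word])
    # normalize into a distribution over the vocabulary
    return word_scores / word_scores.sum()

# usage: a 3-word vocabulary where word 0 has two senses
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 2))   # 4 sense embeddings of dimension 2
senses = [[0, 1], [2], [3]]     # word -> indices of its senses
p = kerbs_scores(rng.normal(size=2), emb, senses)
```

Aggregating over a word's senses (rather than using a single point embedding) is what lets distinct meanings of the same token occupy different regions of the embedding space.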


