Bayesian Hierarchical Words Representation Learning

04/12/2020
by Oren Barkan, et al.

This paper presents the Bayesian Hierarchical Words Representation (BHWR) learning algorithm. BHWR combines Variational Bayes word representation learning with semantic taxonomy modeling via hierarchical priors. By propagating relevant information between related words, BHWR exploits the taxonomy to improve the quality of the learned representations. Evaluation on several linguistic datasets demonstrates the advantages of BHWR over suitable alternatives that perform Bayesian modeling with or without semantic priors. Finally, we further show that BHWR produces better representations for rare words.
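To make the hierarchical-prior idea concrete, here is a minimal sketch of the term such a model might optimize. This is an illustrative assumption, not the paper's exact objective: each word gets a diagonal-Gaussian variational posterior, and the taxonomy supplies a parent whose posterior mean centers the word's prior. Minimizing the KL term below is what pulls related words toward each other; the toy taxonomy ("spaniel" under "dog" under "animal") and all variable names are hypothetical.

```python
import numpy as np

def kl_to_parent_prior(mu_w, sigma2_w, mu_parent, tau2):
    """Closed-form KL( N(mu_w, diag(sigma2_w)) || N(mu_parent, tau2 * I) ).

    This KL term (an assumed stand-in for the paper's hierarchical prior)
    penalizes a word's posterior for drifting away from its taxonomy parent.
    """
    d = mu_w.shape[0]
    return 0.5 * (
        np.sum(sigma2_w) / tau2                      # trace term
        + np.sum((mu_parent - mu_w) ** 2) / tau2     # mean-difference term
        - d
        + d * np.log(tau2)                           # log-det of the prior
        - np.sum(np.log(sigma2_w))                   # log-det of the posterior
    )

# Toy taxonomy (hypothetical): "spaniel" -> "dog" -> "animal".
rng = np.random.default_rng(0)
dim, tau2 = 8, 1.0
mu = {w: rng.normal(size=dim) for w in ["animal", "dog", "spaniel"]}
sigma2 = {w: np.full(dim, 0.5) for w in mu}
parent = {"dog": "animal", "spaniel": "dog"}

# Hierarchical part of the (negative) ELBO: one KL term per child word.
kl_total = sum(
    kl_to_parent_prior(mu[w], sigma2[w], mu[p], tau2) for w, p in parent.items()
)
print(float(kl_total))
```

In a full model this regularizer would be added to a data-likelihood term (e.g., a skip-gram-style objective) and minimized jointly, so that rare words with little corpus evidence inherit most of their representation from their taxonomy parents.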


Related research:

- 06/28/2023: Representation Learning via Variational Bayesian Networks
  We present Variational Bayesian Network (VBN) - a novel Bayesian entity ...
- 03/21/2017: Nonparametric Variational Auto-encoders for Hierarchical Representation Learning
  The recently developed variational autoencoders (VAEs) have proved to be...
- 05/18/2018: Bayesian model reduction
  This paper reviews recent developments in statistical structure learning...
- 11/19/2015: Joint Word Representation Learning using a Corpus and a Semantic Lexicon
  Methods for learning word representations using large text corpora have ...
- 01/31/2019: Learning Taxonomies of Concepts and not Words using Contextualized Word Representations: A Position Paper
  Taxonomies are semantic hierarchies of concepts. One limitation of curre...
- 06/12/2019: Representation Learning for Words and Entities
  This thesis presents new methods for unsupervised learning of distribute...
- 06/08/2012: Fuzzy Knowledge Representation, Learning and Optimization with Bayesian Analysis in Fuzzy Semantic Networks
  This paper presents a method of optimization, based on both Bayesian Ana...
