Global Hierarchical Neural Networks using Hierarchical Softmax

08/02/2023
by Jetze Schuurmans, et al.

This paper presents a framework in which hierarchical softmax is used to create a global hierarchical classifier. The approach is applicable to any classification task with a natural hierarchy among the classes. We show empirical results on four text classification datasets. On all four datasets, hierarchical softmax improved on the regular softmax of a flat classifier in terms of macro-F1 and macro-recall; on three of the four, it also achieved higher micro-accuracy and macro-precision.
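The abstract compresses the key mechanism, so a brief illustration may help: in a hierarchical softmax, the probability of a leaf class factorizes along its path in the hierarchy, e.g. P(leaf | x) = P(parent | x) * P(child | parent, x) for a two-level tree, and training on this joint probability yields a single global classifier rather than a cascade of independently trained local ones. Below is a minimal PyTorch sketch under that assumption; the hierarchy shape, layer sizes, and random data are illustrative placeholders, not the paper's actual architecture or datasets.

```python
# Minimal two-level hierarchical softmax sketch (illustrative, not the
# paper's exact model). Leaf probability factorizes as
# P(leaf | x) = P(parent | x) * P(child | parent, x).
import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalSoftmax(nn.Module):
    def __init__(self, in_dim, children_per_parent):
        super().__init__()
        # One logit per parent node at the top level.
        self.parent_head = nn.Linear(in_dim, len(children_per_parent))
        # One softmax head per parent, over that parent's children.
        self.child_heads = nn.ModuleList(
            nn.Linear(in_dim, n) for n in children_per_parent
        )

    def forward(self, x):
        # log P(parent | x)
        log_parent = F.log_softmax(self.parent_head(x), dim=-1)
        # log P(child | parent, x), concatenated across all parents
        log_child = torch.cat(
            [F.log_softmax(h(x), dim=-1) for h in self.child_heads], dim=-1
        )
        # Broadcast each parent's log-prob over its children so the sum
        # gives the joint log P(leaf | x) for every leaf class.
        sizes = [h.out_features for h in self.child_heads]
        log_parent_expanded = torch.cat(
            [log_parent[:, i:i + 1].expand(-1, n) for i, n in enumerate(sizes)],
            dim=-1,
        )
        return log_parent_expanded + log_child

# Toy usage: 3 parent groups with 2, 3, and 4 leaves (9 classes total).
model = HierarchicalSoftmax(in_dim=16, children_per_parent=[2, 3, 4])
x = torch.randn(8, 16)
log_probs = model(x)   # shape (8, 9); rows sum to 1 in probability space
loss = F.nll_loss(log_probs, torch.randint(0, 9, (8,)))
loss.backward()
```

Because the loss is the joint negative log-likelihood over leaves, gradients flow through the parent and child heads simultaneously, which is what makes the classifier global rather than a set of locally trained per-node models.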

Related research

12/13/2018
Effectiveness of Hierarchical Softmax in Large Scale Classification Tasks
Typically, Softmax is used in the final layer of a neural network to get...

12/21/2017
DropMax: Adaptive Stochastic Softmax
We propose DropMax, a stochastic version of softmax classifier which at ...

08/18/2019
TwistBytes -- Hierarchical Classification at GermEval 2019: walking the fine line (of recall and precision)
We present here our approach to the GermEval 2019 Task 1 - Shared Task o...

06/28/2013
Evaluation Measures for Hierarchical Classification: a unified view and novel approaches
Hierarchical classification addresses the problem of classifying items i...

09/17/2021
Hierarchy-Aware T5 with Path-Adaptive Mask Mechanism for Hierarchical Text Classification
Hierarchical Text Classification (HTC), which aims to predict text label...

10/25/2022
Revisiting Softmax for Uncertainty Approximation in Text Classification
Uncertainty approximation in text classification is an important area wi...

01/26/2019
Money on the Table: Statistical information ignored by Softmax can improve classifier accuracy
Softmax is a standard final layer used in Neural Nets (NNs) to summarize...
