Exploiting Domain Knowledge via Grouped Weight Sharing with Application to Text Categorization

02/08/2017
by Ye Zhang, et al.

A fundamental advantage of neural models for NLP is their ability to learn representations from scratch. In practice, however, this often means ignoring existing external linguistic resources such as WordNet or domain-specific ontologies like the Unified Medical Language System (UMLS). We propose a general, novel method for exploiting such resources via weight sharing. Prior work on weight sharing in neural networks has considered it largely as a means of model compression; in contrast, we treat weight sharing as a flexible mechanism for incorporating prior knowledge into neural models. We show that this approach consistently yields improved performance on classification tasks compared to baseline strategies that do not exploit weight sharing.
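To make the mechanism concrete, here is a minimal PyTorch sketch of grouped weight sharing. It is not the authors' exact architecture: the module name GroupSharedEmbedding, the concatenation of private and group vectors, and the toy group assignments are illustrative assumptions. Only the core idea, a single weight vector shared by all words assigned to the same externally defined group (e.g., a WordNet synset or UMLS concept), comes from the abstract.

import torch
import torch.nn as nn

class GroupSharedEmbedding(nn.Module):
    """Hypothetical sketch of grouped weight sharing: each word has a
    private embedding, and all words mapped to the same external-resource
    group additionally share one group embedding that receives gradient
    updates from every member of the group."""

    def __init__(self, vocab_size, n_groups, word_dim, group_dim, word_to_group):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.group_emb = nn.Embedding(n_groups, group_dim)  # the shared weights
        # Fixed lookup: word id -> group id, built from the external resource.
        self.register_buffer("word_to_group", word_to_group)

    def forward(self, token_ids):
        w = self.word_emb(token_ids)                       # (batch, seq, word_dim)
        g = self.group_emb(self.word_to_group[token_ids])  # (batch, seq, group_dim)
        # Concatenate private and shared representations; a downstream
        # text classifier consumes the result.
        return torch.cat([w, g], dim=-1)

# Toy usage: 10-word vocabulary partitioned into 3 lexicon-derived groups.
word_to_group = torch.tensor([0, 0, 1, 2, 1, 0, 2, 2, 1, 0])
emb = GroupSharedEmbedding(10, 3, word_dim=50, group_dim=10,
                           word_to_group=word_to_group)
print(emb(torch.tensor([[1, 4, 7]])).shape)  # torch.Size([1, 3, 60])

Because every word in a group indexes the same row of group_emb, an update triggered by any one member moves the shared vector for all of them; this is how prior knowledge encoded in the resource constrains the learned representations.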


Related research

08/09/2016 · Neural Generation of Regular Expressions from Natural Language with Minimal Domain Knowledge
This paper explores the task of translating natural language queries int...

10/16/2018 · The Deep Weight Prior: Modeling a prior distribution for CNNs using generative models
Bayesian inference is known to provide a general framework for incorpora...

09/23/2019 · Learning in the Machine: To Share or Not to Share?
Weight-sharing is one of the pillars behind Convolutional Neural Network...

06/01/2019 · Cooperative neural networks (CoNN): Exploiting prior independence structure for improved classification
We propose a new approach, called cooperative neural networks (CoNN), wh...

06/08/2023 · Mixture-of-Supernets: Improving Weight-Sharing Supernet Training with Architecture-Routed Mixture-of-Experts
Weight-sharing supernet has become a vital component for performance est...

07/13/2020 · Lossless Compression of Structured Convolutional Models via Lifting
Lifting is an efficient technique to scale up graphical models generaliz...

03/29/2023 · Polarity is all you need to learn and transfer faster
Natural intelligences (NIs) thrive in a dynamic world - they learn quick...
