Neural Comprehension: Language Models with Compiled Neural Networks

04/04/2023
by Yixuan Weng, et al.

Language models have achieved impressive results in natural language processing tasks, but their ability to perform symbolic and arithmetic operations remains limited, because they learn the underlying rules only implicitly from data. We explore how to incorporate compiled neural networks (CoNNs), whose weights are specially designed to execute rules exactly, into the architecture of gradient-trained language models so that the combined model attains full rule comprehension. The incorporation of compiled neural networks offers a promising direction for improving the performance of language models on compound tasks, particularly in areas that require a deeper comprehension of abstract rules beyond recognizing patterns in training data. Our method, which we call "Neural Comprehension", helps language models achieve absolute accuracy in symbolic operations, thereby enhancing their ability for rule reasoning, symbolic reasoning, and arithmetic reasoning. Our code is publicly available at: <https://github.com/WENGSYX/Neural-Comprehension>.
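To make the idea of a compiled neural network concrete, below is a minimal, hypothetical sketch: a two-layer MLP whose weights are hand-set rather than learned, so that it computes the parity of a bit string exactly. This is not the authors' implementation; the parity task, the function names, and the overall wiring are illustrative assumptions only, meant to show how fixed weights can encode a symbolic rule that a gradient-trained language model alone does not execute reliably.

```python
# Minimal sketch (not the authors' code): a "compiled" neural network whose
# weights are hand-set to compute the parity of a bit string exactly, in the
# spirit of CoNNs that encode a symbolic rule in fixed weights rather than
# learning it from data.
import numpy as np

def step(x):
    """Hard threshold activation, so the rule is executed exactly."""
    return (x > 0).astype(float)

def compiled_parity_weights(n_bits):
    """Hand-designed (not trained) weights for a 2-layer MLP computing parity.

    Hidden unit k fires iff at least k input bits are 1; the output layer
    combines the hidden units with alternating signs, so the network
    outputs sum(x) mod 2 for every input.
    """
    W1 = np.ones((n_bits, n_bits))                        # every hidden unit sums all bits
    b1 = -(np.arange(1, n_bits + 1) - 0.5)                # thresholds 0.5, 1.5, 2.5, ...
    w2 = np.array([(-1.0) ** k for k in range(n_bits)])   # +1, -1, +1, ...
    b2 = -0.5
    return W1, b1, w2, b2

def conn_parity(bits):
    """Run the compiled network on a list/array of 0-1 bits."""
    x = np.asarray(bits, dtype=float)
    W1, b1, w2, b2 = compiled_parity_weights(len(x))
    h = step(W1 @ x + b1)
    return int(step(w2 @ h + b2))

# The compiled network agrees with the symbolic rule on every input,
# which is the property a purely gradient-trained model does not guarantee.
for bits in ([1, 0, 1, 1], [0, 0, 0, 0], [1, 1, 1, 1, 1]):
    assert conn_parity(bits) == sum(bits) % 2
print("compiled parity network matches the exact rule")
```

In the framework described by the abstract, a module like this would sit alongside the gradient-trained language model, so that symbolic sub-steps of a compound task are delegated to weights that encode the rule exactly while the language model handles the remaining natural-language reasoning.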


