Large Content And Behavior Models To Understand, Simulate, And Optimize Content And Behavior

09/01/2023
by   Ashmit Khandelwal, et al.

Shannon, in his seminal paper introducing information theory, divided communication into three levels: technical, semantic, and effectiveness. While the technical level is concerned with accurate reconstruction of transmitted symbols, the semantic and effectiveness levels deal with the inferred meaning and its effect on the receiver. Thanks to telecommunications research, work on the first-level problem has produced great advances like the internet. Large Language Models (LLMs) make some progress towards the second level, but the third remains largely untouched. The third level deals with predicting and optimizing communication for desired receiver behavior. LLMs, despite generalizing across a wide range of tasks, are unable to solve this problem. One reason for the underperformance could be the lack of "behavior tokens" in LLMs' training corpora. Behavior tokens capture receiver behavior in response to a communication, such as shares, likes, clicks, purchases, and retweets. During data preprocessing for LLM training, behavior tokens are often removed from the corpora as noise. Therefore, in this paper, we make initial progress towards reintroducing behavior tokens in LLM training. The trained models, besides performing comparably to LLMs on content-understanding tasks, show generalization capabilities on behavior simulation, content simulation, behavior understanding, and behavior domain adaptation. Using a wide range of tasks on two corpora, we demonstrate all of these capabilities. We call these models Large Content and Behavior Models (LCBMs). Further, to spur more research on LCBMs, we release our new Content Behavior Corpus (CBC), a repository containing communicator, message, and corresponding receiver-behavior data.
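To make the idea of behavior tokens concrete, the sketch below shows one way a (communicator, message, behavior) record might be serialized so that receiver behavior appears as text alongside the content during training. The function name, field labels, and record format are illustrative assumptions, not the paper's exact template.

```python
# Minimal sketch (hypothetical format): verbalizing receiver behavior as
# "behavior tokens" in the same text stream as the content itself.

def build_lcbm_sample(communicator: str, message: str, behavior: dict) -> str:
    """Serialize one (communicator, message, behavior) record into a single
    training string. Labels and phrasing here are assumptions for
    illustration, not the paper's actual preprocessing pipeline."""
    behavior_str = ", ".join(f"{k}: {v}" for k, v in behavior.items())
    return (
        f"Communicator: {communicator}\n"
        f"Message: {message}\n"
        f"Receiver behavior: {behavior_str}"
    )

# Example record: behavior signals (likes, shares, clicks) are kept as
# tokens rather than stripped out as noise.
sample = build_lcbm_sample(
    communicator="BrandX",
    message="Introducing our new solar-powered charger.",
    behavior={"likes": 1200, "shares": 85, "clicks": 430},
)
print(sample)
```

Keeping such behavior fields in the training text, rather than filtering them out during preprocessing, is the core idea the abstract describes for enabling behavior simulation and optimization.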

