In-context Example Selection with Influences

02/21/2023
by   Tai Nguyen, et al.

In-context learning (ICL) is a powerful paradigm that has emerged from large language models (LLMs). Despite its promise, ICL performance is known to be highly sensitive to the choice of input examples. In this work, we use in-context influences to analyze few-shot ICL performance directly from the in-context examples. Our proposed influence-based example selection method outperforms most baselines when evaluated on 10 SuperGLUE tasks and scales stably with increasing k-shot. The analysis finds up to a 22.2% performance gap between using the most positively and most negatively influential examples. In a case study, we apply our influence-based framework to quantify the phenomenon of recency bias in example ordering for few-shot ICL.
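The core idea of influence-based selection can be sketched as follows: estimate each candidate example's influence as the difference in mean downstream accuracy between prompts that include it and prompts that exclude it, averaged over many random k-shot subsets, then keep the most positively influential examples. This is a minimal illustrative sketch, not the paper's implementation; `eval_fn` is a hypothetical callback that builds a prompt from the chosen examples, queries the LLM on a dev set, and returns accuracy.

```python
import random
from collections import defaultdict

def estimate_influences(examples, eval_fn, k=4, n_trials=200, seed=0):
    """Estimate each example's in-context influence as the difference in
    mean accuracy between random k-shot prompts that include it and
    prompts that exclude it."""
    rng = random.Random(seed)
    included = defaultdict(list)   # example index -> accuracies when present
    excluded = defaultdict(list)   # example index -> accuracies when absent
    for _ in range(n_trials):
        subset = rng.sample(range(len(examples)), k)
        acc = eval_fn([examples[i] for i in subset])  # LLM accuracy on a dev set
        present = set(subset)
        for i in range(len(examples)):
            (included if i in present else excluded)[i].append(acc)
    def mean(xs):
        return sum(xs) / len(xs) if xs else 0.0
    # Positive influence: accuracy is higher, on average, when the example is present.
    return {i: mean(included[i]) - mean(excluded[i]) for i in range(len(examples))}

def select_top_k(examples, influences, k=4):
    """Keep the k most positively influential examples for the final prompt."""
    order = sorted(influences, key=influences.get, reverse=True)
    return [examples[i] for i in order[:k]]
```

With a real `eval_fn`, the gap between prompts built from the top-influence examples and those built from the bottom-influence examples is exactly the quantity the analysis above measures.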


