Leveraging pre-trained language models for conversational information seeking from text

03/31/2022
by   Patrizio Bellan, et al.

Recent advances in Natural Language Processing, and in particular the construction of very large pre-trained language representation models, are opening up new perspectives on the construction of conversational information seeking (CIS) systems. In this paper we investigate the use of in-context learning and pre-trained language representation models to address the problem of information extraction from process description documents, in an incremental question-and-answering fashion. In particular, we investigate the use of the native GPT-3 (Generative Pre-trained Transformer 3) model, together with two in-context learning customizations that inject conceptual definitions and a limited number of samples in a few-shot learning fashion. The results highlight the potential of the approach and the usefulness of the in-context learning customizations, which can substantially help address the "training data challenge" of deep-learning-based NLP techniques in the BPM (Business Process Management) field. They also highlight the challenge posed by control flow relations, for which further training needs to be devised.
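The in-context learning customizations described above can be illustrated with a minimal sketch: a prompt that combines a conceptual definition with a handful of annotated examples (few-shot learning) before posing a question about a new process description. The definition, the example texts, and the helper function below are all hypothetical illustrations, not the paper's actual prompts.

```python
# Hypothetical sketch of a few-shot, in-context learning prompt for a
# GPT-style model: a conceptual definition, a few worked examples, and
# then the new process description to analyze.

DEFINITION = "An activity is a unit of work performed within a business process."

# Invented illustrative examples pairing a process description with the
# activities extracted from it.
FEW_SHOT_EXAMPLES = [
    ("The clerk receives the order and forwards it to the warehouse.",
     "receives the order; forwards it to the warehouse"),
    ("The manager reviews the request and then approves or rejects it.",
     "reviews the request; approves or rejects it"),
]

def build_prompt(description: str, question: str) -> str:
    """Assemble an in-context learning prompt for a GPT-style model."""
    parts = [f"Definition: {DEFINITION}", ""]
    for text, answer in FEW_SHOT_EXAMPLES:
        parts.append(f"Process: {text}")
        parts.append(f"Q: {question}")
        parts.append(f"A: {answer}")
        parts.append("")
    parts.append(f"Process: {description}")
    parts.append(f"Q: {question}")
    parts.append("A:")
    return "\n".join(parts)

prompt = build_prompt(
    "The operator checks the invoice and archives it.",
    "Which activities are performed?",
)
print(prompt)
```

The resulting string would be sent as a single completion request; the model is expected to continue after the final "A:", mimicking the example answers. The "native" GPT-3 baseline mentioned in the abstract corresponds to sending only the final question without the definition or examples.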


