Data-Efficient Learning of Natural Language to Linear Temporal Logic Translators for Robot Task Specification

03/09/2023
by Jiayi Pan, et al.

To make robots accessible to a broad audience, it is critical to endow them with the ability to take universal modes of communication, like commands given in natural language, and extract a concrete desired task specification, defined using a formal language like linear temporal logic (LTL). In this paper, we present a learning-based approach for translating from natural language commands to LTL specifications with very limited human-labeled training data. This is in stark contrast to existing natural-language-to-LTL translators, which require large human-labeled datasets, often in the form of labeled pairs of LTL formulas and natural language commands, to train the translator. To reduce reliance on human data, our approach generates a large synthetic training dataset through algorithmic generation of LTL formulas, conversion to structured English, and then exploiting the paraphrasing capabilities of modern large language models (LLMs) to synthesize a diverse corpus of natural language commands corresponding to the LTL formulas. We use this generated data to fine-tune an LLM and apply a constrained decoding procedure at inference time to ensure the returned LTL formula is syntactically correct. We evaluate our approach on three existing LTL/natural language datasets and show that we can translate natural language commands at 75% accuracy with far less human data (≤12 annotations). Moreover, when training on large human-annotated datasets, our method achieves higher test accuracy (95% on average) than prior work. Finally, we show the translated formulas can be used to plan long-horizon, multi-stage tasks on a 12D quadrotor.
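To make the data-generation step described above concrete, here is a minimal sketch of one way to algorithmically sample LTL formulas from templates and render each into structured English before LLM paraphrasing. The proposition names, templates, and function names are illustrative assumptions, not the paper's actual implementation.

```python
import random

# Hypothetical atomic propositions (e.g., regions a robot can visit).
PROPS = ["red_room", "blue_room", "yellow_room", "green_room"]

# Illustrative LTL templates paired with structured-English renderings.
# F = eventually, G = always, U = until, & = and, ! = not.
TEMPLATES = [
    ("F {a}",             "eventually reach {a}"),
    ("F ( {a} & F {b} )", "eventually reach {a}, and after that reach {b}"),
    ("G ! {a}",           "always avoid {a}"),
    ("! {a} U {b}",       "do not enter {a} until you have reached {b}"),
]

def sample_pair(rng: random.Random) -> tuple[str, str]:
    """Sample one (LTL formula, structured English) pair for the synthetic corpus."""
    ltl_tmpl, eng_tmpl = rng.choice(TEMPLATES)
    a, b = rng.sample(PROPS, 2)
    return ltl_tmpl.format(a=a, b=b), eng_tmpl.format(a=a, b=b)

rng = random.Random(0)
formula, structured_english = sample_pair(rng)

# The structured English would then be paraphrased into diverse natural
# language by prompting an LLM; the prompt wording here is an assumption.
paraphrase_prompt = (
    "Rephrase the following robot command in everyday language:\n"
    f"{structured_english}"
)
print(formula, "->", structured_english)
```

At inference time, the paper constrains decoding so that only syntactically valid LTL can be returned. The snippet below is not that procedure; it is a simpler, hypothetical stand-in that checks well-formedness of a candidate output (e.g., to filter beam-search candidates) with a recursive-descent pass over prefix-notation LTL.

```python
UNARY = {"F", "G", "X", "!"}    # eventually, always, next, not
BINARY = {"&", "|", "U", "->"}  # and, or, until, implies

def is_valid_ltl(tokens: list[str], props: set[str]) -> bool:
    """True if the token sequence is exactly one well-formed prefix-notation formula."""
    def parse(i: int) -> int:
        # Return the index just past the sub-formula starting at i, or -1 on failure.
        if i >= len(tokens):
            return -1
        tok = tokens[i]
        if tok in UNARY:
            return parse(i + 1)
        if tok in BINARY:
            j = parse(i + 1)
            return parse(j) if j != -1 else -1
        return i + 1 if tok in props else -1

    return parse(0) == len(tokens)

props = {"red_room", "blue_room"}
print(is_valid_ltl("F & red_room F blue_room".split(), props))  # True
print(is_valid_ltl("F & red_room".split(), props))              # False: missing operand
```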
