ComputeGPT: A computational chat model for numerical problems

05/08/2023
by Ryan Hardesty Lewis, et al.

Language models are not accurate on numerical problems: their architecture permits nothing more than a probabilistic prediction of the next word. This paper introduces ComputeGPT, an approach to building a chat model that answers computational problems by running on-demand code. ComputeGPT converts each question into relevant code, executes the code, and returns the computed answer as part of the chat. We combine this approach with a local, browser-based Python interpreter and fine-tuned prompts to achieve state-of-the-art efficiency on numerical problems, while providing a suitable front-end and a safe environment in which the code is executed.
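As a rough illustration of the question-to-code-to-answer loop described in the abstract, the sketch below stubs out the code-generation step and evaluates the generated snippet to produce the reply. The `generate_code` stand-in, the prompt handling, and the use of a plain `exec()` are assumptions for demonstration only; ComputeGPT itself relies on fine-tuned prompts and a local, browser-based Python interpreter for safe execution.

```python
# Minimal sketch of the question -> code -> answer pipeline.
# generate_code() is an illustrative stand-in, not ComputeGPT's model call.

def generate_code(question: str) -> str:
    """Stand-in for the language-model step that turns a question into code.
    A real system would prompt the model to reply with Python code only."""
    if question == "What is 37 * 49?":
        return "result = 37 * 49"  # hard-coded demo output, not a model response
    raise NotImplementedError("language-model call goes here")


def run_code(code: str):
    """Execute the generated code and return the value bound to `result`.
    ComputeGPT performs this step in a browser-based Python interpreter for
    safety; a plain exec() is used here only as a placeholder."""
    namespace = {}
    exec(code, namespace)
    return namespace.get("result")


def answer(question: str) -> str:
    code = generate_code(question)
    value = run_code(code)
    return f"{question} The computed answer is {value}."


if __name__ == "__main__":
    print(answer("What is 37 * 49?"))  # -> "What is 37 * 49? The computed answer is 1813."
```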
