Neuro-symbolic Meta Reinforcement Learning for Trading

01/15/2023
by S. I. Harini, et al.

We model short-duration (e.g., intraday) trading in financial markets as a sequential decision-making problem under uncertainty, with the added complication of continual concept drift. We therefore employ meta reinforcement learning via the RL² algorithm. It is also known that human traders often rely on frequently occurring symbolic patterns in price series. We employ logical program induction to discover symbolic patterns that occur both frequently and recently, and explore whether using such features improves the performance of our meta reinforcement learning algorithm. We report experiments on real data indicating that meta-RL outperforms vanilla RL and also benefits from the learned symbolic features.
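The sketch below illustrates the general idea, not the paper's actual implementation: a hand-written stand-in for an induced symbolic pattern (here, a hypothetical "three consecutive higher closes" predicate) is appended to the market observation, together with the previous action and reward as in RL²-style recurrent meta-RL. All function names and feature choices are illustrative assumptions.

```python
import numpy as np

# Hypothetical symbolic pattern: "three consecutive higher closes".
# In the paper such patterns are discovered by logical program induction;
# here a fixed predicate stands in for an induced rule.
def three_higher_closes(prices: np.ndarray) -> float:
    """Return 1.0 if the last three closes are strictly increasing, else 0.0."""
    if len(prices) < 3:
        return 0.0
    return float(prices[-1] > prices[-2] > prices[-3])

def build_observation(prices: np.ndarray, prev_action: int, prev_reward: float) -> np.ndarray:
    """RL^2-style observation: recent log returns, plus a symbolic feature,
    plus the previous action and reward, so a recurrent policy can adapt
    within a trial without gradient updates."""
    returns = np.diff(np.log(prices[-6:]))              # last 5 log returns
    symbolic = np.array([three_higher_closes(prices)])  # induced-pattern feature
    return np.concatenate([returns, symbolic, [prev_action, prev_reward]])

# Usage: feed these observations to a recurrent policy (e.g. an LSTM) whose
# hidden state persists across episodes of a trial and is reset between trials.
prices = np.array([100.0, 100.5, 101.2, 100.9, 101.4, 102.1])
obs = build_observation(prices, prev_action=1, prev_reward=0.003)
print(obs.shape)  # (8,)
```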

