Optimal Sensing via Multi-armed Bandit Relaxations in Mixed Observability Domains

03/15/2016
by Mikko Lauri, et al.

Sequential decision making under uncertainty is studied in a mixed observability domain. The goal is to maximize the information obtained about a partially observable stochastic process, subject to constraints imposed by a fully observable internal state. An upper bound on the optimal value function is derived by relaxing the constraints. We identify conditions under which the relaxed problem is a multi-armed bandit whose optimal policy is easily computable. The upper bound is applied to prune the search space in the original problem, and the effect on solution quality is assessed via simulation experiments. Empirical results show effective pruning of the search space in a target monitoring domain.
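The pruning idea described above can be illustrated with a generic branch-and-bound sketch: if an optimistic upper bound on the reward remaining in a subtree (here playing the role of the paper's bandit relaxation) cannot beat the best value already found, the subtree is skipped. The function names, the toy problem, and the bound below are illustrative assumptions, not the authors' implementation.

```python
from typing import Callable, Hashable, List, Tuple


def branch_and_bound(
    state: Hashable,
    depth: int,
    actions: List[Hashable],
    step: Callable[[Hashable, Hashable], Tuple[Hashable, float]],
    upper_bound: Callable[[Hashable, int], float],
) -> float:
    """Best cumulative reward over all depth-`depth` action sequences,
    pruning subtrees whose relaxed upper bound cannot improve on the
    incumbent (best value found so far at this node)."""
    if depth == 0:
        return 0.0
    best = float("-inf")
    for a in actions:
        nxt, r = step(state, a)
        # Prune: even an optimistic bound on the remaining reward
        # cannot beat the incumbent, so skip expanding this subtree.
        if best > float("-inf") and r + upper_bound(nxt, depth - 1) <= best:
            continue
        best = max(best, r + branch_and_bound(nxt, depth - 1, actions, step, upper_bound))
    return best


# Toy example (assumed for illustration): the state is an integer,
# the reward equals the action taken, and the bound assumes the best
# per-step reward (2) is collected at every remaining step.
acts = [1, 2]
move = lambda s, a: (s + a, float(a))
ub = lambda s, d: 2.0 * d
print(branch_and_bound(0, 3, acts, move, ub))  # 6.0
```

The tighter the relaxed bound, the more subtrees are cut; with a loose bound the search degenerates to exhaustive enumeration, which is the trade-off the paper's simulation experiments evaluate.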


Related research:

02/01/2019: Multi-Armed Bandit Problem and Batch UCB Rule
We obtain the upper bound of the loss function for a strategy in the mul...

04/18/2019: Sequential Decision Making under Uncertainty with Dynamic Resource Constraints
This paper studies a class of constrained restless multi-armed bandits. ...

04/18/2021: Monte Carlo Elites: Quality-Diversity Selection as a Multi-Armed Bandit Problem
A core challenge of evolutionary search is the need to balance between e...

12/01/2017: Novel Exploration Techniques (NETs) for Malaria Policy Interventions
The task of decision-making under uncertainty is daunting, especially fo...

12/06/2022: An Index Policy for Minimizing the Uncertainty-of-Information of Markov Sources
This paper focuses on the information freshness of finite-state Markov s...

06/22/2023: Pure Exploration in Bandits with Linear Constraints
We address the problem of identifying the optimal policy with a fixed co...

08/21/2019: Exploring Offline Policy Evaluation for the Continuous-Armed Bandit Problem
The (contextual) multi-armed bandit problem (MAB) provides a formalizati...
