Multi-armed bandit approach to password guessing

06/29/2020
by Hazel Murray, et al.

The multi-armed bandit is a mathematical formulation of the problem a gambler faces when confronted with a number of different slot machines (one-armed bandits). The gambler wants to explore different machines to discover which offers the best rewards, but simultaneously wants to exploit the most profitable machine found so far. A password guesser faces a similar dilemma. They have leaked password sets, dictionaries of words, and demographic information about the users, but they don't know which dictionary will reap the best rewards. In this paper we provide a framework for applying the multi-armed bandit problem to password guessing and use some examples to show that it can be effective.
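To illustrate the dilemma the abstract describes, the sketch below treats each candidate dictionary as a bandit arm and uses Thompson sampling to balance exploration and exploitation. This is a minimal illustrative example, not the paper's actual method: the dictionary names and per-dictionary crack rates (`success_prob`) are hypothetical values used only to simulate guessing outcomes.

```python
import random

def bandit_guess(dictionaries, success_prob, budget, seed=0):
    """Thompson-sampling bandit over password dictionaries.

    Each arm is a dictionary; pulling an arm means drawing one guess
    from it, with reward 1 if the guess cracks a password. The
    `success_prob` list is a hypothetical per-dictionary crack rate
    used purely to simulate rewards.
    """
    rng = random.Random(seed)
    n = len(dictionaries)
    # Beta(1, 1) prior per arm: alpha-1 = successes, beta-1 = failures.
    alpha = [1] * n
    beta = [1] * n
    cracked = 0
    for _ in range(budget):
        # Sample a plausible crack rate for each arm from its posterior
        # and guess from the arm whose sample is highest. Uncertain arms
        # occasionally win the draw, which gives us exploration for free.
        samples = [rng.betavariate(alpha[i], beta[i]) for i in range(n)]
        arm = max(range(n), key=lambda i: samples[i])
        reward = 1 if rng.random() < success_prob[arm] else 0
        cracked += reward
        alpha[arm] += reward
        beta[arm] += 1 - reward
    return cracked, alpha

# Example: three hypothetical guessing sources with different crack rates.
dicts = ["leaked-passwords", "english-words", "demographic-info"]
cracked, alpha = bandit_guess(dicts, [0.30, 0.05, 0.10], budget=2000)
```

After a modest budget the posterior for the most productive dictionary concentrates, so most remaining guesses are spent on it while the weaker sources are sampled only rarely.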


