Decomposable Probability-of-Success Metrics in Algorithmic Search

01/03/2020
by Tyler Sam, et al.

Previous studies have used a specific success metric within an algorithmic search framework to prove impossibility results for machine learning. However, this choice of metric prevents those results from being applied to other forms of machine learning, such as transfer learning. To address this issue, we define decomposable metrics, a category of success metrics for search problems consisting of those that can be expressed as a linear operation on a probability distribution. Using an arbitrary decomposable metric to measure the success of a search, we prove theorems that bound success in various ways, generalizing several existing results in the literature.
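To make the notion concrete, here is a minimal sketch of the idea, with notation chosen for illustration rather than quoted from the paper: in the algorithmic search framework, a search problem has a finite search space \(\Omega\), a target set \(t \subseteq \Omega\) (viewed as a binary vector indexed by \(\Omega\)), and an information resource \(f\). A probability-of-success metric \(\phi\) is decomposable when there exists a probability distribution \(\overline{P}_{\phi,f}\) over \(\Omega\), not depending on the target, such that

\[
\phi(t, f) \;=\; t^{\top}\, \overline{P}_{\phi,f} \;=\; \sum_{\omega \in t} \overline{P}_{\phi,f}(\omega),
\]

i.e., success under \(\phi\) is the probability mass that a single induced distribution assigns to the target set. Metrics such as the expected per-query probability of success used in earlier work appear to fit this form, which is what allows the earlier bounds to be recovered as special cases.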
