
Fragile Complexity of Comparison-Based Algorithms

by   Peyman Afshani, et al.

We initiate a study of algorithms with a focus on the computational complexity of individual elements, and introduce the fragile complexity of comparison-based algorithms as the maximal number of comparisons any individual element takes part in. We give a number of upper and lower bounds on the fragile complexity of fundamental problems, including Minimum, Selection, Sorting, and Heap Construction. The results include both deterministic and randomized upper and lower bounds, and demonstrate a separation between the two settings for a number of problems. The depth of a comparator network is a straightforward upper bound on the worst-case fragile complexity of the corresponding fragile algorithm. We prove that fragile complexity is a different and strictly easier property than the depth of comparator networks: for some problems, a fragile complexity equal to the best network depth can be achieved with less total work, and with randomization, an even lower fragile complexity is possible.
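To make the definition concrete, here is an illustrative sketch (not from the paper) that measures the fragile complexity of a single run by counting how many comparisons each element participates in. It contrasts the naive linear-scan Minimum, where the running minimum can take part in n−1 comparisons, with a balanced tournament, where no element plays more than ⌈log₂ n⌉ matches. The function names `fragile_count`, `linear_min`, and `tournament_min` are hypothetical helpers for this demonstration.

```python
def fragile_count(find_min, values):
    """Run find_min on wrapped values and return (minimum value, maximum
    number of comparisons any single element took part in) -- the fragile
    complexity of this particular run."""
    counts = [0] * len(values)

    class Elem:
        def __init__(self, i):
            self.i = i
        def __lt__(self, other):
            # every comparison is charged to BOTH participating elements
            counts[self.i] += 1
            counts[other.i] += 1
            return values[self.i] < values[other.i]

    winner = find_min([Elem(i) for i in range(len(values))])
    return values[winner.i], max(counts)

def linear_min(elems):
    # naive scan: the running minimum is compared against every later element,
    # so one element can accumulate n-1 comparisons
    best = elems[0]
    for e in elems[1:]:
        if e < best:
            best = e
    return best

def tournament_min(elems):
    # balanced tournament: each element plays at most one match per round,
    # and there are ceil(log2 n) rounds
    while len(elems) > 1:
        nxt = [min(elems[i], elems[i + 1]) for i in range(0, len(elems) - 1, 2)]
        if len(elems) % 2:
            nxt.append(elems[-1])  # odd element gets a bye this round
        elems = nxt
    return elems[0]

vals = list(range(8))  # ascending order is worst case for the linear scan
print(fragile_count(linear_min, vals))      # minimum 0, fragile complexity 7
print(fragile_count(tournament_min, vals))  # minimum 0, fragile complexity 3
```

Both procedures perform Θ(n) comparisons in total; the difference is entirely in how unevenly those comparisons are distributed over individual elements, which is exactly what fragile complexity captures.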



