
Task Elimination may Actually Increase Throughput Time

by D. M. M. Schunselaar, et al. (TU Eindhoven)

The well-known Task Elimination redesign principle suggests removing unnecessary tasks from a process to improve on time and cost. Although there seems to be a general consensus that removing work can only improve the throughput time of the process, this paper shows that this is not necessarily the case, by providing an example that uses plain M/M/c activities. This paper also shows that the Task Automation and Parallelism redesign principles may likewise lead to longer throughput times. Finally, apart from these negative results, the paper also shows under which assumptions these redesign principles can indeed only improve the throughput time.
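The counterexamples in the paper are built from plain M/M/c activities, i.e. tasks with Poisson arrivals and exponential service times handled by c parallel resources. As a minimal sketch of this queueing building block (a generic illustration, not the paper's specific counterexample construction), the mean throughput time of a single M/M/c activity can be computed from the standard Erlang C formula:

```python
from math import factorial

def erlang_c(c, a):
    """Probability that an arriving case must wait in an M/M/c queue.

    c: number of servers (resources); a = lam / mu (offered load).
    Requires a < c for stability.
    """
    assert a < c, "queue is unstable"
    tail = a**c / factorial(c) * (c / (c - a))
    return tail / (sum(a**k / factorial(k) for k in range(c)) + tail)

def mean_sojourn_time(lam, mu, c):
    """Mean throughput (sojourn) time W of one M/M/c activity.

    lam: arrival rate, mu: service rate per server, c: number of servers.
    W = Wq + 1/mu, with Wq the mean waiting time in the queue.
    """
    pw = erlang_c(c, lam / mu)
    wq = pw / (c * mu - lam)  # mean time spent waiting for a resource
    return wq + 1.0 / mu      # plus the mean service time itself

# For c = 1 this reduces to the M/M/1 result W = 1 / (mu - lam):
# mean_sojourn_time(0.5, 1.0, 1) == 2.0
```

Summing such per-activity sojourn times over a sequential process gives its expected throughput time under the usual independence assumptions; the paper's point is that redesigns which look beneficial activity-by-activity need not reduce this total once interactions between activities are taken into account.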



