
Task Elimination may Actually Increase Throughput Time

12/28/2018
by D. M. M. Schunselaar, et al.
TU Eindhoven

The well-known Task Elimination redesign principle suggests removing unnecessary tasks from a process to improve time and cost. Although there seems to be a general consensus that removing work can only improve the throughput time of a process, this paper shows that this is not necessarily the case by providing an example that uses plain M/M/c activities. The paper also shows that the Task Automation and Parallelism redesign principles may lead to longer throughput times. Finally, apart from these negative results, the paper shows under which assumptions these redesign principles can indeed only improve the throughput time.
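As a point of reference for the M/M/c activities mentioned in the abstract, the sketch below computes the expected throughput (sojourn) time of a single M/M/c station via the standard Erlang C formula. It is a generic queueing calculation, not the paper's counterexample, and the arrival rate, service rate, and server count in the usage line are illustrative assumptions.

```python
from math import factorial

def erlang_c(c: int, a: float) -> float:
    """Probability that an arriving job must wait, for offered load a = lam / mu."""
    rho = a / c
    assert rho < 1, "queue must be stable (lam < c * mu)"
    top = a**c / factorial(c)
    bottom = (1 - rho) * sum(a**k / factorial(k) for k in range(c)) + top
    return top / bottom

def mmc_sojourn_time(lam: float, mu: float, c: int) -> float:
    """Expected throughput (sojourn) time of one M/M/c activity: waiting plus service."""
    a = lam / mu
    wq = erlang_c(c, a) / (c * mu - lam)  # mean time spent waiting in the queue
    return wq + 1.0 / mu                  # add the mean service time

# Illustrative values (assumed): 1.8 arrivals/hour, 2 servers, 1 job/hour each.
print(mmc_sojourn_time(lam=1.8, mu=1.0, c=2))
```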

