Optimal Asynchronous Dynamic Policies in Energy-Efficient Data Centers
In this paper, we use a Markov decision process to find the optimal asynchronous policy of an energy-efficient data center with two groups of heterogeneous servers, a finite buffer, and a fast setup process at the sleep state. Servers in Group 1 always work, while servers in Group 2 may either work or sleep, and a fast setup process occurs when a server in Group 2 switches from the sleep state to the work state. In such a data center, an asynchronous dynamic policy is designed as two sub-policies, the setup policy and the sleep policy, which together determine the switching rule between the work and sleep states for the servers in Group 2. To analyze the optimal asynchronous dynamic policy, we apply the Markov decision process to establish a policy-based Poisson equation, which provides an expression for the unique solution of the performance potential by means of the RG-factorization. Based on this, we characterize the monotonicity and optimality of the long-run average profit of the data center with respect to the asynchronous dynamic policy under different service prices. Furthermore, we prove that the bang-bang control is always optimal for this optimization problem and that it supports a threshold-type dynamic control in the energy-efficient data center. We hope that the methodology and results derived in this paper can shed light on the study of more general energy-efficient data centers.
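For orientation, the policy-based Poisson equation referenced above typically takes the following schematic form in the sensitivity-based optimization framework; the notation here is illustrative and not necessarily that used in the paper's body. Writing $B^{(d)}$ for the infinitesimal generator of the Markov process under policy $d$, $f^{(d)}$ for the profit-rate vector, $\eta^{(d)}$ for the long-run average profit, $g^{(d)}$ for the performance potential, $\pi^{(d)}$ for the stationary distribution, and $\mathbf{e}$ for a column vector of ones, one such equation reads
\[
  B^{(d)} g^{(d)} \;=\; \eta^{(d)} \mathbf{e} \;-\; f^{(d)},
  \qquad
  \eta^{(d)} \;=\; \pi^{(d)} f^{(d)},
\]
and the RG-factorization of $B^{(d)}$ is the tool that yields an explicit (essentially unique) expression for the solution $g^{(d)}$.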