Energy consumption represents a significant cost in data center operation. A large fraction of the energy, however, is used to power idle servers when the workload is low. Dynamic provisioning techniques aim at saving this portion of the energy by turning off unnecessary servers. In this thesis we explore how much gain knowing future workload information can bring to dynamic provisioning. In particular, we develop online dynamic provisioning solutions with and without future workload information available. We first reveal an elegant structure of the offline dynamic provisioning problem which allows us to characterize the optimal solution in a "divide-and-conquer" manner. We then exploit this insight to design two online algorithms with competitive ratios 2 - α and e/(e - 1 + α), respectively, where 0 ≤ α ≤ 1 is the normalized size of a look-ahead window in which future workload information is available. A fundamental observation is that future workload information beyond the full-size look-ahead window (corresponding to α = 1) will not improve dynamic provisioning performance. Our algorithms are decentralized and easy to implement. We demonstrate their effectiveness in simulations using real-world traces.
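The abstract states the two competitive ratios, 2 - α and e/(e - 1 + α), as functions of the normalized look-ahead window size α. A minimal sketch evaluating these expressions (the function names are illustrative, not from the thesis) shows that both ratios shrink as α grows and both reach 1, i.e. the offline optimum, at α = 1:

```python
import math

def ratio_without_prediction(alpha: float) -> float:
    """Competitive ratio 2 - alpha stated for the first online algorithm."""
    return 2 - alpha

def ratio_with_prediction(alpha: float) -> float:
    """Competitive ratio e / (e - 1 + alpha) stated for the second algorithm."""
    return math.e / (math.e - 1 + alpha)

# With no look-ahead (alpha = 0) the ratios are 2 and e/(e-1) ~ 1.58;
# with a full-size look-ahead window (alpha = 1) both equal 1.
for alpha in (0.0, 0.5, 1.0):
    print(alpha, ratio_without_prediction(alpha), ratio_with_prediction(alpha))
```

This also illustrates the abstract's observation that look-ahead beyond the full window cannot help: at α = 1 both ratios are already 1, so there is no room left for improvement.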
/ When designing online algorithms, we utilize future input information, because for many modern systems short-term future inputs can be predicted by machine learning, time-series analysis, etc. We also test our algorithms in the presence of prediction errors in future workload information, and the results show that our algorithms are robust to prediction errors. We believe that utilizing future information is a new and important degree of freedom in designing online algorithms. In traditional online algorithm design, future input information is not taken into account; many online problems then have online algorithms with optimal but large competitive ratios. Since future input information can, to some extent, be estimated accurately in many problems, we believe such information should be exploited in online algorithm design to achieve smaller competitive ratios, yielding algorithms with advantages in both practice and theory. / Lu, Tan. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2012. / Includes bibliographical references (leaves 76-81). / Abstracts also in Chinese.
/ Abstract --- p.i / Acknowledgement --- p.iv / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Motivation --- p.1 / Chapter 1.2 --- Contributions --- p.4 / Chapter 1.3 --- Thesis Organization --- p.5 / Chapter 2 --- Related Work --- p.6 / Chapter 3 --- Problem Formulation --- p.10 / Chapter 3.1 --- Settings and Models --- p.10 / Chapter 3.2 --- Problem Formulation --- p.13 / Chapter 4 --- Optimal Solution and Offline Algorithm --- p.15 / Chapter 4.1 --- Structure of Optimal Solution --- p.15 / Chapter 4.2 --- Intuitions and Observations --- p.17 / Chapter 4.3 --- Offline Algorithm Achieving the Optimal Solution --- p.18 / Chapter 5 --- Online Dynamic Provisioning --- p.21 / Chapter 5.1 --- Dynamic Provisioning without Future Workload Information --- p.22 / Chapter 5.2 --- Dynamic Provisioning with Future Workload Information --- p.23 / Chapter 5.3 --- Adapting the Algorithms to Work with Discrete-Time Fluid Workload Model --- p.31 / Chapter 5.4 --- Extending to Case Where Servers Have Setup Time --- p.32 / Chapter 6 --- Experiments --- p.35 / Chapter 6.1 --- Settings --- p.35 / Chapter 6.2 --- Performance of the Proposed Online Algorithms --- p.38 / Chapter 6.3 --- Impact of Prediction Error --- p.39 / Chapter 6.4 --- Impact of Peak-to-Mean Ratio (PMR) --- p.40 / Chapter 6.5 --- Discussion --- p.40 / Chapter 6.6 --- Additional Experiments --- p.41 / Chapter 7 --- A New Degree of Freedom for Designing Online Algorithm --- p.44 / Chapter 7.1 --- The Lost Cow Problem --- p.45 / Chapter 7.2 --- Secretary Problem without Future Information --- p.47 / Chapter 7.3 --- Secretary Problem with Future Information --- p.48 / Chapter 7.4 --- Summary --- p.50 / Chapter 8 --- Conclusion --- p.51 / Chapter A --- Proof --- p.54 / Chapter A.1 --- Proof of Theorem 4.1.1 --- p.54 / Chapter A.2 --- Proof of Theorem 4.3.1 --- p.57 / Chapter A.3 --- Least Idle vs Last Empty --- p.60 / Chapter A.4 --- Proof of Theorem 5.2.2 --- p.61 / Chapter A.5 --- Proof of Corollary 5.4.1 --- p.70
/ Chapter A.6 --- Proof of Lemma 7.1.1 --- p.72 / Chapter A.7 --- Proof of Theorem 7.3.1 --- p.74 / Bibliography --- p.76
Identifier | oai:union.ndltd.org:cuhk.edu.hk/oai:cuhk-dr:cuhk_328729 |
Date | January 2012 |
Contributors | Lu, Tan., Chinese University of Hong Kong Graduate School. Division of Information Engineering. |
Source Sets | The Chinese University of Hong Kong |
Language | English, Chinese |
Detected Language | English |
Type | Text, bibliography |
Format | electronic resource, remote, 1 online resource (x, 81 leaves) : ill. (some col.) |
Rights | Use of this resource is governed by the terms and conditions of the Creative Commons “Attribution-NonCommercial-NoDerivatives 4.0 International” License (http://creativecommons.org/licenses/by-nc-nd/4.0/) |