1 |
Towards Sustainable Cloud Computing: Reducing Electricity Cost and Carbon Footprint for Cloud Data Centers through Geographical and Temporal Shifting of Workloads
Le, Trung. 17 July 2012.
Cloud Computing presents a novel way for businesses to procure their IT needs. Its elasticity and on-demand provisioning enable a shift from capital expenditures to operating expenses, giving businesses the technological agility they need to respond to an ever-changing marketplace. The rapid adoption of Cloud Computing, however, poses a unique challenge to Cloud providers: their already very large electricity bill and carbon footprint will get larger as they expand, so managing both costs is essential to their growth.
This thesis squarely addresses that challenge. Recognizing that Cloud data centers exist in multiple locations and that electricity price and emission intensity differ among these locations and over time, we develop an optimization framework that couples workload distribution with time-varying signals on electricity price and emission intensity for financial and environmental benefit. The framework comprises an optimization model, an aggregate cost function, and six scheduling heuristics.
To evaluate cost savings, we run simulations with five data centers located across North America over a period of 81 days, using historical data on electricity price, emission intensity, and workload collected from market operators and research data archives. We find that the framework can produce substantial cost savings, especially when workloads are distributed both geographically and temporally: up to 53.35% on electricity cost, or 29.13% on carbon cost, or 51.44% on electricity cost and 13.14% on carbon cost simultaneously.
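The abstract does not spell out the aggregate cost function or the six heuristics, so the following is only a minimal sketch of the general idea it describes: price each (data center, time slot) pair by a weighted combination of electricity cost and carbon cost, then greedily place deferrable work in the cheapest pair that still has capacity. All data center names, prices, capacities, and the carbon price below are hypothetical.

```python
from itertools import product

# Hypothetical hourly signals for three data centers over four time slots;
# every number here is invented for illustration.
elec_price = {                       # $ per kWh
    "dc_east":  [0.09, 0.12, 0.15, 0.10],
    "dc_west":  [0.11, 0.08, 0.07, 0.09],
    "dc_south": [0.13, 0.10, 0.09, 0.12],
}
emission = {                         # kg CO2 per kWh
    "dc_east":  [0.50, 0.55, 0.60, 0.45],
    "dc_west":  [0.20, 0.18, 0.22, 0.25],
    "dc_south": [0.70, 0.65, 0.60, 0.68],
}
CARBON_PRICE = 0.05                  # $ per kg CO2 (assumed)
CAPACITY = 2                         # jobs per data center per slot (assumed)
ENERGY_PER_JOB = 1.0                 # kWh per job (assumed)

def aggregate_cost(dc, t):
    """Combined electricity-plus-carbon cost of running one job at (dc, t)."""
    return ENERGY_PER_JOB * (elec_price[dc][t] + CARBON_PRICE * emission[dc][t])

def schedule(jobs):
    """Greedy geo-temporal heuristic: send each deferrable job to the cheapest
    (data center, time slot) pair that still has spare capacity."""
    load = {key: 0 for key in product(elec_price, range(4))}
    slots = sorted(load, key=lambda key: aggregate_cost(*key))
    plan = []
    for job in range(jobs):
        for dc, t in slots:
            if load[(dc, t)] < CAPACITY:
                load[(dc, t)] += 1
                plan.append((job, dc, t))
                break
    return plan

print(schedule(jobs=6))
```

A real scheduler of this kind would also have to respect job deadlines, data locality, and migration costs, which the thesis's optimization model and heuristics are described as addressing at a much finer level of detail.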
|
2 |
What task should be assigned to whom? Updating workload distribution: improvement work at AB GotlandsHem (Vem ska göra vad? Uppdatering av arbetsfördelning: förbättringsarbete på AB GotlandsHem)
Lisper, Josefin; Aman Kylegård, Oscar. Uppsala universitet, January 2023.
Abstract
Introduction: The case study was conducted at AB GotlandsHem, the largest housing agency on the island of Gotland, Sweden. The study has two purposes. The first is to map and examine the workload distribution in the inspection department by developing a generalizable template. The second is to investigate how AB GotlandsHem can proceed with updating the workload distribution across all departments.
Problem description: Interviews have shown that several departments at AB GotlandsHem experience unclarity about workload distribution. The workload distribution is uneven, unclear, and unstructured, which leads to high workload; this is visible in the long lead times of the inspection department.
Theory: Several theories were used to analyze the study's results: the cornerstone model, Kotter's eight-step model, workload distribution, and Lean.
Method: The project was performed as a case study in which an interview, observations, a diary, and a review of existing documents were used. The collected data were analyzed through transcription, histograms, and bar charts.
Results: The results demonstrate that the developed template is effective. They further show that the inspection department has significantly more actual hours than available hours, and that several work activities within the department are irrelevant for it to perform. AB GotlandsHem consists of multiple sub-processes, each with a process leader responsible for that process within the company.
Conclusion: The study reveals that most of the irrelevant activities can be moved to another process where they are considered more suitable to perform. If this relocation were implemented, the inspection department would reduce its actual hours and thereby decrease lead times, since its available hours would then be sufficient. Process leaders should therefore be responsible for reviewing the work activities within their processes, allowing the company to map each department's irrelevant activities and identify a more suitable place for them.
Key words: cornerstone model, customer focus, inspection, lead times, process improvement, workload distribution
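The template itself is not reproduced in the record. Purely as an illustration of the comparison it is described as supporting (actual hours per activity versus the department's available hours, with activities judged irrelevant flagged for relocation), here is a small hedged sketch; every activity and number below is invented.

```python
# Hypothetical activity map: (actual hours per week, relevant for the
# inspection department?). All figures are invented for illustration only.
activities = {
    "apartment inspections": (120, True),
    "key handling":          (35, False),
    "tenant communication":  (25, True),
    "minor repairs":         (40, False),
}
AVAILABLE_HOURS = 160  # assumed weekly capacity of the department

actual = sum(hours for hours, _ in activities.values())
relocatable = {name: hours
               for name, (hours, relevant) in activities.items()
               if not relevant}
after_relocation = actual - sum(relocatable.values())

print(f"actual {actual} h vs available {AVAILABLE_HOURS} h "
      f"(overloaded: {actual > AVAILABLE_HOURS})")
print("candidates to move to another process:", relocatable)
print(f"after relocation: {after_relocation} h "
      f"(within capacity: {after_relocation <= AVAILABLE_HOURS})")
```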
|
3 |
Combinatorial Optimization for Data Center Operational Cost Reduction
Rostami, Somayye. January 2023.
This thesis considers two kinds of problems, motivated by practical applications in data center operations and maintenance. Data centers are the brain of the internet, each hosting as many as tens of thousands of IT devices, making them a considerable contributor to global energy consumption (more than 1 percent of global power consumption). There is a large body of work at different layers aimed at reducing the total power consumption of data centers. One of the key places to save power is addressing the thermal heterogeneity in data centers by thermal-aware workload distribution. The corresponding optimization problem is challenging due to its combinatorial nature and the computational complexity of thermal models. In this thesis, a holistic theoretical approach is proposed for thermal-aware workload distribution which uses linearization to make the problem model-independent and easier to study. Two general optimization problems are defined. In the first problem, several cooling parameters and heat recirculation effects are considered, and two red-line temperatures are defined for idle and fully utilized servers so that the cooling effort can be reduced. The resulting problem is a mixed integer linear programming problem, which is solved approximately using a proposed heuristic. Numerical results confirm that the proposed approach outperforms commonly considered baseline algorithms and commercial solvers (MATLAB) and can reduce power consumption by more than 10 percent.
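The abstract gives this first problem only at a high level. As a rough, hedged illustration of the ingredients it names (a linearized heat-recirculation model, separate red-line inlet temperatures for idle and busy servers, and a greedy placement heuristic), consider the sketch below. The thermal model T_in = T_sup + D p, all constants, and the greedy rule are assumptions for illustration, not the thesis's actual formulation, which additionally minimizes IT plus cooling power as a mixed integer linear program.

```python
import numpy as np

# Hypothetical linearized thermal model (not the thesis's actual one):
# inlet temperature = supply temperature + D @ server_power, where D captures
# heat-recirculation coupling between servers. The coupling is kept small so
# that this toy instance stays feasible.
rng = np.random.default_rng(0)
n_servers = 8
D = 0.003 * rng.random((n_servers, n_servers))   # degC per watt (assumed)
T_sup = 18.0                                     # cooling supply temperature, degC
T_RED_IDLE, T_RED_BUSY = 27.0, 32.0              # assumed red-line inlet temperatures
P_IDLE, P_BUSY = 100.0, 300.0                    # assumed server power draw, W

def inlet_temps(power):
    return T_sup + D @ power

def greedy_placement(total_jobs):
    """Greedily mark servers as busy, one per job, keeping every inlet
    temperature below its red line (idle and busy servers get different
    limits). This only checks thermal feasibility; it does not minimize
    IT-plus-cooling power as the thesis's MILP does."""
    busy = np.zeros(n_servers, dtype=bool)
    for _ in range(total_jobs):
        best, best_margin = None, np.inf
        for s in np.flatnonzero(~busy):
            trial = busy.copy()
            trial[s] = True
            power = np.where(trial, P_BUSY, P_IDLE)
            limits = np.where(trial, T_RED_BUSY, T_RED_IDLE)
            slack = inlet_temps(power) - limits
            if np.all(slack <= 0):
                margin = slack.max()             # prefer the coolest feasible choice
                if margin < best_margin:
                    best, best_margin = s, margin
        if best is None:
            raise RuntimeError("no thermally feasible server left")
        busy[best] = True
    return busy

print(greedy_placement(total_jobs=5))
```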
In the next problem, additional operational costs related to reliability of the servers are considered. The resulting problem is solved by a generalization of the proposed heuristics integrated with a Model Predictive Control (MPC) approach, where demand predictions are available.
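The MPC formulation itself is not given in the abstract. The sketch below only illustrates the generic receding-horizon pattern the term refers to: re-plan over a window of predicted demand at every step, commit only to the first slot's decision, then roll forward. The `plan_horizon` function is a placeholder for whatever per-window heuristic is actually used.

```python
def plan_horizon(predicted_demand):
    """Placeholder for the per-window optimizer or heuristic: here it simply
    splits each slot's predicted demand evenly across two data centers."""
    return [(d / 2, d / 2) for d in predicted_demand]

def mpc_loop(demand_forecast, horizon=4):
    """Receding-horizon control: at each step, plan over the next `horizon`
    predicted slots but commit only to the first slot's decision."""
    applied = []
    for t in range(len(demand_forecast)):
        window = demand_forecast[t:t + horizon]
        plan = plan_horizon(window)
        applied.append(plan[0])   # apply only the first-step decision
        # a real controller would re-measure the system state and refresh the
        # demand forecast here before the next iteration
    return applied

print(mpc_loop([10, 12, 8, 15, 20, 5], horizon=3))
```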
Finally, in the second type of problems, we address a problem in inventory management related to data center maintenance, where we develop an efficient dynamic programming algorithm to solve a lot-sizing problem. The algorithm is based on a key structural property that may be of more general interest, that of a just-in-time ordering policy.
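The lot-sizing model and its structural property are not detailed in the abstract. The following is a generic Wagner-Whitin-style dynamic program, offered only as an assumed illustration of a just-in-time (zero-inventory-ordering) policy: an order is placed only when inventory is zero, and each order covers the demand of a consecutive block of periods. Demands and costs are invented.

```python
from functools import lru_cache

# Hypothetical data: per-period demand (assumed positive in every period),
# a fixed ordering cost, and a per-unit per-period holding cost.
demand = [20, 30, 35, 10, 50, 5]
ORDER_COST = 100.0
HOLD_COST = 1.0

def lot_sizing(demand, order_cost, hold_cost):
    """Wagner-Whitin-style DP. cost(t) is the minimum cost of covering demand
    for periods t..T-1, given that inventory is zero entering period t (the
    just-in-time / zero-inventory-ordering property)."""
    T = len(demand)

    @lru_cache(maxsize=None)
    def cost(t):
        if t == T:
            return 0.0
        best = float("inf")
        # Place an order in period t that covers periods t..j-1, then recurse.
        for j in range(t + 1, T + 1):
            holding = sum(hold_cost * (k - t) * demand[k] for k in range(t, j))
            best = min(best, order_cost + holding + cost(j))
        return best

    return cost(0)

print(lot_sizing(demand, ORDER_COST, HOLD_COST))
```

With memoization the sketch runs in time quadratic in the number of periods.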
Thesis / Doctor of Philosophy (PhD)

Data centers, each hosting as many as tens of thousands of IT devices, contribute to a considerable portion of energy usage worldwide (more than 1 percent of global power consumption). They also encounter other operational costs, mostly related to reliability of devices and maintenance. One of the key places to reduce energy consumption is through addressing the thermal heterogeneity in data centers by thermal-aware workload distribution for the servers. This prevents hot-spot generation and addresses the trade-off between IT and cooling power consumption, the two main power consumption contributors. The corresponding optimization problem is challenging due to its combinatorial nature and the complexity of thermal models. In this thesis, we present a holistic approach for thermal-aware workload distribution in data centers, using linearization to make the problem model-independent and simpler to study. Two quite general nonlinear optimization problems are defined. The results confirm that the proposed approach, completed by a proposed heuristic, solves the problems efficiently and with high precision. Finally, we address a problem in inventory management related to data center maintenance, where we develop an efficient algorithm to solve a lot-sizing problem that has the goal of reducing data center operational costs.
|