PERFORMANCE ASSURANCE FOR CLOUD-NATIVE APPLICATIONS

Preserving the performance of cloud services according to service level agreements (SLAs) is one of the most important challenges in cloud infrastructure. Since the workload constantly increases and decreases, managing cloud resources efficiently is a key challenge in satisfying non-functional requirements such as high availability and cost. Common approaches such as predictive autoscaling can address this problem, but they are limited by constraints such as requiring a known workload pattern as training data. Reinforcement learning (RL) offers a promising alternative. Even though reinforcement learning needs time to stabilize and many trials to tune factors such as the discount rate, it can adapt to a dynamic workload. In this thesis, through a controlled experiment research method, we show how a model-free reinforcement learning algorithm such as Q-learning can adapt to a dynamic workload by applying horizontal autoscaling to keep the performance of cloud services at the required level. Furthermore, the Amazon Web Services (AWS) platform is used to demonstrate the efficiency of the Q-learning algorithm in dealing with dynamic workloads and achieving high availability.
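
To make the approach concrete, the sketch below outlines a minimal tabular Q-learning agent for horizontal scaling decisions. It is an illustrative assumption, not the experimental setup used in the thesis: the state encoding, the reward that trades SLA risk against replica cost, and the hyperparameter values are hypothetical placeholders.

    # Illustrative sketch (not the thesis's implementation): tabular Q-learning
    # for horizontal autoscaling decisions. State, action, and reward definitions
    # below are assumptions chosen only for demonstration.
    import random
    from collections import defaultdict

    ACTIONS = (-1, 0, +1)                  # remove a replica, do nothing, add a replica
    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount rate, exploration rate

    q_table = defaultdict(float)           # maps (state, action) -> estimated value

    def choose_action(state):
        """Epsilon-greedy selection over the scaling actions."""
        if random.random() < EPSILON:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: q_table[(state, a)])

    def update(state, action, reward, next_state):
        """Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
        best_next = max(q_table[(next_state, a)] for a in ACTIONS)
        td_target = reward + GAMMA * best_next
        q_table[(state, action)] += ALPHA * (td_target - q_table[(state, action)])

    def reward_fn(cpu_util, replicas):
        """Hypothetical reward: penalize SLA risk (high CPU) and over-provisioning cost."""
        sla_penalty = -10.0 if cpu_util > 0.8 else 0.0
        cost_penalty = -0.5 * replicas
        return sla_penalty + cost_penalty

In a deployment such as the AWS setup described above, the state would be observed from monitoring metrics (for example, discretized CPU utilization and current replica count) and the chosen action applied through the platform's scaling API; being model-free, the agent needs no prior workload pattern as training data.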

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:mdh-56089
Date: January 2021
Creators: Zabad, Bassam
Publisher: Mälardalens högskola, Akademin för innovation, design och teknik
Source Sets: DiVA Archive at Upsalla University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess