1.
Workload modeling and prediction for resources provisioning in cloud
Magalhães, Deborah Maria Vieira, 23 February 2017
MAGALHÃES, Deborah Maria Vieira. Workload modeling and prediction for resources provisioning in cloud. 2017. 100 f. Tese (Doutorado em Engenharia de Teleinformática)–Centro de Tecnologia, Universidade Federal do Ceará, Fortaleza, 2017.
The evaluation of resource management policies in cloud environments is challenging, since clouds are subject to varying demand from users with different profiles and Quality of Service (QoS) requirements. Factors such as virtualization-layer overhead, the scarcity of trace logs available for analysis, and mixed workloads composed of a wide variety of applications in a heterogeneous environment complicate the modeling and characterization of applications hosted in the cloud. In this context, workload modeling and characterization is a fundamental step in systematizing the analysis and simulation of the performance of resource management policies, and a particularly useful strategy to apply before clouds are physically deployed. In this doctoral thesis, we propose a methodology for workload modeling and characterization to create resource utilization profiles in the cloud. Workload behavior patterns are identified and modeled as statistical distributions, which a predictive controller uses to establish the complex relationship between resource utilization and response time. The controller adjusts resource utilization to keep the response time experienced by the user within an acceptable threshold, so our proposal directly supports QoS-aware resource provisioning policies. The proposed methodology was validated on two applications with distinct characteristics: a scientific application for pulmonary disease diagnosis and a web application that emulates an auction site. The performance models were compared with monitoring data through graphical and analytical methods to evaluate their accuracy, and all models showed a percentage error below 10%. The predictive controller dynamically maintained the response time close to the expected trajectory, without Service Level Agreement (SLA) violations, with a Mean Absolute Percentage Error (MAPE) of 4.36%.
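As a concrete illustration of the modeling step, the sketch below fits candidate distributions to a resource-utilization trace and scores them with the Kolmogorov-Smirnov statistic, then defines the MAPE metric used to assess the controller. The trace and the candidate set are illustrative assumptions, not the thesis's actual pipeline.

```python
# Minimal sketch: fit candidate distributions to a resource-utilization
# trace, pick the best by KS distance, and define MAPE. Data are stand-ins.
import numpy as np
from scipy import stats

util = np.random.default_rng(42).beta(2, 5, size=1000) * 100  # stand-in CPU% trace

candidates = {
    "weibull": stats.weibull_min,
    "gamma": stats.gamma,
    "lognorm": stats.lognorm,
}

best_name, best_ks = None, np.inf
for name, dist in candidates.items():
    params = dist.fit(util)                       # maximum-likelihood fit
    ks_stat, _ = stats.kstest(util, dist.cdf, args=params)
    if ks_stat < best_ks:                         # smaller KS distance = better fit
        best_name, best_ks = name, ks_stat

print(f"best-fitting distribution: {best_name} (KS={best_ks:.3f})")

def mape(observed, predicted):
    """Mean Absolute Percentage Error, the metric used to assess the controller."""
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    return 100.0 * np.mean(np.abs((observed - predicted) / observed))
```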
2.
Experimental Interrogation Of Network Simulation Models Of Human Task And Workload Performance In A U.S. Army Tactical Operations Center
Middlebrooks, Sam E., 10 August 2001
This thesis research develops new methodologies for enhancing the experimental use of computer simulations to optimize predicted human performance in a work domain. Using a computer simulation called Computer modeling Of Human Operator System Tasks (CoHOST) to test these concepts, methods are developed to establish confidence limits and significance thresholds by having the computer model self-report its limits. These methods, along with experimental designs tailored to computer simulation rather than human-subject research, are applied in CoHOST to investigate the U.S. Army battalion-level command and control work domain under combat conditions and to develop recommendations about that domain. Further, recognizing that strictly numerical analytical results do not always satisfy the need for understanding by those who could most benefit from the analysis, the results are also interpreted against a team performance model, with the CoHOST analysis results mapped to it according to macroergonomic and team performance concepts.
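One way to read "having the model self-report its limits" is to run independent stochastic replications and derive confidence limits from the model's own outputs. The sketch below does this with a hypothetical run_model() stub standing in for a CoHOST replication; all numbers are assumptions.

```python
# Sketch: derive confidence limits from repeated stochastic simulation runs.
# run_model() is a hypothetical stand-in for one CoHOST replication.
import numpy as np
from scipy import stats

def run_model(rng):
    # stand-in for one simulation replication returning a performance score
    return rng.normal(loc=0.82, scale=0.05)

rng = np.random.default_rng(7)
scores = np.array([run_model(rng) for _ in range(30)])

mean = scores.mean()
half_width = (stats.t.ppf(0.975, df=len(scores) - 1)
              * scores.std(ddof=1) / np.sqrt(len(scores)))
print(f"95% CI for predicted performance: {mean:.3f} +/- {half_width:.3f}")
```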
The CoHOST computer simulation models were developed based on Army needs stemming from the Persian Gulf War. They examined human mental and physical performance capabilities resulting from the introduction of a new command and control vehicle with modernized digital communications systems. Literature searches and background investigations were conducted, and the CoHOST model architecture was developed based on a taxonomy of human performance. A computer simulation design was implemented with these taxonomic descriptors of human performance in the military command and control domain using the commercial programming language Micro Saint™. The original CoHOST development project produced results suggesting that automation alone does not necessarily improve human performance.
The CoHOST models were developed to answer questions about whether human operators could operate effectively in a specified work domain. From an analytical point of view this satisfied the queries posed by that domain's developers. With these completed models available, however, the intriguing possibility now exists to investigate how to optimize that work domain to maximize predicted human performance. By developing an experimental design that places evaluative conditions on the simulated human operators in the computer model rather than on live human test subjects, a series of computer runs establishes test points for identified dependent variables against specified independent variables. From these test points, a set of polynomial regression equations is developed that describes the performance characteristics of the human operator in the simulated work domain in terms of those dependent variables. The resulting regression equations can predict any outcome the model can produce. The optimum values of the independent variables, those that maximize predicted human performance according to the dependent variables, are then determined.
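The sketch below illustrates this test-point-to-optimum workflow: fit a second-order polynomial response surface to simulated outputs, then maximize the fitted surface over the independent variables. The grid, the two IVs, and the response values are illustrative assumptions, not CoHOST's actual variables.

```python
# Sketch: fit a polynomial response surface to simulation test points and
# locate the IV settings that maximize predicted performance.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

# hypothetical test points: (message load, staffing level) -> performance
X = np.array([[ml, st] for ml in (10, 20, 30) for st in (2, 4, 6)], dtype=float)
y = -(X[:, 0] - 20) ** 2 / 100 - (X[:, 1] - 4) ** 2 / 4 + 1.0  # stand-in DV values

poly = PolynomialFeatures(degree=2)
model = LinearRegression().fit(poly.fit_transform(X), y)   # regression equation

def neg_predicted(x):
    # negate so that minimizing finds the maximum predicted performance
    return -model.predict(poly.transform(x.reshape(1, -1)))[0]

opt = minimize(neg_predicted, x0=[20.0, 4.0], bounds=[(10, 30), (2, 6)])
print("optimal IVs:", opt.x, "predicted max performance:", -opt.fun)
```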
The conclusions from the CoHOST example in this thesis complement the results of the original CoHOST study with the prediction that the primary attentional focus of the battalion commander during combat operations is on establishing and maintaining an awareness and understanding of the situational picture of the battlefield he is operating upon. Being able to form and sustain an accurate mental model of this domain is the predicted predominant activity and drives his ability to make effective decisions and communicate those decisions to the other members of his team and to elements outside his team.
The potential specific benefit of this research to the Army is twofold. First, the research demonstrates techniques and procedures that, without any modification to the existing computer simulations, allow significant predictive use to be made of a simulation beyond its original purpose and intent. Second, applying these techniques to CoHOST has produced conclusions and recommendations that Army force developers can use in their continuing efforts to improve the ability of commanders and other decision makers to perform as new digital communications systems and procedures produce radical changes to the paradigm that describes the command and control work domain.
The general benefits of this research beyond the Army domain fall into two areas: methodological improvement of simulation-based experimental procedures, and the actual application area of the CoHOST simulation. Tailoring the experimental controls and developing interrogation techniques for the self-reporting and analysis of simulation parameters and thresholds are promising topics for future study. The CoHOST simulation, while used in this thesis as an example of new and tailored techniques for computer-simulation-based research, has nevertheless produced conclusions that deviate somewhat from prevailing thought in military command and control. Refinement of this simulation and its use in an even more thorough simulation-based study could further address whether the military decision-making process itself, or contributing factors such as the development of mental models for understanding the situation, is or should be the primary focus of team decision makers in the military command and control domain. / Master of Science
3.
Statistical Methods for Computational Markets: Proportional Share Market Prediction and Admission Control
Sandholm, Thomas, January 2008
We design, implement and evaluate statistical methods for managing uncertainty when consuming and provisioning resources in a federated computational market. To enable efficient allocation of resources in this environment, providers need to know consumers' risk preferences and the expected future demand. The guarantee levels to offer thus depend on techniques to forecast future usage and to accurately capture and model uncertainties. Our main contribution in this thesis is threefold: first, we evaluate a set of techniques to forecast demand in computational markets; second, we design a scalable method which captures a succinct summary of usage statistics and allows consumers to express risk preferences; and finally, we propose a method for providers to set resource prices and determine guarantee levels to offer. The methods employed are based on fundamental concepts in probability theory, and are thus easy to implement, analyze, and evaluate. The key component of our solution is a predictor that dynamically constructs approximations of the price probability density and quantile functions for arbitrary resources in a computational market. Because highly fluctuating and skewed demand is common in these markets, it is difficult to accurately and automatically construct representations of arbitrary demand distributions. We discovered that a technique based on the Chebyshev inequality and empirical prediction bounds, which estimates worst-case bounds on deviations from the mean given a variance, provided the most reliable forecasts for a set of representative high-performance and shared cluster workload traces. We further show how these forecasts can help consumers determine how much to spend given a risk preference, and how providers can offer admission control services with different guarantee levels given a recent history of resource prices.
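The appeal of the Chebyshev-based approach is that it needs only a mean and a variance, not the shape of the demand distribution. A minimal sketch of such a worst-case bound follows; the price trace and confidence level are illustrative assumptions rather than the thesis's evaluated predictor.

```python
# Sketch: worst-case price bound from the Chebyshev inequality, using only
# the mean and variance of a window of resource prices. Trace is a stand-in.
import numpy as np

prices = np.random.default_rng(1).lognormal(mean=0.0, sigma=0.8, size=500)

def chebyshev_upper_bound(sample, confidence=0.95):
    """Bound b with P(X > b) <= 1 - confidence via Chebyshev:
    P(|X - mu| >= k*sigma) <= 1/k^2, so choose k = 1/sqrt(1 - confidence)."""
    mu, sigma = sample.mean(), sample.std(ddof=1)
    k = 1.0 / np.sqrt(1.0 - confidence)
    return mu + k * sigma

print(f"95% worst-case price bound: {chebyshev_upper_bound(prices):.2f}")
```

Because the bound holds for any distribution with that mean and variance, it stays valid under the heavy fluctuation and skew described above, at the cost of being conservative.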
4.
The COMPASS Paradigm For The Systematic Evaluation Of U.S. Army Command And Control Systems Using Neural Network And Discrete Event Computer Simulation
Middlebrooks, Sam E., 15 April 2003
In today's technology-based society, the rapid proliferation of new machines and systems that would have been undreamed of only a few short years ago has become a way of life. Developments and advances, especially in digital electronics and micro-circuitry, have spawned technology-based improvements in transportation, communications, entertainment, automation, the armed forces, and many other areas that would not have been possible otherwise. This rapid "explosion" of new capabilities and ways of performing tasks has been motivated, as often as not, by the philosophy that if it is possible to make something better, faster, more cost-effective, or able to operate over greater distances, then it must inherently be good for the human operator. Taken further, these improvements are typically envisioned to produce a more efficient operating system in which the human operator is an integral component. The formal concept of human-system interface design has only emerged this century as a recognized academic discipline; the practice of developing ideas and concepts for systems containing human operators, however, has existed since humans started experiencing cognitive thought.
An example of a human-system interface technology for the communication and dissemination of written information that has evolved over centuries of trial-and-error development is the book. It is no accident that the book's present form and shape are as they are: this is a shape and form readily usable by human physiology, whose optimal configuration was determined by centuries of effort and revision. This slow evolution was mirrored by a rate of technical evolution in printing and elsewhere that allowed new advances to be experimented with as part of the overall use requirement and the need for the existence of the printed word and some way to contain it. Today, however, technology is advancing at such a rapid rate that evolutionary use requirements have no chance to develop alongside the fast pace of technical progress. One result of this recognition is the establishment of disciplines like human factors engineering, whose stated purpose is the systematic determination of good and bad human-system interface designs. Another result of this phenomenon, however, is systems that get developed and placed into public use simply because new technology allowed them to be made. Such development can proceed without a full appreciation of how the system might be used and, perhaps even more significantly, what impact the use of this new system might have on the operator within it.
The U.S. Army has a term for this type of activity: "stove-piped development". The implication of this term is that a system gets developed in isolation, where the developers are only looking "up" and not "around". They are thus concerned only with how the system may work or be used for its own singular purposes, as opposed to how it might be used in the larger community of existing systems and interfaces or, even more importantly, in the larger community of other new systems in concurrent development. Among the impacts of this mode of development for the Army are communication systems that work exactly as designed but are unable to interface with other communications systems in other domains for battlefield-wide communications. Having communications systems that cannot communicate with each other is a distinct problem in its own right. However, when developments in one industry produce products that humans use, or attempt to use, together with products from entirely separate developments or industries, product development shaped by stove-piped design visions can have significant implications for the operation of each system and the human operator attempting to use it.
There are many examples that would illustrate the above concept; the one explored here is the Army effort to study, understand, and optimize its command and control (C2) operations. This effort is at the heart of a change in the operational paradigm of C2 Tactical Operations Centers (TOCs) that the Army is now undergoing. For the 50 years since World War II, the nature, organization, and mode of operation of command organizations within the Army have remained virtually unchanged. Staffs have been organized on a basic four-section structure, and TOCs generally operate only in a totally static mode, with the time required to move them to keep up with a mobile battlefield growing almost exponentially from lower to higher command levels. Current initiatives are changing all that, but while new vehicles and hardware systems address individual components of the command structures to improve their operations, these initiatives do not necessarily provide an environment in which the human operator component of the overall system can function more effectively.
This dissertation examines C2 from a system level viewpoint using a new paradigm for systematically examining the way TOCs operate and then translating those observations into validated computer simulations using a methodological framework. This paradigm is called COmputer Modeling Paradigm And Simulation of Systems (COMPASS). COMPASS provides the ability to model TOC operations in a way that not only includes the individuals, work groups and teams in it, but also all of the other hardware and software systems and subsystems and human-system interfaces that comprise it as well as the facilities and environmental conditions that surround it.
Most of the current literature and research in this area focuses on the concept of C2 itself and its follow-on activities of command, control, communications (C3), command, control, communications, and computers (C4), and command, control, communications, computers and intelligence (C4I). This focus tends to address the activities involved with the human processes within the overall system such as individual and team performance and the commander's decision-making process. While the literature acknowledges the existence of the command and control system (C2S), little effort has been expended to quantify and analyze C2Ss from a systemic viewpoint. A C2S is defined as the facilities, equipment, communications, procedures, and personnel necessary to support the commander (i.e., the primary decision maker within the system) for conducting the activities of planning, directing, and controlling the battlefield within the sector of operations applicable to the system.
The research in this dissertation is in two phases. The overall project incorporates sequential experimentation procedures that build on successive TOC observation events to generate an evolving data store supporting the two phases. Phase I consists of the observation of heavy maneuver battalion and brigade TOCs during peacetime exercises. The term "heavy maneuver" is used to connote main battle forces such as armored and mechanized infantry units supported by artillery, air defense, close air, engineer, and other so-called combat support elements. This type of unit comprises the main battle forces on the battlefield and is used to refer to what is called the conventional force structure. These observations are conducted using naturalistic observation techniques of the visible functioning of activities within the TOC and are augmented by automatic collection of data such as analog and digital message traffic, combat reports generated by the computer simulations supporting the wargame exercise, and video and audio recordings where appropriate and available. Visible activities within the TOC primarily include human operator functions such as message handling, decision-making processes and timing, coordination activities, and span of control over the battlefield. They also include environmental conditions, the functional status of computer and communications systems, and levels of message traffic flow. These observations are further augmented by observer estimations of indicators such as the perceived level of stress, excitement, and attention to the mission of the TOC personnel. In other words, every visible and available component of the C2S within the TOC is recorded for analysis. No a priori attempt is made to evaluate the potential significance of each activity, as its contribution may be so subtle as to be ascertainable only through statistical analysis. Each of these performance activities becomes an independent variable (IV) within the data that is compared against dependent variables (DVs) identified according to the mission functions of the TOC. The DVs for the C2S are performance measures that are critical combat tasks performed by the system; examples are "attacking to seize an objective", "seizure of key terrain", and "river crossings". A list of expected critical combat tasks has been prepared from the literature and subject matter expert (SME) input. After the exercise is over, the success of the critical tasks attempted by the C2S during the wargame is established through evaluator assessments, if available, and/or TOC staff self-analysis and reporting as presented during after-action reviews.
The second part of Phase I applies datamining procedures, including neural networks, in a constrained format to analyze the data. The term "constrained" means that the identification of the outputs/DVs is known. The process is to identify those IVs that significantly contribute to the constrained DVs. A neural network is then constructed in which each IV forms an input node and each DV forms an output node, with one layer of hidden nodes completing the network. The number of hidden nodes and layers is determined through iterative analysis of the network. The completed network is trained to replicate the output conditions through iterative epoch executions and is then pruned to remove input nodes that do not contribute significantly to the output condition. Once the network is pruned through iterative executions, the resulting branches are used to develop algorithmic descriptors of the system in the form of regression-like expressions.
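A minimal sketch of this construct-train-prune loop follows. It uses permutation importance as a stand-in for the weight-based pruning of input nodes described above, and the data, layer size, and threshold are illustrative assumptions.

```python
# Sketch: one-hidden-layer network over candidate IVs, then prune inputs with
# negligible permutation importance and refit on the retained IVs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                            # 8 candidate IVs (stand-in)
y = 2 * X[:, 0] - X[:, 3] + 0.1 * rng.normal(size=200)   # DV depends on 2 IVs

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                   random_state=0).fit(X, y)

imp = permutation_importance(net, X, y, n_repeats=10, random_state=0)
keep = imp.importances_mean > 0.01                       # prune non-contributing inputs
pruned = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                      random_state=0).fit(X[:, keep], y)
print("retained IV indices:", np.flatnonzero(keep))
```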
For Phase II these algorithmic expressions are incorporated into the CoHOST discrete event computer simulation model of the C2S. The programming environment is the commercial programming language Micro Saint™ running on a PC microcomputer. An interrogation approach was developed to query these algorithms within the computer simulation to determine if they allow the simulation to reflect the activities observed in the real TOC to within an acceptable degree of accuracy.
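The interrogation step can be pictured as driving the embedded expressions and checking their predictions against field observations under an error threshold. The sketch below is a toy version of that check; the regression expression, the observed values, and the 10% threshold are illustrative assumptions.

```python
# Sketch of the interrogation step: evaluate an embedded regression-like
# expression and compare it with observed TOC measurements.
def sim_task_time(message_load):
    # hypothetical regression expression embedded in the simulation
    return 1.8 + 0.12 * message_load

observed = {10: 3.1, 20: 4.3, 30: 5.2}   # stand-in TOC measurements (minutes)

errors = [abs(sim_task_time(ml) - t) / t * 100 for ml, t in observed.items()]
acceptable = 10.0                         # assumed percent-error threshold
verdict = "valid" if max(errors) <= acceptable else "revise model"
print(f"max error {max(errors):.1f}% -> {verdict}")
```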
The purpose of this dissertation is to introduce the COMPASS concept that is a paradigm for developing techniques and procedures to translate as much of the performance of the entire TOC system as possible to an existing computer simulation that would be suitable for analyses of future system configurations.
The approach consists of the following steps:
• Naturalistic observation of the real system using ethnographic techniques.
• Data analysis using datamining techniques such as neural networks.
• Development of mathematical models of TOC performance activities.
• Integration of the mathematical into the CoHOST computer simulation.
• Interrogation of the computer simulation.
• Assessment of the level of accuracy of the computer simulation.
• Validation of the process as a viable system simulation approach. / Ph. D.