131 |
Novel Application Models and Efficient Algorithms for Offloading to Clouds. González Barrameda, José Andrés, January 2017.
The application offloading problem in Mobile Cloud Computing aims to improve the mobile user experience by leveraging the resources of the cloud. The execution of the mobile application is offloaded to the cloud, saving energy at the mobile device or speeding up the execution of the application. We improve the accuracy and performance of application offloading solutions in three main directions. First, we propose a novel fine-grained application model that supports complex module dependencies such as sequential, conditional and parallel module executions. The model also allows for multiple offloading decisions that are tailored to the current application, network, or user context. As a result, the model is more precise in capturing the structure of the application and supports more complex offloading solutions. Second, we propose three cost models, namely average-based, statistics-based and interval-based, defined for the proposed application model. The average-based approach models each module cost by its expected value, and the expected cost of the entire application is estimated considering each of the three module dependencies. The novel statistics-based cost model employs cumulative distribution functions (CDFs) to represent the costs of the modules and of the mobile application, which is estimated from the costs and dependencies of the modules. This cost model opens the door to new statistics-based optimization functions and constraints, whereas the state of the art only supports optimizations based on the average running cost of the application. Furthermore, this cost model can be used to perform statistical analysis of the performance of the application in different scenarios, such as varying network data rates. The last cost model, the interval-based one, represents the module costs via intervals in order to address cost uncertainty while having lower requirements and computational complexity than the statistics-based model. The cost of the application is estimated as an expected maximum cost via a linear optimization function. Finally, we present offloading decision algorithms for each cost model. For the average-based model, we present a fast optimal dynamic programming algorithm. For the statistics-based model, we present another fast optimal dynamic programming algorithm for the scenario where the optimization function meets specific properties. Finally, for the interval-based cost model, we present a robust formulation that solves a linear number of linear optimization problems. Our evaluations verify the accuracy of the models and show higher cost savings for our solutions when compared to the state of the art.
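To make the three dependency types concrete, the sketch below shows how per-module cost distributions might compose under sequential, parallel, and conditional execution. It is a minimal illustration under an independence assumption on a common discretized cost grid; the function names and example numbers are ours, not the thesis's.

```python
import numpy as np

def seq_cost(pmf_a, pmf_b):
    """Sequential execution: total cost is the sum, so the PMF is a convolution."""
    return np.convolve(pmf_a, pmf_b)

def par_cost_cdf(cdf_a, cdf_b):
    """Parallel execution: cost is the max of the branches, so CDFs multiply (same grid)."""
    return cdf_a * cdf_b

def cond_cost(pmf_a, pmf_b, p):
    """Conditional execution: branch A with probability p, else branch B (mixture)."""
    n = max(len(pmf_a), len(pmf_b))
    a = np.pad(pmf_a, (0, n - len(pmf_a)))
    b = np.pad(pmf_b, (0, n - len(pmf_b)))
    return p * a + (1 - p) * b

# Example: two sequential modules with costs uniform on {1,2,3} and {2,3,4} grid units
m1 = np.zeros(5); m1[1:4] = 1 / 3
m2 = np.zeros(5); m2[2:5] = 1 / 3
total = seq_cost(m1, m2)
print("P(total cost <= 5 units) =", total[:6].sum())  # 2/3 for this toy example
```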
|
132 |
Design and optimization of wireless backhaul networks. Kodjo, Alvinice, 18 December 2014.
The main work of this thesis focuses on wireless backhaul networks. We studied different optimization problems in such networks that represent real challenges for the industrial sector. The first problem addressed is capacity allocation on the links at minimum cost. It was solved by a linear programming approach with column generation, and our method solves the problem on large networks. We then studied the problem of network infrastructure sharing between virtual operators, where the objective is to maximize the revenue of the operator of the physical infrastructure while satisfying the demands and quality-of-service constraints of the virtual operators that are customers of the network. In this context, we proposed a robust formulation of the problem as a mixed-integer linear program. Another source of expense in this type of network is energy consumption; we proposed a robust energy-aware routing solution that lets network operators reduce their consumption, again formulated as a mixed-integer linear program, together with heuristics that find good solutions quickly for large instances. The last work of this thesis focuses on cognitive radio networks and, more specifically, on the problem of bandwidth sharing. We formalized it using a linear program with a different approach to robust optimization, basing our solution on the two-stage robust optimization method.
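The robust formulations mentioned above are in the budgeted-uncertainty family; the following sketch illustrates the idea on a single shared link, computing the robust capacity both directly and via the standard LP reformulation. Demands, deviations, and the protection level Gamma are invented for illustration, not taken from the thesis instances.

```python
import numpy as np
from scipy.optimize import linprog

# Budgeted (Bertsimas-Sim style) robustness: demands d may deviate upward by at
# most dhat, and at most Gamma of them deviate simultaneously.
d = np.array([10.0, 8.0, 6.0, 4.0])      # nominal demands of the virtual operators
dhat = np.array([5.0, 4.0, 3.0, 2.0])    # maximum upward deviations
Gamma = 2                                 # protection level (budget of uncertainty)

# Direct evaluation: the worst case picks the Gamma largest deviations.
worst_extra = np.sort(dhat)[::-1][:Gamma].sum()
cap_direct = d.sum() + worst_extra

# LP-dual evaluation (the standard robust-counterpart reformulation):
#   min Gamma*z + sum(p_i)  s.t.  z + p_i >= dhat_i,  z, p >= 0
n = len(d)
c = np.concatenate(([Gamma], np.ones(n)))
A_ub = np.hstack((-np.ones((n, 1)), -np.eye(n)))   # -(z + p_i) <= -dhat_i
res = linprog(c, A_ub=A_ub, b_ub=-dhat, bounds=[(0, None)] * (n + 1))
cap_dual = d.sum() + res.fun
print(cap_direct, cap_dual)  # both give the same required robust link capacity
```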
|
133 |
Robust Deep Reinforcement Learning for Portfolio Management. Masoudi, Mohammad Amin, 27 September 2021.
In finance, the use of Automated Trading Systems (ATS) in markets is growing every year, and trades generated by algorithms now account for most of the orders that arrive at stock exchanges (Kissell, 2020). Historically, these systems were based on advanced statistical methods and signal processing designed to extract trading signals from financial data. The recent success of machine learning has attracted the interest of the financial community, and reinforcement learning, a subcategory of machine learning, has been broadly applied by investors and researchers in building trading systems (Kissell, 2020). In this thesis, we address the issue that deep reinforcement learning may be susceptible to sampling errors and over-fitting, and propose a robust deep reinforcement learning method that integrates techniques from reinforcement learning and robust optimization. We back-test and compare the performance of the developed algorithm, Robust DDPG, with the UBAH (Uniform Buy and Hold) benchmark and other RL algorithms, and show that the robust algorithm of this research can reduce the downside risk of an investment strategy significantly and can ensure a safer path for the investor's portfolio value.
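One way to picture the robust ingredient is a pessimistic reward: score a portfolio by its worst return over an uncertainty set around the observed asset returns. The sketch below assumes a simple box uncertainty set sampled by Monte Carlo; epsilon, the sample count, and the data are illustrative assumptions, not the thesis's training setup.

```python
import numpy as np

def robust_reward(weights, returns, eps=0.01, n_samples=256, rng=None):
    """Worst-case portfolio log-return over perturbed return vectors (box set)."""
    rng = rng or np.random.default_rng(0)
    perturb = rng.uniform(-eps, eps, size=(n_samples, len(returns)))
    candidates = (1.0 + returns + perturb) @ weights   # gross returns per sample
    return np.log(candidates.min())                     # pessimistic reward

w = np.array([0.5, 0.3, 0.2])            # portfolio weights (sum to 1)
r = np.array([0.002, -0.001, 0.0005])    # one day's observed asset returns
print(robust_reward(w, r))               # feed this, not the realized return, to the agent
```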
|
134 |
Robust optimization in classification and regression problems. Semela, Ondřej, January 2016.
In this thesis, we present selected methods of regression and classification analysis from the viewpoint of robust optimization, which aims to compensate for data imprecision and measurement errors. In the first part, the ordinary least squares method and its generalizations derived within the context of robust optimization, ridge regression and the Lasso method, are introduced. The connection between robust least squares and the stated generalizations is also shown. Theoretical results are accompanied by a simulation study investigating the robustness of the stated methods from a different perspective. In the second part, we define a modern classification method, Support Vector Machines (SVM). Using the obtained knowledge, we formulate a robust SVM method, which can be applied in robust classification. The final part is devoted to the biometric identification of typing style and of individuals based on keystroke dynamics, using the formulated theory.
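The ridge half of that connection is easy to demonstrate: Tikhonov-regularized least squares has a closed form, and the regularizer can be read as protection against norm-bounded perturbations of the design matrix. A minimal sketch on synthetic data, with an illustrative penalty lam:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 5))
beta_true = np.array([1.0, -2.0, 0.0, 0.5, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)

def ridge(X, y, lam):
    """Closed-form ridge estimator: (X'X + lam*I)^{-1} X'y; lam=0 recovers OLS."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

print("OLS:  ", ridge(X, y, 0.0))
print("ridge:", ridge(X, y, 1.0))  # coefficients shrink toward zero
```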
|
135 |
Material design using surrogate optimization algorithm. Khadke, Kunal R., 28 February 2015.
Nanocomposite ceramics have been widely studied in order to tailor desired properties at high temperatures, but methodologies for material design are still under development. While finite element modeling (FEM) provides significant insight into material behavior, few design researchers have addressed the design paradox that accompanies the rapid expansion of the design space. A surrogate optimization model management framework has been proposed to make this design process tractable. In the surrogate optimization material design tool, the analysis cost is reduced by performing simulations on the surrogate model instead of the high-fidelity finite element model. The methodology is applied to find the optimal number of silicon carbide (SiC) particles in a silicon nitride (Si3N4) composite with maximum fracture energy [2]. Along with a deterministic optimization algorithm, model uncertainties have also been considered through the use of a robust design optimization (RDO) method, ensuring a design with minimum sensitivity to changes in the parameters. Applied to nanocomposite design, these methodologies significantly reduce cost and design cycle time.
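The core loop of such a surrogate framework can be sketched in a few lines: fit a cheap model to the evaluations gathered so far, optimize it, and spend the expensive simulation budget only at the surrogate's optimum. Below, a toy 1-D function stands in for the finite element model and a polynomial stands in for the surrogate; both choices are ours, not the thesis's.

```python
import numpy as np

def expensive_model(x):                 # placeholder for an FEM simulation
    return (x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

xs = list(np.linspace(0.0, 1.0, 4))     # initial designs
ys = [expensive_model(x) for x in xs]
grid = np.linspace(0.0, 1.0, 1001)

for _ in range(10):
    coeffs = np.polyfit(xs, ys, deg=min(3, len(xs) - 1))  # cheap polynomial surrogate
    x_next = grid[np.argmin(np.polyval(coeffs, grid))]    # optimize the surrogate
    xs.append(x_next)
    ys.append(expensive_model(x_next))                    # one expensive evaluation

best = xs[int(np.argmin(ys))]
print("best design found:", best)
```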
|
136 |
Risk-Averse and Distributionally Robust Optimization: Methodology and Applications. Rahimian, Hamed, 11 October 2018.
No description available.
|
137 |
An Interactive Intelligent Decision Support System for Integration of Inventory, Planning, Scheduling and Revenue Management. Ardjmand, Ehsan, 17 September 2015.
No description available.
|
138 |
Integrated and Coordinated Relief Logistics Planning Under Uncertainty for Relief Logistics Operations. Kamyabniya, Afshin, 22 September 2022.
In this thesis, we explore three critical emergency logistics problems faced by healthcare and humanitarian relief service providers for short-term post-disaster management.
In the first manuscript, we investigate various integration mechanisms (fully integrated horizontal-vertical, horizontal, and vertical resource sharing) following a natural disaster, for a multi-patient logistics network handling multiple types of whole-blood-derived platelets. The goal is to reduce the shortage and wastage of platelets across blood groups in the response phase of relief logistics operations. To solve the logistics model for large-scale problems, we develop a hybrid exact solution approach combining augmented epsilon-constraint and Lagrangian relaxation algorithms, and demonstrate the model's applicability in a case study of an earthquake. Because the number of injuries needing the various types of blood-derived platelets is uncertain, we apply a robust optimization version of the proposed model that captures the expected performance of the system. The results show that the platelets logistics network under coordinated and integrated mechanisms controls the levels of shortage and wastage better than a non-integrated network.
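For readers unfamiliar with the augmented epsilon-constraint method, the sketch below traces a Pareto front for a two-objective toy LP by bounding one objective and sweeping the bound, with a small slack reward so the points found are properly efficient. The two-variable LP is a stand-in for illustration, not the platelets model.

```python
import numpy as np
from scipy.optimize import linprog

c1 = np.array([1.0, 2.0])     # objective 1 (e.g., a shortage proxy)
c2 = np.array([3.0, 1.0])     # objective 2 (e.g., a wastage proxy)
A = np.array([[-1.0, -1.0]])  # x1 + x2 >= 4 (demand coverage), written as <=
b = np.array([-4.0])
delta = 1e-3                  # small augmentation weight on the slack

for eps in np.linspace(4.0, 12.0, 5):
    # variables: x1, x2, s (slack of the epsilon constraint c2.x + s <= eps)
    c = np.concatenate((c1, [-delta]))                   # min f1 - delta*s
    A_ub = np.vstack((np.hstack((A, [[0.0]])),           # original constraints
                      np.concatenate((c2, [1.0]))[None, :]))
    res = linprog(c, A_ub=A_ub, b_ub=np.concatenate((b, [eps])),
                  bounds=[(0, None)] * 3)
    if res.success:
        x = res.x[:2]
        print(f"eps={eps:.1f}  f1={c1 @ x:.2f}  f2={c2 @ x:.2f}")  # one Pareto point per eps
```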
In the second manuscript, we propose a two-stage casualty evacuation model that routes patients with different injury levels during wildfires. The first stage selects field hospitals, and the second stage determines the number of patients that can be transferred to the selected hospitals or shelters via the different routes of the evacuation network. The goal of this model is to reduce the evacuation response time, which ultimately increases the number of people evacuated from assembly points under limited time windows. To solve the model for large-scale problems, we develop a two-step meta-heuristic algorithm. To account for multiple sources of uncertainty, a flexible robust approach that considers the worst-case and expected performance of the system simultaneously is applied to handle any realization of the uncertain parameters. The results show that the fully coordinated evacuation model, in which vehicles can freely pick up and off-board patients at different locations and may start their next operations without being forced to return to the departure point (the evacuation assembly points), outperforms the non-coordinated and non-integrated evacuation models in terms of the number of evacuated patients.
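The flexible robust criterion can be illustrated independently of the evacuation model: score each candidate plan by a weighted combination of its worst-case and expected cost over a scenario set, then pick the best plan. The cost matrix and weights below are invented for illustration.

```python
import numpy as np

costs = np.array([[10.0, 12.0, 30.0],    # plan A: cheap on average, bad worst case
                  [15.0, 16.0, 17.0],    # plan B: stable across scenarios
                  [11.0, 20.0, 22.0]])   # plan C: in between
probs = np.array([0.5, 0.3, 0.2])        # scenario probabilities
lam = 0.6                                 # weight on the worst case

# Flexible robust score: lam * worst-case cost + (1 - lam) * expected cost
score = lam * costs.max(axis=1) + (1 - lam) * (costs @ probs)
print("chosen plan:", "ABC"[int(np.argmin(score))], "scores:", score)  # picks the stable plan B
```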
In the third manuscript, we propose an integrated transportation and hospital capacity model to optimize the assignment of medical resources to patients with multiple injury levels at the time of a mass casualty incident (MCI). We develop a finite-horizon Markov decision process (MDP) to efficiently allocate resources and hospital capacities to injured people in a dynamic fashion over a limited time horizon. We solve this model using the linear programming approach to approximate dynamic programming (ADP), and by developing a two-phase heuristic based on a column generation algorithm. The results show that better policies can be derived for allocating limited resources (i.e., vehicles) and hospital capacities to injured people compared with the benchmark.
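A finite-horizon MDP of this flavor can be solved exactly at toy scale by backward induction, which is also the baseline that ADP approximates. The sketch below dispatches vehicles over a short horizon; the horizon, fleet size, and reward numbers are illustrative assumptions, not the thesis model.

```python
import numpy as np

T, V = 4, 3                   # periods, vehicles in the fleet
p_save = 0.8                  # expected patients served per dispatched vehicle
cost = [0.1, 0.2, 0.4, 0.9]   # per-vehicle dispatch cost, rising as conditions worsen

value = np.zeros(V + 1)       # terminal value, indexed by vehicles remaining
policy = np.zeros((T, V + 1), dtype=int)

for t in reversed(range(T)):
    new_value = np.zeros(V + 1)
    for v in range(V + 1):                  # v = vehicles still available
        best, best_a = -np.inf, 0
        for a in range(v + 1):              # a = vehicles dispatched this period
            q = a * (p_save - cost[t]) + value[v - a]
            if q > best:
                best, best_a = q, a
        new_value[v], policy[t, v] = best, best_a
    value = new_value

print("optimal dispatches (rows = period, cols = vehicles left):\n", policy)
```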
Each paper makes a worthwhile contribution to the humanitarian relief operations literature and can help relief and healthcare providers optimize resource and service logistics by applying the proposed integration and coordination mechanisms.
|
139 |
Robust and Equitable Public Health Screening Strategies, with Application to Genetic and Infectious Diseases. El Hajj, Hussein Mohammad, 7 June 2021.
Public health screening plays an important role in the overall healthcare system. As an example, consider newborn screening, a state-level initiative that screens newborns for life-threatening genetic disorders for which early treatment can substantially improve health outcomes. Another topical example is in the realm of infectious disease screening, e.g., screening for COVID-19.
The common features of both public health screening problems include large testing populations and resource limitations that inhibit screening efforts. Cost is a major barrier to the inclusion of genetic disorders in newborn screening, and thus screening must be both highly accurate and efficient; and for COVID-19, limited testing kits, and other shortages, have been major barriers to screening efforts. Further, for both newborn screening and infectious disease screening, equity (reducing health disparities among different sub-populations) is an important consideration.
We study the testing process design for newborn screening for genetic diseases, considering cystic fibrosis as a model disorder. Our optimization-based models take into account disease-related parameters, subject risk factors, test characteristics, parameter uncertainty, and limited testing resources so as to design equitable, accurate, and robust screening processes that classify newborns as positive or negative for cystic fibrosis. Our models explicitly consider the trade-off between false-negatives, which lead to missed diagnoses, and the required testing resources; and the trade-off between the accuracy and equity of screening. We also study the testing process design for infectious disease screening, considering COVID-19 as a model disease. Our optimization-based models account for key subject risk factors that are important to consider, including the likelihood of being disease-positive, and the potential harm that could be averted through testing and the subsequent interventions. Our objectives include the minimization of harm (through detection and mitigation) or maximization of testing coverage.
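To see the harm-minimization objective in its simplest form, suppose one test type, no pooling, and a fixed budget of kits; the problem then collapses to ranking subjects by expected averted harm. The sketch below uses synthetic risks and benefits and is a deliberate simplification of the models in the thesis, which handle test accuracy, equity, and uncertainty as well.

```python
import numpy as np

rng = np.random.default_rng(7)
n, budget = 1000, 200
p_pos = rng.beta(1, 20, size=n)            # each subject's risk of being positive
harm_averted = rng.uniform(1, 10, size=n)  # benefit if that subject is caught

expected_benefit = p_pos * harm_averted
tested = np.argsort(expected_benefit)[::-1][:budget]   # greedy: test the top-k subjects
print("expected harm averted:", expected_benefit[tested].sum())
```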
These are complex problems. We develop novel mathematical models and characterize key structural properties of optimal solutions. This, in turn, allows the development of effective and efficient algorithms that exploit these structural properties. These algorithms are either polynomial- or pseudo-polynomial-time algorithms, and are able to solve realistic-sized problems efficiently. Our case studies on cystic fibrosis screening and COVID-19 screening, based on realistic data, underscore the value of the proposed optimization-based approaches for public health screening, compared to current practices. Our findings have important implications for public policy.
|
140 |
Asset-liability modelling and pension schemes: the application of robust optimization to USS. Platanakis, Emmanouil; Sutcliffe, C., 8 May 2015.
This paper uses a novel numerical optimization technique, robust optimization, that is well suited to solving the asset-liability management (ALM) problem for pension schemes. It requires the estimation of fewer stochastic parameters, reduces estimation risk and adopts a prudent approach to asset allocation. This study is the first to apply it to a real-world pension scheme, and the first ALM model of a pension scheme to maximize the Sharpe ratio. We disaggregate pension liabilities into three components (active members, deferred members and pensioners) and transform the optimal asset allocation into the scheme's projected contribution rate. The robust optimization model is extended to include liabilities and used to derive optimal investment policies for the Universities Superannuation Scheme (USS), benchmarked against the Sharpe and Tint, Bayes-Stein and Black-Litterman models as well as the actual USS investment decisions. Over a 144-month out-of-sample period, robust optimization is superior to the four benchmarks across 20 performance criteria and has a remarkably stable asset allocation, essentially fix-mix. These conclusions are supported by six robustness checks.
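As a rough illustration of a robust Sharpe-ratio portfolio step, the sketch below shrinks estimated returns by a multiple of their estimation errors before maximizing the Sharpe ratio over long-only weights. The inputs and the kappa penalty are invented for illustration; this is not the paper's USS calibration.

```python
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.05, 0.07, 0.03])            # estimated expected returns
se = np.array([0.02, 0.03, 0.01])            # estimation errors of mu
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.01],
                  [0.00, 0.01, 0.02]])        # return covariance matrix
rf, kappa = 0.01, 1.0
mu_rob = mu - kappa * se                      # pessimistic (robust) return estimates

def neg_sharpe(w):
    """Negative Sharpe ratio under the shrunken returns (for minimization)."""
    return -(w @ mu_rob - rf) / np.sqrt(w @ Sigma @ w)

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)  # fully invested
res = minimize(neg_sharpe, x0=np.ones(3) / 3, bounds=[(0, 1)] * 3, constraints=cons)
print("robust weights:", res.x.round(3), "Sharpe:", round(-res.fun, 3))
```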
|