21

Scheduling Distributed Real-Time Tasks in Unreliable and Untrustworthy Systems

Han, Kai 06 May 2010 (has links)
In this dissertation, we consider scheduling distributed soft real-time tasks in unreliable (e.g., those with arbitrary node and network failures) and untrustworthy systems (e.g., those with Byzantine node behaviors). We present a distributed real-time scheduling algorithm called Gamma. Gamma considers a distributed (i.e., multi-node) task model where tasks are subject to Time/Utility Function (or TUF) end-to-end time constraints, and the scheduling optimality criterion of maximizing the total accrued utility. The algorithm makes three novel contributions. First, Gamma uses gossip for reliably propagating task scheduling parameters and for discovering task execution nodes. Second, Gamma achieves distributed real-time mutual exclusion in unreliable environments. Third, the algorithm guards against potential disruption of message propagation due to Byzantine attacks using a mechanism called Launcher-Attacker-Infective-Susceptible-Immunized-Removed-Consumer (or LAISIRC). By doing so, the algorithm schedules tasks with probabilistic termination-time satisfactions, despite system unreliability and untrustworthiness. We analytically establish several timeliness and non-timeliness properties of the algorithm, including probabilistic end-to-end task termination-time satisfactions, optimality of message overheads, mutual exclusion guarantees, and the mathematical model of the LAISIRC mechanism. We conducted simulation-based experimental studies and compared Gamma with its competitors. Our experimental studies reveal that Gamma's scheduling algorithm accrues greater utility and satisfies a greater number of deadlines than competitor algorithms (e.g., HVDF), by as much as 47% and 45%, respectively. LAISIRC is more tolerant of Byzantine attacks than competitor protocols (e.g., Path Verification), achieving up to a 28% higher correctness ratio. Gamma's mutual exclusion algorithm accrues greater utility than competitor algorithms (e.g., EDF-Sigma) by as much as 25%. Further, we implemented the basic Gamma algorithm in the Emulab/ChronOS 250-node testbed and measured the algorithm's performance. Our implementation measurements validate our theoretical analysis and confirm the algorithm's effectiveness and robustness. / Ph. D.
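As a brief illustration of the Time/Utility Function model and the utility-accrual objective named above (a generic sketch; the symbols u_i, d_i, and f_i are illustrative and not the dissertation's notation): each task accrues utility according to a function of its completion time, and the scheduler aims to maximize the total utility accrued across all tasks.

    step TUF:    U_i(t) = u_i  if t ≤ d_i,  and  U_i(t) = 0  otherwise
                 (u_i = the task's importance, d_i = its termination time)
    objective:   maximize  Σ_i U_i(f_i),  where f_i is the completion time of task i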
22

Optimal investment in incomplete financial markets

Schachermayer, Walter January 2002 (has links) (PDF)
We give a review of classical and recent results on maximization of expected utility for an investor who has the possibility of trading in a financial market. Emphasis will be given to the duality theory related to this convex optimization problem. For expository reasons we first consider the classical case where the underlying probability space is finite. This setting has the advantage that the technical difficulties of the proofs are reduced to a minimum, which allows for a clearer insight into the basic ideas, in particular the crucial role played by the Legendre-transform. In this setting we state and prove an existence and uniqueness theorem for the optimal investment strategy, and its relation to the dual problem; the latter consists in finding an equivalent martingale measure optimal with respect to the conjugate of the utility function. We also discuss economic interpretations of these theorems. We then pass to the general case of an arbitrage-free financial market modeled by an R^d-valued semi-martingale. In this case some regularity conditions have to be imposed in order to obtain an existence result for the primal problem of finding the optimal investment, as well as for a proper duality theory. It turns out that one may give a necessary and sufficient condition, namely a mild condition on the asymptotic behavior of the utility function, its so-called reasonable asymptotic elasticity. This property allows for an economic interpretation motivating the term "reasonable". The remarkable fact is that this regularity condition only pertains to the behavior of the utility function, while we do not have to impose any regularity conditions on the stochastic process modeling the financial market (to be precise: of course, we have to require the arbitrage-freeness of this process in a proper sense; also we have to assume in one of the cases considered below that this process is locally bounded; but otherwise it may be an arbitrary R^d-valued semi-martingale). (author's abstract) / Series: Report Series SFB "Adaptive Information Systems and Modelling in Economics and Management Science"
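For orientation, the primal problem, its dual, and the elasticity condition discussed above can be written in standard notation (not necessarily the author's) as follows:

    primal:  u(x) = sup_H E[ U( x + (H · S)_T ) ],  H ranging over admissible trading strategies
    dual:    v(y) = inf_{Q ∈ M^e(S)} E[ V( y dQ/dP ) ],  with conjugate  V(y) = sup_{x>0} ( U(x) − x y )
    reasonable asymptotic elasticity:  AE(U) = limsup_{x→∞} x U'(x) / U(x) < 1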
23

權重效用在網路問題上之研究 / A Study on Weighted Utilizations of Network Dimensioning Problems

程雅惠, Cheng, Ya Hui Unknown Date (has links)
We propose two mathematical models with weighted utility functions for fair bandwidth allocation and QoS routing in communication networks that offer multiple services to several classes of users. The formulation and numerical experiments are carried out in a general utility-maximizing framework. Instead of being fixed, the weight of each class's utility function is treated as a decision variable, and the objective of this thesis is to characterize the structure of the optimal weights that maximize the weighted sum of utilities of the bandwidth allocation for each class. Two models, differing in their notion of fairness, are constructed and compared. For Model I, the optimal weights form a vector that assigns 1 to the single class with the largest utility value and 0 to every other class. For Model II, the optimal weights are all equal and sum to 1. These results are proved and illustrated numerically with the GAMS software.
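A minimal statement of the weighted-utility formulation described above, using generic symbols (x_k for the bandwidth allocated to class k, C for a link capacity) rather than the thesis's exact notation:

    maximize     Σ_k w_k U_k(x_k)
    subject to   Σ_k x_k ≤ C,   w_k ≥ 0,   Σ_k w_k = 1,
    over both the allocations x_k and the weights w_k.

Under this reading, the optimal weight vectors reported above take the form (0, ..., 0, 1, 0, ..., 0) for Model I and (1/n, ..., 1/n) for Model II.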
24

Internet Congestion Control: Complete Stability Region for PI AQM and Bandwidth Allocation in Networked Control

Al-Hammouri, Ahmad Tawfiq January 2008 (has links)
No description available.
25

Théorie des options et fonctions d'utilité : stratégies de couverture en présence des fluctuations non gaussiennes / Options theory and utility functions : hedging strategies in the presence of non-gaussian fluctuations

Hamdi, Haykel 04 March 2011 (has links)
The traditional approach to derivatives consists, under certain well-defined hypotheses, in constructing hedging strategies with strictly zero risk. In the general case, however, such "perfect" hedging strategies do not exist, and the theory must instead rely on the idea of risk minimization; the optimal hedge then depends on the measure of risk being minimized. In the context of options, this work considers a new risk measure based on the expected-utility approach, which accounts both for the fourth moment, which is more sensitive to large fluctuations than the variance, and for the risk aversion of the option's writer. Compared with delta hedging, variance minimization, and fourth-moment minimization, the hedging strategy obtained from the expected-utility approach reduces the sensitivity of the hedge to the price of the underlying asset. This tends to reduce the associated transaction costs.
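To make the comparison above concrete, here is a one-period sketch of the competing hedging criteria for the writer of a European call; the notation (premium C, hedge ratio φ, strike K, terminal wealth W_0 + ΔW) is illustrative and not taken from the thesis:

    writer's P&L:            ΔW = C + φ (S_T − S_0) − max(S_T − K, 0)
    delta hedge:             φ = ∂C/∂S_0  (the option's delta)
    variance hedge:          choose φ to minimize  Var[ΔW]
    fourth-moment hedge:     choose φ to minimize  E[ (ΔW − E[ΔW])^4 ]
    expected-utility hedge:  choose φ to maximize  E[ U(W_0 + ΔW) ],  with U concave, encoding the writer's risk aversion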
