21

Some behavioral aspects of eliciting utility (using the MacCrimmon-Toda method for ordinal utility and the standard gamble method for cardinal utility)

Wong, Eugene January 1973

This study investigates some behavioral aspects and properties of eliciting utility. Previous investigations devoted to empirical utility measurement have stemmed from the work of experimentalists who have applied various utility models in an effort to measure utility. However, empirical studies of the behavioral factors which may bias the measurement are lacking, and it is this gap in the utility literature that prompted our empirical study. We chose to examine the standard gamble method for deriving von Neumann-Morgenstern cardinal utility and the MacCrimmon-Toda method for deriving indifference curves. The domain of choice involved hospital days in bed with risk of additional days. The analysis consisted of identifying relationships between behavioral factors and properties of choice predictions obtained by the methods. Furthermore, the study also provided a means for comparing properties of the two methods for eliciting utility. Among other findings, the results show that not all subjects expressed agreement with the appropriateness of specific axioms of behavior which underlie some methods for eliciting utility, and that not all people express constant sensitivity over all stimuli levels. These two results suggest that a priori assumptions regarding "rationality" and infinite sensitivity may have to be reexamined. The preferences elicited by both methods seem to suggest that the subjects follow a linear rule to trade off sure outcome and risk. Although correspondence between test-retest preferences predicted by the standard gamble was generally closer than that for the MacCrimmon-Toda method, the MacCrimmon-Toda method had generally better predictive ability. Our results also indicate that certain behavioral factors seem to affect preferences predicted by the methods as we hypothesized. This observation has implications for practical measurement of utility, since "successful" application of methods for eliciting preferences depends upon our awareness of which behavioral factors may bias the measurement. / Business, Sauder School of / Operations and Logistics (OPLOG), Division of / Graduate
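As an aside on the standard gamble named above: it assigns a sure outcome x the utility u(x) = p*, where p* is the probability at which the respondent is indifferent between x for certain and a lottery giving the best outcome with probability p and the worst with probability 1 - p. A minimal sketch of that elicitation loop, assuming a bisection search and a simple yes/no preference query (the function names and response model are illustrative, not Wong's experimental protocol):

```python
def standard_gamble_utility(prefers_sure_thing, tol=1e-3):
    """Elicit u(x) for an intermediate outcome x on a 0-1 scale.

    `prefers_sure_thing(p)` is assumed to answer True when the respondent
    prefers x for certain over a lottery giving the best outcome with
    probability p and the worst outcome with probability 1 - p.
    The indifference probability found by bisection is taken as u(x),
    with u(worst) = 0 and u(best) = 1.
    """
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        p = (lo + hi) / 2.0
        if prefers_sure_thing(p):
            lo = p  # lottery not yet attractive enough; raise p
        else:
            hi = p  # lottery already preferred; lower p
    return (lo + hi) / 2.0

# Hypothetical respondent whose indifference point for outcome x is 0.35.
print(standard_gamble_utility(lambda p: p < 0.35))  # ~0.35
```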
22

Application of modern control techniques to power systems

Miniesy, Mohammed Samir Mohammed January 1971

A power system may be subjected to different types of disturbances. The control strategy to be taken in order to preserve system stability depends on the severity of the disturbance. For very severe disturbances, power system stability can be improved by sudden changes in the electric power network such as the insertion of braking resistors, generator dropping or load shedding. A unified treatment of optimum switching is presented by considering the switching instants to be elements of a generalized control vector. Dynamic optimization is then applied to determine optimum switching instants. Less severe disturbances can be overcome by employing governor and/or voltage regulator controls. The governor control problem for a large signal model of interconnected power plants is investigated via the multi-level concept. A two-level controller for interconnected power plants is discussed. Each plant has a first-level local optimal or suboptimal controller. The second level of control is an intervention control performed by a central co-ordinator. If a sudden system disturbance causes the system angular acceleration to exceed preset tolerances, a priority interrupt to the central co-ordinator initiates intervention control. Angular velocity deviations of all plants are transmitted to the co-ordinator. This data is used to generate coefficient data for each plant. On receiving its coefficient data, each plant generates a local second-level intervention control which augments first-level local control. The Load-Frequency Control problem, due to minor or routine disturbances caused by load changes, is investigated. Since the incremental power demand in a power system is not always known a priori, direct application of the optimum linear-state regulator to Load-Frequency Control is not possible. Furthermore, Load-Frequency Control generally requires the use of an integral-type control operation to meet the system operating specifications. This requirement is introduced into the formulation of the optimum Load-Frequency Control problem presented in this thesis. Two methods are suggested for demand identification. The first method makes use of differential approximation. The second method makes use of a Luenberger observer to identify unmeasured states. The optimum control is a linear function of measured states, identified unmeasured states, and the identified incremental power demand. A method is given for solving, suboptimally, the problem of optimum load-frequency sampled-data control with either unknown deterministic power demand or randomly varying system disturbances. It is shown how to modify an optimum continuous control to obtain optimum control in the case of discrete-data transmission and unknown deterministic demand. The case of random power demand and random disturbances is treated by introducing an adaptive observer. A three-stage systematic design procedure is given. The effectiveness of Load-Frequency Control using an adaptive observer is illustrated by an example. / Applied Science, Faculty of / Electrical and Computer Engineering, Department of / Graduate
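As a rough sketch of the observer-based idea mentioned above: a Luenberger observer propagates a state estimate through the model and corrects it with the measured output error, and the control is then a linear function of the estimated state. The matrices, gains, and dimensions below are invented placeholders chosen only so the sketch runs; they are not the power-system model from the thesis:

```python
import numpy as np

# Illustrative discrete-time plant x[k+1] = A x[k] + B u[k], y[k] = C x[k].
# A, B, C, the observer gain L, and the feedback gain K are placeholder values.
A = np.array([[1.0, 0.1],
              [0.0, 0.9]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5],
              [0.8]])          # observer gain (assumed stabilizing)
K = np.array([[1.0, 2.0]])     # state-feedback gain (assumed stabilizing)

def observer_step(x_hat, u, y):
    """One Luenberger-observer update: predict, then correct with the output error."""
    return A @ x_hat + B @ u + L @ (y - C @ x_hat)

x = np.array([[1.0], [0.0]])   # true state (second component unmeasured)
x_hat = np.zeros((2, 1))       # observer starts with no knowledge
for k in range(50):
    u = -K @ x_hat             # control is a linear function of the estimate
    y = C @ x
    x_hat = observer_step(x_hat, u, y)
    x = A @ x + B @ u

print("estimation error:", float(np.linalg.norm(x - x_hat)))
```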
23

Studies in utility theory

Larsson, Stig Owe January 1978

Since von Neumann and Morgenstern made their contributions, the expected utility criterion (EUC) has been the most accepted criterion in decision theory. Following their axiomatic approach justifying EUC, several other studies have been made suggesting the same criterion but under slightly different axiomatic systems. However, critics have found several simple decision problems (called paradoxes) which seem to contradict the conclusions of EUC; that is, the paradoxes contradict one or more of the axioms made to support EUC. The criticisms are based on empirical studies made in regard to the paradoxes. It is not always obvious, however, which axiom(s) is not accepted, since each approach to EUC gives a set of sufficient rather than necessary assumptions for EUC to hold. In Part I of the thesis a set of axioms which are necessary for EUC to hold is specified. Each of these axioms contains a basic assumption of a decision maker's behaviour. Therefore by considering the paradoxes in terms of these axioms, a better understanding is obtained with regard to which properties of EUC seem to be contradicted by the paradoxes. The conclusion of this study shows that most people contradict EUC because it does not differentiate between a "known" risk and an "unknown" risk. In Knight's terminology, there is a distinction between decision making under risk and uncertainty. Most empirical studies show that these differences are of such substantial proportions that there is a questionable justification for using the expected utility criterion for decision making under uncertainty. Although many alternatives to EUC for decision making under uncertainty exist, there are very few criteria for decision problems which fall between risk and uncertainty, that is, partial risk problems. Those existing are of an ad hoc nature. As a normative theory the EUC is far superior to any of these criteria in spite of its lack of distinction between risk and uncertainty. In the second part of the thesis an alternative normative criterion is suggested for decision making under partial risk and uncertainty. As an extension of EUC, this criterion distinguishes between risk and uncertainty. This theory expands on Ellsberg's suggestion that "ambiguity" influences one's preference among a set of alternatives. In this extension a more precise definition of "ambiguity" is needed and one is suggested here as a relation on the inner and outer measure of an event. The extension of EUC is then obtained by considering a more general set function, termed P-measure, which would depend on a set's ambiguity rather than a probability measure on the sets of rewards. It is concluded by an axiomatic development that the P-measure must be a non-negative monotonic set function which is not necessarily additive. It is also shown that the standard paradoxes related to paradoxes based on "known" versus "unknown" probabilities may be explained by this method and would therefore suggest an alternative to EUC for decision making under partial risk and uncertainty. / Business, Sauder School of / Unknown
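The risk-versus-ambiguity distinction discussed above can be illustrated numerically with the classic two-colour Ellsberg setup. The evaluation below uses an ad hoc mix of lower and upper probabilities purely as a toy stand-in; it is not the thesis's axiomatically developed P-measure:

```python
def expected_utility(outcomes, probs, u):
    """Standard EUC: sum of probability-weighted utilities."""
    return sum(p * u(x) for x, p in zip(outcomes, probs))

def ambiguity_adjusted_value(outcomes, lower_p, upper_p, u, pessimism=0.7):
    """Illustrative non-additive evaluation: weight each outcome's utility by a
    convex mix of its lower and upper probability. A toy stand-in only."""
    return sum((pessimism * lo + (1 - pessimism) * hi) * u(x)
               for x, lo, hi in zip(outcomes, lower_p, upper_p))

u = lambda x: x ** 0.5   # an illustrative concave utility

# "Known" urn: 50/50 red/black.  "Unknown" urn: red proportion anywhere in [0, 1].
known   = expected_utility([100, 0], [0.5, 0.5], u)
unknown = ambiguity_adjusted_value([100, 0], lower_p=[0.0, 0.0],
                                   upper_p=[1.0, 1.0], u=u)
print(known, unknown)  # the ambiguous bet is valued lower whenever pessimism > 0.5
```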
24

Synthesizing multiattribute utility functions: a measurement theoretic approach

Bitters, David Lorin January 1981
No description available.
25

An Economic Evaluation of HIV-associated Facial Lipoatrophy Treatments: A Cost-utility Analysis

Peyasantiwong, Sirianong 16 February 2010

Introduction: Facial lipoatrophy is a stigmatizing hallmark of HIV-positive status, and can lead to poor social functioning. Information gleaned from an economic evaluation of facial lipoatrophy treatments would inform policy decision making concerning potential public insurance coverage. Methods: A decision-analytic model was used to estimate the lifetime costs and Quality Adjusted Life Years (QALYs) gained from treatments using either poly-L-lactic acid or polyalkylimide gel for HIV-positive patients. Disease progression probabilities and utilities were derived from the literature. Costs were obtained from interviews with physicians and product distributors. Findings: Incremental costs per QALY were $66,409 CAD/$57,352 CAD for poly-L-lactic acid, and $48,715 CAD/$45,457 CAD for polyalkylimide gel (societal perspective/Ministry of Health perspective). Sensitivity analysis did not have a significant effect on the lower incremental costs per QALY reported for polyalkylimide gel. Conclusion: Our base-case analysis revealed that treatments using polyalkylimide gel offer a lower incremental cost-utility ratio (ICUR) than treatments using poly-L-lactic acid.
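Figures like those above are incremental cost-utility ratios: the difference in lifetime cost between two strategies divided by the difference in QALYs gained. A tiny sketch of the arithmetic, with placeholder inputs rather than the thesis's actual cost and QALY estimates:

```python
def icur(cost_new, qaly_new, cost_comparator, qaly_comparator):
    """Incremental cost-utility ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

# Placeholder inputs for illustration only (CAD, lifetime horizon).
print(icur(cost_new=25_000, qaly_new=10.5,
           cost_comparator=5_000, qaly_comparator=10.1))  # 50000.0 CAD per QALY
```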
27

An analysis of multi-attribute utility theory as a model of internal control evaluations by external auditors

Farmer, Timothy Alan January 1983
No description available.
28

Optimality of Heuristic Schedulers in Utility Accrual Real-time Scheduling Environments

Basavaraj, Veena 11 July 2006

Scheduling decisions in soft real-time environments are based on a utility function. The goal of such schedulers is to use a best-effort approach to maximize the utility function and ensure graceful degradation at overloads. Utility Accrual (UA) schedulers use heuristics to maximize the accrued utility. Heuristic-based schedulers do not always yield the optimal schedule, even if one exists, because they do not explore the entire search space of task orderings. In distributed systems, local UA schedulers use the same heuristics along with deadline decomposition for task segments. To date, there has been no evaluation and analysis of the degree to which these polynomial-time, heuristic algorithms succeed in maximizing the total utility accrued. We implemented a preemptive, off-line static scheduling algorithm that performs an exhaustive search of all the possible task orderings to yield the optimal schedules. We simulated two important online dynamic UA schedulers, DASA-ND and LBESA, for different system loads, task models, and utility and load distribution patterns, and compared their performance with their corresponding optimal schedules. Our experimental analysis indicates that for most scenarios, both DASA-ND and LBESA create optimal schedules. When task utilities are equal or form a geometric sequence with an order of magnitude difference in their utility values, UA schedulers show more than 90% probability of being optimal for single-node workloads. Even though deadline decomposition substantially improves the optimality of both DASA-ND and LBESA under different scenarios for distributed workloads, it can adversely affect the scheduling decisions for some task sets we considered. / Master of Science
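DASA-style utility accrual heuristics order ready tasks by potential utility density (utility per unit of remaining execution time) and keep only those that remain feasible. The single-node sketch below captures that greedy idea; the task fields, the step-shaped utility model, and the feasibility test are simplifications of my own, not the published DASA-ND or LBESA algorithms:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    exec_time: float   # remaining execution time
    deadline: float    # utility is accrued only if the task finishes by this time
    utility: float     # utility gained on timely completion (step time/utility function)

def greedy_ua_schedule(tasks, now=0.0):
    """Order tasks by utility density, then keep those that still finish by their
    deadline under that order -- a rough stand-in for a UA heuristic's feasibility check."""
    order = sorted(tasks, key=lambda t: t.utility / t.exec_time, reverse=True)
    schedule, t, accrued = [], now, 0.0
    for task in order:
        if t + task.exec_time <= task.deadline:  # still feasible: accept it
            t += task.exec_time
            accrued += task.utility
            schedule.append(task.name)
    return schedule, accrued

tasks = [Task("A", 2.0, 4.0, 10.0),
         Task("B", 1.0, 3.0, 8.0),
         Task("C", 3.0, 5.0, 6.0)]
print(greedy_ua_schedule(tasks))  # (['B', 'A'], 18.0); C is dropped as infeasible
```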
29

Utility Accrual Real-Time Scheduling Under Variable Cost Functions

Balli, Umut 15 August 2005

We present a utility accrual real-time scheduling algorithm called CIC-VCUA for tasks whose execution times are functions of their starting times. We model such variable execution times employing variable cost functions (or VCFs). The algorithm considers application activities that are subject to time/utility function time constraints (or TUFs), execution times described using VCFs, and concurrent, mutually exclusive sharing of non-CPU resources. We consider the multi-criteria scheduling objective of (1) assuring that the maximum interval between any two consecutive, successful completions of jobs of a task must not exceed a specified upper bound, and (2) maximizing the system's total accrued utility, while satisfying mutual exclusion resource constraints. Since the scheduling problem is intractable, CIC-VCUA statically computes worst-case sojourn times of tasks, selects tasks for execution based on their potential utility density, and completes them at specific times, in polynomial time. We establish that CIC-VCUA achieves optimal timeliness during under-loads. Further, we identify the conditions under which timeliness assurances hold. Our simulation experiments illustrate CIC-VCUA's effectiveness and superiority. / Master of Science
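The setting above can be pictured as two functions of a job's start time: a variable cost function (VCF) giving its execution time, and a time/utility function (TUF) giving the utility of its completion time, so the utility accrued from a start time s is TUF(s + VCF(s)). The function shapes below are invented for illustration and are not taken from CIC-VCUA:

```python
def exec_time(start):          # variable cost function (VCF): illustrative shape
    return 2.0 + 0.5 * start   # the later the job starts, the longer it runs

def tuf(completion):           # step time/utility function (TUF): illustrative
    return 10.0 if completion <= 8.0 else 0.0

def accrued_utility(start):
    return tuf(start + exec_time(start))

for s in (0.0, 2.0, 4.0, 6.0):
    print(s, accrued_utility(s))  # utility drops to 0 once start + VCF(start) > 8
```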
30

'Transition Phase' water supply interventions in low-income urban settlements, Kenya

Chakava, Yolanda January 2013

A multitude of transitional water supply and distribution interventions are continually piloted in Kenya’s fast-growing urban settlements to meet national and global MDG targets, yet visible problems persist regardless of the investments made. This research evaluates the performance of four interventions led by public utilities and non-governmental organisations in the low-income settlements of Nairobi, Kisumu and Nakuru counties. To understand the service improvement received by the residents, this study used qualitative data from interviews and focus group discussions and quantitative data from 1,168 household surveys. Service level analysis results showed making water more affordable using pre-paid technology reduced the effective price by 75% and increased consumption per household by 20 litres per day, resulting in the highest service progress. Improving water accessibility for the very poor via hosepipe door-step delivery reduced the burden on women carrying water by 43% although efforts failed to reduce the pricing structure, limiting the progress. Subsidised ‘first-time’ metered plot connections to increase the utility customer base experienced shortages in water supply and reluctance from landlords, restricting development. Despite showing no positive change, 81% of residents continued to rely on expensive self-supplied boreholes which were all contaminated. Although the utilities have made positive strides in service improvement, in the context of universal service this study has shown that the very poor remain the most difficult to access, forming the target of discrete interventions that experience difficulties in influencing a reliable supply, sustained price reduction and/or good water quality – essentially what is needed most. In investigating the longer term supply and demand shortfall, this study concludes that the equitable supply and innovative distribution of point source groundwater, with a bias for the poorest, could be the most resilient transitional solution for the utility to promote in the foreseeable future, out of necessity rather than desire.
