91

Validation and application of a model of human decision making for human/computer communication

Revesman, Mark E. January 1983 (has links)
Decision making in a parallel human/computer system is considered. In this type of system, the tasks for which the computer has decision-making responsibility are identical to the tasks for which the human has responsibility. For optimal system performance, it is crucial that the human and computer avoid redundant actions. The traditional method of avoiding redundancies is to have the human engage in an explicit dialogue with the computer. This method adds an additional task for the human. An alternative method which does not increase workload is to provide the computer with a model of human decision making. If this model is accurate, the computer could predict the actions of the human and avoid those actions which are redundant. The mathematical development of such a predictive model is presented. The model suggested has two stages. The first stage uses discriminant analysis to describe human event detection behavior. The output from the first stage of the model is a vector of "event detected" probabilities, each entry in the vector representing a different system task. The second stage of the model uses dynamic programming to determine the optimal action at a specific point in time. The output from this stage of the model is the appropriate action for the human to take. Two experiments were conducted to validate the first and second stages of the model, respectively. The experimental situation depicted a sheet metal plant in which the subjects were to monitor machines for failures. The first stage of the model predicted over 80% of the actions correctly, while the entire model predicted nearly 85% correctly. In a third experiment, the computer was implemented as a parallel decision maker. A significant improvement in performance was observed when the computer based its decisions on a model of human decision making versus when the model was ignored. A modeling approach is suggested as a reasonable alternative to explicit human/computer communication in the design of systems. Further research is suggested to determine the situations in which model-based communication would be preferable to dialogue-based communication. / Ph. D.
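The two-stage structure described in this abstract (discriminant scores turned into per-task "event detected" probabilities, followed by an optimization step that selects the computer's action) can be illustrated with a small sketch. This is not Revesman's implementation: the logistic link, the one-step expected-value rule standing in for the dynamic program, and all names and numbers below are assumptions made for illustration only.

```python
# Illustrative two-stage predictor in the spirit of the abstract:
# discriminant scores -> detection probabilities -> action choice.
# All function names, parameters, and data are invented for this sketch.
import math

def detection_probabilities(features, weights, biases):
    """Stage 1: map per-task observations to P(human detects the event)
    via a logistic transform of a linear discriminant score."""
    probs = []
    for x, w, b in zip(features, weights, biases):
        score = sum(xi * wi for xi, wi in zip(x, w)) + b
        probs.append(1.0 / (1.0 + math.exp(-score)))
    return probs

def computer_action(probs, reward, redundancy_cost):
    """Stage 2 (simplified to one step): pick the task the computer should
    act on, trading off task reward against the cost of duplicating the human."""
    best_task, best_value = None, 0.0
    for task, p_human in enumerate(probs):
        expected = (1.0 - p_human) * reward[task] - p_human * redundancy_cost[task]
        if expected > best_value:
            best_task, best_value = task, expected
    return best_task  # None means "leave all tasks to the human"

# Example with three monitored machines.
features = [[1.2, 0.4], [0.1, -0.3], [2.0, 1.5]]
weights = [[0.8, 0.5]] * 3
biases = [-1.0] * 3
probs = detection_probabilities(features, weights, biases)
print(probs, computer_action(probs, reward=[5, 5, 5], redundancy_cost=[3, 3, 3]))
```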
92

A unified decision analysis framework for robust system design evaluation in the face of uncertainty

Duan, Chunming 06 June 2008 (has links)
Some engineered systems now in use are not adequately meeting the needs for which they were developed, nor are they very cost-effective in terms of consumer utilization. Many problems associated with unsatisfactory system performance and high life-cycle cost are the direct result of decisions made during early phases of system design. To develop quality systems, both engineering and management need fundamental principles and methodologies to guide decision making during system design and advanced planning. In order to provide for the efficient resolution of complex system design decisions involving uncertainty, human judgments, and value tradeoffs, an efficient and effective decision analysis framework is required. Experience indicates that an effective approach to improving the quality of detail designs is through the application of Genichi Taguchi's philosophy of robust design. How to apply Taguchi's philosophy of robust design to system design evaluation at the preliminary design stage is an open question. The goal of this research is to develop a unified decision analysis framework to support the need for developing better system designs in the face of various uncertainties. This goal is accomplished by adapting and integrating statistical decision theory, utility theory, elements of the systems engineering process, and Taguchi's philosophy of robust design. The result is a structured, systematic methodology for evaluating system design alternatives. The decision analysis framework consists of two parts: (1) decision analysis foundations, and (2) an integrated approach. Part I (Chapters 2 through 5) covers the foundations for design decision analysis in the face of uncertainty. This research begins with an examination of the life cycle of engineered systems and identification of the elements of the decision process of system design and development. After investigating various types of uncertainty involved in the process of system design, the concept of robust design is defined from the perspective of system life-cycle engineering. Some common measures for assessing the robustness of candidate system designs are then identified and examined. The problem of design evaluation in the face of uncertainty is then studied within the context of decision theory. After classifying design decision problems into four categories, the structure of each type of problem in terms of sequence and causal relationships between various decisions and uncertain outcomes is represented by a decision tree. Based upon statistical decision theory, the foundations for choosing a best design in the face of uncertainty are identified. The assumptions underlying common objective functions in design optimization are also investigated. Some confusion and controversy which surround Taguchi's robust design criteria -- loss functions and signal-to-noise ratios -- are addressed and clarified. Part II (Chapters 6 through 9) covers models and their application to design evaluation in the face of uncertainty. Based upon the decision analysis foundations, an integrated approach is developed and presented for resolving discrete decisions, continuous decisions, and decisions involving both uncertainty and multiple attributes. Application of the approach is illustrated by two hypothetical examples: bridge design and repairable equipment population system design. / Ph. D.
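The robust design criteria mentioned in the abstract, Taguchi's loss function and signal-to-noise ratio, have standard textbook forms that can be computed directly. The sketch below shows the quadratic loss and the nominal-the-best signal-to-noise ratio with invented performance data; it is not the dissertation's own formulation of these criteria.

```python
# Standard Taguchi-style measures used to compare candidate designs.
# Illustrative only; the dissertation's own criteria may differ.
import math

def expected_quadratic_loss(samples, target, k=1.0):
    """Quadratic loss L = k * (y - target)^2, averaged over observed performance."""
    return k * sum((y - target) ** 2 for y in samples) / len(samples)

def sn_nominal_the_best(samples):
    """Nominal-the-best signal-to-noise ratio: 10 * log10(mean^2 / variance)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((y - mean) ** 2 for y in samples) / (n - 1)
    return 10.0 * math.log10(mean * mean / var)

design_a = [10.1, 9.8, 10.3, 9.9]   # performance of design A under noise (invented)
design_b = [10.6, 9.2, 10.9, 9.4]   # design B: same mean, more spread (invented)
print(expected_quadratic_loss(design_a, target=10.0),
      expected_quadratic_loss(design_b, target=10.0))
print(sn_nominal_the_best(design_a), sn_nominal_the_best(design_b))
```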
93

Effects of decomposition level on the intrarater reliability of multiattribute alternative evaluation

Cho, Young Jin 06 June 2008 (has links)
A common approach for evaluating complex multiattributed choice alternatives is judgment decomposition: the alternatives are decomposed into a number of value-relevant attributes, the decision maker evaluates each alternative with respect to each attribute, and those single-attribute evaluations are aggregated across the attributes by a formal composition rule. One primary assumption behind decomposition is that it would produce a more reliable outcome than direct holistic evaluations. Although there is some empirical evidence that decomposed procedures can improve the reliability of evaluations, the extent of decomposition can have a considerable effect on the resulting evaluations. This research investigated, theoretically and experimentally, the effects of decomposition level on intrarater reliability in multiattribute alternative evaluation. In a theoretical study, using an additive value composition model with random variables, the composite variance of alternative evaluation was analyzed with respect to the level of decomposition. The composite variance of decomposed evaluation was derived from the variances in the components, recomposed using a statistical method of error propagation. By analyzing the composite variance as a function of the number of attributes used, possible effects of decomposition level were predicted and explained. The analysis showed that the variance of an alternative evaluation is a decreasing function with respect to the level of decomposition, in most cases, and that the marginal reduction of variance diminishes as decomposition level increases. In an experimental study, intrarater test-retest convergence was examined for a job evaluation with different levels of decomposition. Subjects evaluated six hypothetical job alternatives using four levels of decomposition that ranged from a single overall evaluation to evaluations on twelve highly specific attributes. Intrarater convergence was measured by mean absolute deviations and Pearson correlations between the evaluation scores in two identical sessions separated by two weeks. The mean absolute deviations decreased significantly with respect to the decomposition levels, while the effects on the Pearson correlations were not significant. Further analyses indicated that the mean absolute deviations decreased with a diminishing rate of reduction as the decomposition level increased. The research results suggest that decomposition reduces the variability of each alternative evaluation, in most situations. The results, however, also suggest that decomposition may not improve the consistency of preference order of the alternatives, which is often important in practical choice decisions. / Ph. D.
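The composite-variance argument can be illustrated with the standard error-propagation formula for an additive value model, Var(V) = sum_i w_i^2 sigma_i^2, assuming independent single-attribute judgment errors. The equal weights and constant per-attribute spread below are illustrative assumptions, not the study's parameters; they reproduce the qualitative pattern of decreasing variance with a diminishing marginal reduction.

```python
# Composite variance of an additive multiattribute evaluation
# V = sum_i w_i * v_i with independent attribute-level judgment errors:
# Var(V) = sum_i w_i^2 * sigma_i^2  (standard error propagation).
# Sketch only; not the dissertation's exact model.

def composite_variance(weights, sigmas):
    return sum(w * w * s * s for w, s in zip(weights, sigmas))

# Equal weights summing to 1 and a fixed per-attribute judgment spread:
# more attributes -> lower composite variance, with diminishing returns.
sigma = 0.2
for n in (1, 2, 4, 8, 12):
    weights = [1.0 / n] * n
    print(n, composite_variance(weights, [sigma] * n))
```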
94

Temporal changes in marketing mix effectiveness

Andrews, Rick L. 28 July 2008 (has links)
This research develops hypotheses to explain temporal changes in the effectiveness of marketing mix variables. Three potential explanations for these changes in market response are explored: (1) changes in market response associated with industry evolution, (2) trends in market response which may be related to changes in consumer knowledge and familiarity with products over time, and (3) changes in market response associated with changes in consumer incomes. In addition, this research investigates (4) changes in the relative effectiveness of marketing mix variables over time. The hypotheses are tested on time series data from five U.S. industries as well as aggregate U.S. consumption data. To estimate temporal changes in market price sensitivity, advertising effectiveness, and distribution effectiveness, a structural time series modeling methodology is used, and numerical optimization procedures are used to perform maximum likelihood estimation. The results show mixed support for the hypothesis that market response is related to the level of industry maturity. Problems with the indicators of industry maturity were identified which may be partly responsible for the mixed results. Consistent with expectations, this study shows that advertising effectiveness does appear to decline over time, while market price sensitivity and distribution effectiveness increase. Consequently, price reductions and increases in distribution coverage appear to become relatively more effective than increases in advertising expenditures over time. There appears to be no relationship between marketing mix effectiveness and consumer incomes. / Ph. D.
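The kind of estimation described (a coefficient that drifts over time, tracked by a Kalman filter, with variance parameters found by numerical maximum likelihood) can be sketched compactly. The model, data, and parameter values below are invented for illustration and are not the dissertation's specification.

```python
# A minimal time-varying-coefficient regression estimated by maximum likelihood,
# in the spirit of "structural time series + numerical optimization".
# Hypothetical model: y_t = beta_t * x_t + e_t, beta_t = beta_{t-1} + u_t.
import math
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, y, x):
    """Kalman-filter likelihood with observation variance R = exp(params[0])
    and state (drift) variance Q = exp(params[1])."""
    R, Q = math.exp(params[0]), math.exp(params[1])
    beta, P = 0.0, 1e6          # diffuse-ish initial state
    ll = 0.0
    for yt, xt in zip(y, x):
        P = P + Q                # predict the coefficient one step ahead
        v = yt - xt * beta       # one-step-ahead prediction error
        S = xt * xt * P + R
        K = P * xt / S
        beta = beta + K * v      # update the coefficient estimate
        P = (1.0 - K * xt) * P
        ll += -0.5 * (math.log(2.0 * math.pi * S) + v * v / S)
    return -ll

rng = np.random.default_rng(0)
T = 200
x = rng.normal(1.0, 0.3, T)
true_beta = -1.0 + 0.01 * np.arange(T)         # slowly changing "price sensitivity"
y = true_beta * x + rng.normal(0.0, 0.2, T)
res = minimize(neg_log_likelihood, x0=[0.0, -3.0], args=(y, x), method="Nelder-Mead")
print(res.x)   # log observation and state variances at the optimum
```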
95

Data-driven Decision-making: New Insights on Algorithm Performance and Data Value

Mouchtaki, Omar January 2024 (has links)
With the rise of data-driven algorithms, both industrial practitioners and academicians have aimed at understanding how one can use past information to make better future decisions. This question is particularly challenging, as any answer necessarily depends on several parameters, such as the features of the data used (e.g., the quantity and relevance of data), the downstream problem being solved, and the type of algorithms deployed to leverage the data. Most of the current literature analyzes the value of data by anchoring their methods in the large data regime, making the implicit assumption that data is widely available in practice. In this work, we depart from this implicit assumption and posit that, in fact, relevant data is a scarce resource in many practical settings. For instance, data is usually aggregated across different times, product categories, and geographies, and therefore the effective size of datasets is orders of magnitude lower than it may appear to be. The goal of this thesis is to bridge the gap between the theoretical understanding of data-driven decisions and practical performance by developing a problem-centric theory of data-driven decision-making in which we assess the value of data by quantifying its impact on our downstream decisions. In particular, we design methodological tools tailored to the problem at hand and derive fine-grained and problem-specific guarantees for algorithms. In the first chapter, we study the data-driven newsvendor problem under the modeling assumption that data is identically and independently distributed. We are interested in analyzing central policies in the literature, such as Sample Average Approximation (SAA), along with optimal ones, and in characterizing the performance achievable across data sizes, both small and large. Specifically, we characterize exactly the performance of SAA and uncover novel fundamental insights on the value of data. Indeed, our analysis reveals that tens of samples are sufficient to perform very efficiently, but also that more data can lead to worse out-of-sample performance for SAA. In turn, we derive an optimal algorithm in the minimax sense, enhancing decision quality with limited data. The second chapter explores the impact of data relevance on decision quality, addressing the challenge of using historical data from varying sources that may not be fully indicative of the future. We quantify the performance of SAA in these heterogeneous environments and design rate-optimal policies in settings where SAA falters. We illustrate the versatility of our framework by analyzing several prototypical problems across various fields: the newsvendor, pricing, and ski rental problems. Our analysis shows that the type of achievable asymptotic performance varies significantly across different problem classes and heterogeneity notions. Finally, the third chapter develops a framework for contextual decision-making, examining how past data relevance and quantity affect policy performance. Focusing on the contextual newsvendor problem, we analyze the wide class of Weighted Empirical Risk Minimization (WERM) policies, which weigh past data according to their relevance. This class of policies includes the SAA policy (also referred to as ERM), k-Nearest Neighbors, and kernel-based methods. 
While past literature focuses on upper bounds via concentration inequalities, we instead take an optimization approach and isolate a structure in the newsvendor loss function that allows us to reduce the infinite-dimensional optimization problem over worst-case distributions to a simple line search. In addition to this methodological contribution, our exact analysis offers new granular insights into the learning curve of algorithms in contextual settings. Through these contributions, the thesis advances our understanding of data-driven decision-making, offering both theoretical foundations and practical insights for diverse operational applications.
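The SAA policy analyzed in the first chapter has a simple textbook form for the newsvendor problem: order the empirical quantile of past demand at the critical ratio. The sketch below shows that rule with invented costs and demand data; the thesis's exact guarantees and finite-sample analysis are not reproduced here.

```python
# Sample Average Approximation (SAA) for the newsvendor problem:
# order the empirical quantile of past demand at the critical ratio
# b / (b + h), where b is the underage (lost-sale) cost and h the overage cost.
# Textbook version for illustration; demand data and costs are invented.
import math

def saa_order_quantity(demands, underage_cost, overage_cost):
    """Order the empirical critical-ratio quantile of observed demand."""
    ratio = underage_cost / (underage_cost + overage_cost)
    ordered = sorted(demands)
    idx = max(0, math.ceil(ratio * len(ordered)) - 1)
    return ordered[idx]

past_demand = [12, 7, 15, 9, 11, 14, 8, 10]
print(saa_order_quantity(past_demand, underage_cost=4.0, overage_cost=1.0))
```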
96

Multi-response simulation optimization using stochastic genetic search within a goal programming framework

Baesler, Felipe F. 01 July 2000 (has links)
No description available.
97

Three essays in collective choice theory

Sprumont, Yves 01 February 2006 (has links)
This dissertation contains three essays at the frontier of social choice theory and the theory of games. In the first essay, we consider the problem of dividing a fixed quantity of a perfectly divisible good among n individuals with single-peaked preferences. We show that the properties of Strategy-proofness, Efficiency, and either Anonymity or No Envy, together characterize a unique solution which we call the uniform allocation rule: everyone gets his best choice within the limits of an upper and a lower bound that are common to all individuals and determined by the feasibility constraint. We further analyze the structure of the class of all strategy-proof allocation rules. The second essay explores the idea of Population Monotonicity in the framework of cooperative games. An allocation scheme for a cooperative game specifies how to allocate the worth of every coalition. It is population monotonic if each player's payoff increases as the coalition to which he belongs grows larger. We show that, essentially, a game has a population monotonic allocation scheme (PMAS) if and only if it is a positive linear combination of monotonic simple games with veto control. A dual characterization is also provided. Sufficient conditions for the existence of a PMAS include convexity and "increasing average marginal contributions". If the game is convex, its (extended) Shapley value is a PMAS. The third essay considers the problem of two individuals who must jointly choose one from a finite set of alternatives. We argue that more consensus should not hurt: the closer your preferences are to mine, the better I should like the selected alternative. Two classes of Pareto optimal choice rules -- called "generalized maximin" and "choosing-by-veto" rules -- are shown to satisfy this principle. If we strengthen Pareto Optimality along the lines of Suppes' grading principle, the only choice rules satisfying our condition are "simple" maximin rules. / Ph. D.
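The uniform allocation rule characterized in the first essay can be computed by searching for the common bound described in the abstract. A minimal sketch follows, using bisection on that bound; the peaks and totals are invented examples, and the code is an illustration rather than the essay's formal construction.

```python
# The uniform allocation rule for dividing an amount M among agents with
# single-peaked preferences (peaks p_i): under excess demand each agent gets
# min(p_i, lam), under excess supply max(p_i, lam), with the common bound lam
# chosen so the shares sum to M. Sketch with bisection; data are invented.

def uniform_rule(peaks, total, tol=1e-9):
    excess_demand = sum(peaks) >= total
    share = (lambda p, lam: min(p, lam)) if excess_demand else (lambda p, lam: max(p, lam))
    lo, hi = 0.0, max(max(peaks), total)
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        # allocated amount is nondecreasing in lam in both regimes
        if sum(share(p, lam) for p in peaks) < total:
            lo = lam
        else:
            hi = lam
    lam = 0.5 * (lo + hi)
    return [share(p, lam) for p in peaks]

print(uniform_rule(peaks=[2.0, 5.0, 9.0], total=12.0))   # excess demand: caps at lam
print(uniform_rule(peaks=[2.0, 5.0, 9.0], total=24.0))   # excess supply: floors at lam
```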
98

Design of an automated decision support system for scheduling tasks in a generalized job-shop

Bester, Margarete Joan 04 1900 (has links)
Thesis (MSc)--University of Stellenbosch, 2006. / Please refer to full text for abstract
99

ZERO/ONE DECISION PROBLEMS WITH MULTIPLE RESOURCE CONSTRAINTS: ALGORITHMS AND APPLICATIONS.

RASSENTI, STEPHEN. January 1982 (has links)
Two complex resource allocation problems motivate the algorithms and applications discussed in this dissertation. The Public Broadcasting Service (PBS), a cooperative of television stations with independent budgets, must decide which programs to purchase from various producers and at what cost to its member stations. The airports of America must decide how to allocate limited takeoff and landing slots to competing airlines. Both problems are recognized as zero/one decision problems with multiple resource constraints. A computer-aided allocation mechanism is proposed as an alternative to the currently practiced decision procedures. Bid information, solicited in an auction phase, provides values to parameterize a mathematical model. An optimization phase is then used to generate the best solution for the given information. The integer programming algorithms required to solve the particular models suggested are explored in detail. A best bound enumeration strategy which uses a surrogate knapsack relaxation is developed. Computer storage requirements are curtailed by using a new greedy heuristic for general integer programming problems. The PBS model has a structure closely related to certain fixed charge problems. This allows the use of necessary conditions for the existence of a solution of capacitated transportation problems to test the feasibility of candidate solution vectors. In the SLOT model feasibility testing is a trivial matter of maintaining running row sums. The bound provided by the knapsack relaxation is further enhanced with the addition of a set of generalized choice constraints. An efficient polynomial algorithm and proof of optimality are given for the linear relaxation of this problem. A procedure for generating a set of generalized choice constraints from any set of logical constraints is also given. The viability of the approach developed and the effects of parameter variation are computationally tested in both PBS and SLOT contexts. Some further computational results for project selection, set covering, and multiple knapsack problems are reported. A broad class of mixed integer linear programming problems is defined (e.g., capital expenditure and network design problems) and a suitable relaxation for a similar approach is developed. Finally, several new directions for research in algorithmic development and application are proposed.
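The dissertation develops a best-bound enumeration with a surrogate knapsack relaxation and a new greedy heuristic. The sketch below is only a generic greedy heuristic for 0/1 selection under multiple resource constraints (rank items by value per unit of surrogate-weighted resource use), offered as a rough illustration of the problem class rather than the algorithm developed there; all data are invented.

```python
# Generic greedy heuristic for 0/1 selection under multiple resource
# constraints (multidimensional knapsack): rank items by value per unit of
# surrogate-weighted resource use, then add items that still fit.
# Illustration only; not the enumeration algorithm of the dissertation.

def greedy_multiknapsack(values, requirements, capacities, surrogate_weights):
    """requirements[i][r] = amount of resource r used by item i."""
    def density(i):
        used = sum(w * requirements[i][r] for r, w in enumerate(surrogate_weights))
        return values[i] / used if used > 0 else float("inf")

    order = sorted(range(len(values)), key=density, reverse=True)
    remaining = list(capacities)
    chosen = []
    for i in order:
        if all(requirements[i][r] <= remaining[r] for r in range(len(capacities))):
            chosen.append(i)
            remaining = [remaining[r] - requirements[i][r] for r in range(len(capacities))]
    return chosen

values = [10, 7, 5, 9]
requirements = [[3, 4], [2, 1], [1, 2], [4, 3]]   # two resources per item
print(greedy_multiknapsack(values, requirements, capacities=[6, 6],
                           surrogate_weights=[1.0, 1.0]))
```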
100

Online banking investment decision with real option pricing analysis.

January 2001 (has links)
Chu Chun-fai, Carlin. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2001. / Includes bibliographical references (leaves 69-73). / Abstracts in English and Chinese. / Chapter Part I: --- INTRODUCTION --- p.1 / Chapter Part II: --- LITERATURE REVIEW --- p.4 / Chapter - --- Financial option-pricing theory / Chapter - --- Real option-pricing theory / Chapter - --- Real option-pricing theory in Management Information System Area / Chapter Part III: --- CASE BACKGROUND --- p.14 / Chapter - --- Case Background / Chapter - --- Availability of online banking services in Hong Kong / Chapter - --- Online banking investment in the Hong Kong Chinese Bank / Chapter Part IV: --- RESEARCH MODEL --- p.19 / Chapter - --- Research model / Chapter - --- Modelling of the optimal timing problem of HKCB / Chapter - --- Justification of geometric Brownian motion assumption for using Black-Scholes formula / Chapter Part V : --- DATA COLLECTION --- p.30 / Chapter Part VI: --- ANALYSIS RESULT --- p.35 / Chapter - --- Analysis result / Chapter - --- Sensitivity analysis on the selected parameters / Chapter - --- Suggested investment timing / Chapter Part VII: --- DISCUSSIONS AND IMPLICATIONS --- p.44 / Chapter - --- Result discussion / Chapter - --- Implications for researchers / Chapter - --- Implications for practitioners / Chapter Part VIII: --- LIMITATIONS AND CONTRIBUTIONS --- p.48 / Chapter - --- Limitation on data collection process / Chapter - --- Limitations on Black-Scholes model / Chapter - --- Contributions / APPENDIX / Appendix A -Limitation of traditional Discounted Cash Flow analysis --- p.51 / Appendix B -Banks services available to the customers --- p.54 / Appendix C -Sample path of a Geometric Brownian Motion --- p.56 / Appendix D -Discounted Cash Flows analysis of immediate entry of online banking investment --- p.57 / Appendix E -Black-Scholes formula and its interpretation for non-traded --- p.61 / Appendix F -Questionnaire for Online banking investment --- p.64 / Appendix G -Availability of online banking services in May 2001 --- p.67 / Appendix H -Sensitivity analysis on the number of initial usage --- p.68 / Appendix I -Reference List --- p.69
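The real-option analysis in this thesis rests on the standard Black-Scholes call formula, with the project's present value playing the role of the underlying asset and the investment cost the strike. A minimal sketch of that textbook formula follows; the parameter values are invented for illustration and are not the thesis's data.

```python
# Standard Black-Scholes value of a European call, often used as a proxy for
# the option to defer an investment (S = present value of project cash flows,
# K = investment cost). Illustrative parameters only.
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Option to launch online banking: project value 100, cost 90, 2 years to decide.
print(black_scholes_call(S=100.0, K=90.0, r=0.05, sigma=0.35, T=2.0))
```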
