  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

Validation and application of a model of human decision making for human/computer communication

Revesman, Mark E. January 1983 (has links)
Decision making in a parallel human/computer system is considered. In this type of system, the tasks for which the computer has decision-making responsibility are identical to the tasks for which the human has responsibility. For optimal system performance, it is crucial that the human and computer avoid redundant actions. The traditional method of avoiding redundancies is to have the human engage in an explicit dialogue with the computer. This method adds an additional task for the human. An alternative method, which does not increase workload, is to provide the computer with a model of human decision making. If this model is accurate, the computer can predict the actions of the human and avoid those actions which are redundant. The mathematical development of such a predictive model is presented. The model has two stages. The first stage uses discriminant analysis to describe human event-detection behavior; its output is a vector of "event detected" probabilities, each entry in the vector representing a different system task. The second stage uses dynamic programming to determine the optimal action at a specific point in time; its output is the appropriate action for the human to take. Two experiments were conducted to validate the first and second stages of the model, respectively. The experimental situation depicted a sheet metal plant in which the subjects monitored machines for failures. The first stage of the model predicted over 80% of the actions correctly, while the entire model predicted nearly 85% correctly. In a third experiment, the computer was implemented as a parallel decision maker. A significant improvement in performance was observed when the computer based its decisions on a model of human decision making versus when the model was ignored. A modeling approach is therefore suggested as a reasonable alternative to explicit dialogue in the design of human/computer communication. Further research is suggested to determine the situations in which model-based communication would be preferable to dialogue-based communication. / Ph. D.
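The two-stage structure described in this abstract can be sketched in a few lines. This is a hypothetical illustration, not Revesman's actual model: the discriminant stage is approximated by a logistic-linear score, the dynamic-programming stage is reduced to a myopic one-step expected-cost comparison, and all weights and costs are invented.

```python
import numpy as np

def detection_probabilities(features, weights, bias):
    """Stage 1 (sketch): a linear discriminant score per task, mapped
    through a logistic link into an 'event detected' probability."""
    scores = features @ weights + bias          # one score per system task
    return 1.0 / (1.0 + np.exp(-scores))

def computer_action(p_detect, reward_if_acted, cost_redundant):
    """Stage 2 (sketch): act on a task only when the expected gain of
    acting exceeds the expected cost of duplicating the human's action.
    A myopic one-step stand-in for the dynamic-programming stage."""
    expected_gain = (1.0 - p_detect) * reward_if_acted
    expected_loss = p_detect * cost_redundant
    return expected_gain > expected_loss        # boolean: computer acts?

# Invented numbers: two tasks, two features per task observation.
p = detection_probabilities(np.array([[0.8, 0.2], [0.1, 0.9]]),
                            np.array([2.0, -1.0]), 0.0)
acts = computer_action(p, reward_if_acted=1.0, cost_redundant=1.0)
# With equal reward and cost, the computer acts exactly on the tasks the
# human is unlikely to have detected (p < 0.5).
```
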
92

A unified decision analysis framework for robust system design evaluation in the face of uncertainty

Duan, Chunming 06 June 2008 (has links)
Some engineered systems now in use are not adequately meeting the needs for which they were developed, nor are they very cost-effective in terms of consumer utilization. Many problems associated with unsatisfactory system performance and high life-cycle cost are the direct result of decisions made during early phases of system design. To develop quality systems, both engineering and management need fundamental principles and methodologies to guide decision making during system design and advanced planning. In order to provide for the efficient resolution of complex system design decisions involving uncertainty, human judgments, and value tradeoffs, an efficient and effective decision analysis framework is required. Experience indicates that an effective approach to improving the quality of detail designs is through the application of Genichi Taguchi's philosophy of robust design. How to apply Taguchi's philosophy of robust design to system design evaluation at the preliminary design stage is an open question. The goal of this research is to develop a unified decision analysis framework to support the need for developing better system designs in the face of various uncertainties. This goal is accomplished by adapting and integrating statistical decision theory, utility theory, elements of the systems engineering process, and Taguchi's philosophy of robust design. The result is a structured, systematic methodology for evaluating system design alternatives. The decision analysis framework consists of two parts: (1) decision analysis foundations, and (2) an integrated approach. Part I (Chapters 2 through 5) covers the foundations for design decision analysis in the face of uncertainty. This research begins with an examination of the life cycle of engineered systems and identification of the elements of the decision process of system design and development. 
After investigating various types of uncertainty involved in the process of system design, the concept of robust design is defined from the perspective of system life-cycle engineering. Some common measures for assessing the robustness of candidate system designs are then identified and examined. The problem of design evaluation in the face of uncertainty is then studied within the context of decision theory. After classifying design decision problems into four categories, the structure of each type of problem, in terms of the sequence of and causal relationships between various decisions and uncertain outcomes, is represented by a decision tree. Based upon statistical decision theory, the foundations for choosing a best design in the face of uncertainty are identified. The assumptions underlying common objective functions in design optimization are also investigated. Some confusion and controversy surrounding Taguchi's robust design criteria -- loss functions and signal-to-noise ratios -- are addressed and clarified. Part II (Chapters 6 through 9) covers models and their application to design evaluation in the face of uncertainty. Based upon the decision analysis foundations, an integrated approach is developed and presented for resolving discrete decisions, continuous decisions, and decisions involving both uncertainty and multiple attributes. Application of the approach is illustrated by two hypothetical examples: bridge design and repairable equipment population system design. / Ph. D.
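The decision-theoretic core such a framework builds on — choosing among candidate designs by expected utility over uncertain outcomes — can be sketched as follows. The designs, probabilities, and payoffs below are invented for illustration.

```python
# Invented designs: each maps to a lottery of (probability, payoff) pairs,
# i.e. the chance-node leaves of a one-stage decision tree.
designs = {
    "design_A": [(0.7, 100.0), (0.3, 20.0)],
    "design_B": [(0.5, 150.0), (0.5, 10.0)],
}

def expected_utility(lottery, utility=lambda x: x):
    """With a linear (risk-neutral) utility this is just expected value;
    a concave utility would encode risk aversion toward design outcomes."""
    return sum(p * utility(x) for p, x in lottery)

best = max(designs, key=lambda d: expected_utility(designs[d]))
# design_A scores 0.7*100 + 0.3*20 = 76; design_B scores 0.5*150 + 0.5*10 = 80.
```
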
93

Effects of decomposition level on the intrarater reliability of multiattribute alternative evaluation

Cho, Young Jin 06 June 2008 (has links)
A common approach for evaluating complex multiattributed choice alternatives is judgment decomposition: the alternatives are decomposed into a number of value-relevant attributes, the decision maker evaluates each alternative with respect to each attribute, and those single-attribute evaluations are aggregated across the attributes by a formal composition rule. One primary assumption behind decomposition is that it produces a more reliable outcome than direct holistic evaluation. Although there is some empirical evidence that decomposed procedures can improve the reliability of evaluations, the extent of decomposition can have a considerable effect on the resulting evaluations. This research investigated, theoretically and experimentally, the effects of decomposition level on intrarater reliability in multiattribute alternative evaluation. In a theoretical study, using an additive value composition model with random variables, the composite variance of alternative evaluation was analyzed with respect to the level of decomposition. The composite variance of decomposed evaluation was derived from the component variances using a statistical method of error propagation. By analyzing the composite variance as a function of the number of attributes used, possible effects of decomposition level were predicted and explained. The analysis showed that the variance of an alternative evaluation is, in most cases, a decreasing function of the level of decomposition, and that the marginal reduction of variance diminishes as the decomposition level increases. In an experimental study, intrarater test-retest convergence was examined for a job evaluation task with different levels of decomposition. Subjects evaluated six hypothetical job alternatives using four levels of decomposition that ranged from a single overall evaluation to evaluations on twelve highly specific attributes.
Intrarater convergence was measured by mean absolute deviations and Pearson correlations between the evaluation scores in two identical sessions separated by two weeks. The mean absolute deviations decreased significantly with respect to the decomposition levels, while the effect of decomposition level on the Pearson correlations was not significant. Further analyses indicated that the mean absolute deviations decreased with a diminishing rate of reduction as the decomposition level increased. The research results suggest that decomposition reduces the variability of each alternative evaluation in most situations. The results, however, also suggest that decomposition may not improve the consistency of the preference order of the alternatives, which is often important in practical choice decisions. / Ph. D.
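The error-propagation argument can be made concrete with a small sketch. Assuming independent single-attribute errors and an additive model with equal weights (assumptions of this illustration, not necessarily of the study), the composite standard deviation falls as 1/sqrt(n), with diminishing marginal reduction as attributes are added:

```python
import math

def composite_sd(component_sds, weights=None):
    """Error propagation for an additive value model with independent
    component errors: composite variance is the weighted sum of
    component variances (equal weights assumed by default)."""
    n = len(component_sds)
    w = weights or [1.0 / n] * n
    return math.sqrt(sum((wi * si) ** 2 for wi, si in zip(w, component_sds)))

# With equal unit errors the composite SD falls as 1/sqrt(n):
# n = 1 -> 1.0, n = 3 -> 0.577, n = 12 -> 0.289 (diminishing returns).
for n in (1, 3, 12):
    print(n, round(composite_sd([1.0] * n), 3))
```
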
94

Temporal changes in marketing mix effectiveness

Andrews, Rick L. 28 July 2008 (has links)
This research develops hypotheses to explain temporal changes in the effectiveness of marketing mix variables. Three potential explanations for these changes in market response are explored: (1) changes in market response associated with industry evolution, (2) trends in market response which may be related to changes in consumer knowledge and familiarity with products over time, and (3) changes in market response associated with changes in consumer incomes. In addition, this research investigates (4) changes in the relative effectiveness of marketing mix variables over time. The hypotheses are tested on time series data from five U.S. industries as well as aggregate U.S. consumption data. To estimate temporal changes in market price sensitivity, advertising effectiveness, and distribution effectiveness, a structural time series modeling methodology is used, and numerical optimization procedures are used to perform maximum likelihood estimation. The results show mixed support for the hypothesis that market response is related to the level of industry maturity. Problems with the indicators of industry maturity were identified which may be partly responsible for the mixed results. Consistent with expectations, this study shows that advertising effectiveness does appear to decline over time, while market price sensitivity and distribution effectiveness increase. Consequently, price reductions and increases in distribution coverage appear to become relatively more effective than increases in advertising expenditures over time. There appears to be no relationship between marketing mix effectiveness and consumer incomes. / Ph. D.
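A structural time series model with stochastic coefficients is the estimation machinery referenced above. The sketch below simulates a price coefficient drifting as a random walk and recovers its drift with a rolling regression; this is an illustrative stand-in, not the maximum likelihood estimation the dissertation uses, and every number is invented.

```python
import numpy as np

rng = np.random.default_rng(0)
T, window = 200, 50
price = rng.normal(0.0, 1.0, T)

# The price coefficient beta_t drifts as a random walk, so market price
# sensitivity changes over time; all parameters here are invented.
beta = np.cumsum(rng.normal(0.0, 0.05, T)) - 1.0
sales = 10.0 + beta * price + rng.normal(0.0, 0.5, T)

# A rolling OLS slope tracks the drifting coefficient approximately;
# the dissertation instead fits a structural model by maximum likelihood.
est = [np.polyfit(price[t - window:t], sales[t - window:t], 1)[0]
       for t in range(window, T)]
```
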
95

Three essays in collective choice theory

Sprumont, Yves 01 February 2006 (has links)
This dissertation contains three essays at the frontier of social choice theory and the theory of games. In the first essay, we consider the problem of dividing a fixed quantity of a perfectly divisible good among n individuals with single-peaked preferences. We show that the properties of Strategy-proofness, Efficiency, and either Anonymity or No Envy together characterize a unique solution which we call the uniform allocation rule: everyone gets his best choice within the limits of an upper and a lower bound that are common to all individuals and determined by the feasibility constraint. We further analyze the structure of the class of all strategy-proof allocation rules. The second essay explores the idea of Population Monotonicity in the framework of cooperative games. An allocation scheme for a cooperative game specifies how to allocate the worth of every coalition. It is population monotonic if each player's payoff increases as the coalition to which he belongs grows larger. We show that, essentially, a game has a population monotonic allocation scheme (PMAS) if and only if it is a positive linear combination of monotonic simple games with veto control. A dual characterization is also provided. Sufficient conditions for the existence of a PMAS include convexity and "increasing average marginal contributions". If the game is convex, its (extended) Shapley value is a PMAS. The third essay considers the problem of two individuals who must jointly choose one from a finite set of alternatives. We argue that more consensus should not hurt: the closer your preferences are to mine, the better I should like the selected alternative. Two classes of Pareto optimal choice rules -- called "generalized maximin" and "choosing-by-veto" rules -- are shown to satisfy this principle. If we strengthen Pareto Optimality along the lines of Suppes' grading principle, the only choice rules satisfying our condition are "simple" maximin rules. / Ph. D.
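The uniform allocation rule characterized in the first essay has a simple computational form: each agent receives his peak, clipped at a common bound chosen to exhaust the fixed amount exactly. A sketch via bisection on that bound (the numerical approach is mine, not the essay's):

```python
def uniform_rule(peaks, amount, tol=1e-9):
    """Each agent gets his peak, clipped at a common bound lam chosen by
    bisection so that the allocations exactly exhaust the amount."""
    total = sum(peaks)
    clip = min if total >= amount else max   # cap high peaks / top up low ones
    lo, hi = 0.0, max(max(peaks), amount)
    while hi - lo > tol:
        lam = (lo + hi) / 2.0
        if sum(clip(p, lam) for p in peaks) < amount:
            lo = lam
        else:
            hi = lam
    return [clip(p, (lo + hi) / 2.0) for p in peaks]

# Excess demand: peaks sum to 1.7 > 1.0, so high peaks are capped at a
# common upper bound of 0.35, giving allocations near [0.35, 0.3, 0.35].
print(uniform_rule([0.5, 0.3, 0.9], amount=1.0))
```

Note that reporting a false peak cannot help an agent here, which is the strategy-proofness the essay characterizes.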
96

Design of an automated decision support system for scheduling tasks in a generalized job-shop

Bester, Margarete Joan 04 1900 (has links)
Thesis (MSc)--University of Stellenbosch, 2006. / Please refer to full text for abstract
97

ZERO/ONE DECISION PROBLEMS WITH MULTIPLE RESOURCE CONSTRAINTS: ALGORITHMS AND APPLICATIONS.

RASSENTI, STEPHEN. January 1982 (has links)
Two complex resource allocation problems motivate the algorithms and applications discussed in this dissertation. The Public Broadcasting Service (PBS), a cooperative of television stations with independent budgets, must decide which programs to purchase from various producers and at what cost to its member stations. The airports of America must decide how to allocate limited takeoff and landing slots to competing airlines. Both problems are recognized as zero/one decision problems with multiple resource constraints. A computer-aided allocation mechanism is proposed as an alternative to the currently practiced decision procedures. Bid information, solicited in an auction phase, provides values to parameterize a mathematical model. An optimization phase is then used to generate the best solution for the given information. The integer programming algorithms required to solve the particular models suggested are explored in detail. A best-bound enumeration strategy which uses a surrogate knapsack relaxation is developed. Computer storage requirements are curtailed by using a new greedy heuristic for general integer programming problems. The PBS model has a structure closely related to certain fixed charge problems. This allows the use of necessary conditions for the existence of a solution of capacitated transportation problems to test the feasibility of candidate solution vectors. In the SLOT model, feasibility testing is a trivial matter of maintaining running row sums. The bound provided by the knapsack relaxation is further enhanced with the addition of a set of generalized choice constraints. An efficient polynomial algorithm and proof of optimality are given for the linear relaxation of this problem. A procedure for generating a set of generalized choice constraints from any set of logical constraints is also given. The viability of the approach developed and the effects of parameter variation are computationally tested in both PBS and SLOT contexts.
Some further computational results for project selection, set covering, and multiple knapsack problems are reported. A broad class of mixed integer linear programming problems is defined (e.g., capital expenditure and network design problems) and a suitable relaxation for a similar approach is developed. Finally, several new directions for research in algorithmic development and application are proposed.
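The bounding ingredients described above can be illustrated on a plain 0/1 knapsack: a greedy pass over items sorted by value density gives a feasible lower bound, and the linear relaxation with at most one fractional item gives an upper bound for best-bound enumeration. This is a simplified stand-in for the surrogate-relaxation machinery of the dissertation, with invented data:

```python
def knapsack_bounds(values, weights, capacity):
    """Greedy feasible packing (lower bound) and LP relaxation with one
    fractional item (upper bound) for a 0/1 knapsack, items considered
    in decreasing value-per-unit-weight order."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    greedy, greedy_room = 0.0, capacity
    upper, lp_room = 0.0, capacity
    for i in order:
        if weights[i] <= greedy_room:        # greedy takes whole items only
            greedy += values[i]
            greedy_room -= weights[i]
        if weights[i] <= lp_room:            # LP fills, then takes a fraction
            upper += values[i]
            lp_room -= weights[i]
        elif lp_room > 0.0:
            upper += values[i] * lp_room / weights[i]
            lp_room = 0.0
    return greedy, upper

# Classic instance: greedy packs items 1 and 2 (value 160); the LP bound
# adds two-thirds of item 3, giving 240.
lower, upper = knapsack_bounds([60, 100, 120], [10, 20, 30], capacity=50)
```

Any enumeration node whose upper bound falls below the best known feasible value can be pruned, which is the role these bounds play in a best-bound strategy.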
98

Online banking investment decision with real option pricing analysis.

January 2001 (has links)
Chu Chun-fai, Carlin. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2001. / Includes bibliographical references (leaves 69-73). / Abstracts in English and Chinese.

Contents:
Part I: Introduction (p.1)
Part II: Literature Review (p.4): financial option-pricing theory; real option-pricing theory; real option-pricing theory in the Management Information System area
Part III: Case Background (p.14): case background; availability of online banking services in Hong Kong; online banking investment in the Hong Kong Chinese Bank
Part IV: Research Model (p.19): research model; modelling of the optimal timing problem of HKCB; justification of the geometric Brownian motion assumption for using the Black-Scholes formula
Part V: Data Collection (p.30)
Part VI: Analysis Result (p.35): analysis result; sensitivity analysis on the selected parameters; suggested investment timing
Part VII: Discussions and Implications (p.44): result discussion; implications for researchers; implications for practitioners
Part VIII: Limitations and Contributions (p.48): limitation on data collection process; limitations on Black-Scholes model; contributions
Appendix A: Limitation of traditional Discounted Cash Flow analysis (p.51)
Appendix B: Bank services available to the customers (p.54)
Appendix C: Sample path of a Geometric Brownian Motion (p.56)
Appendix D: Discounted Cash Flow analysis of immediate entry of online banking investment (p.57)
Appendix E: Black-Scholes formula and its interpretation for non-traded (p.61)
Appendix F: Questionnaire for online banking investment (p.64)
Appendix G: Availability of online banking services in May 2001 (p.67)
Appendix H: Sensitivity analysis on the number of initial usage (p.68)
Appendix I: Reference List (p.69)
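The pricing machinery this thesis builds on is the standard Black-Scholes formula for a European call, with the project value playing the role of the underlying and the investment cost the strike. A sketch with invented numbers, not the thesis's data:

```python
import math

def black_scholes_call(S, K, r, sigma, T):
    """European call value: S is the (project) value of the underlying,
    K the investment cost, r the risk-free rate, sigma the volatility,
    and T the time to expiry of the deferral option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

# Invented numbers: the option to wait is worth well above the immediate
# net payoff S - K = 5, which is the core argument for deferring entry.
value = black_scholes_call(S=100.0, K=95.0, r=0.05, sigma=0.3, T=1.0)
```
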
99

Application of Stochastic Decision Models to Solid Waste Management

Wright, William Ervin 08 1900 (has links)
This research applies stochastic decision tree analytical techniques to a decision of the type a small community may face when choosing a solid waste disposal system from among several alternatives. Specifically targeted are those situations in which a community (1) lies at or near the boundary of a central planning area, (2) is in a position to exercise one of several disposal options, and (3) has access to the data base on solid waste which has been systematically developed by a central planning agency. The options available may or may not be optimal in terms of total cost, either to the community or to adjacent communities which participate in centrally coordinated or jointly organized activities. The study suggests that stochastic simulation models, drawing upon a data base developed by central planning agencies in cases where local data are inadequate or unavailable, can be useful in evaluating disposal alternatives at the community level. Further, the decision tree can be usefully employed to communicate the results of the analysis. Some important areas of further research on the small-community disposal system selection problem are noted.
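The stochastic decision tree approach can be sketched as a small Monte Carlo comparison: each disposal branch carries an uncertain annual cost, and branches are compared by simulated expected present cost rather than single point estimates. All distributions and figures below are invented for illustration, not drawn from the study's data base:

```python
import random

random.seed(42)

# Invented cost models, in $1000s per year: each branch of the tree has
# an uncertain annual cost rather than a single point estimate.
options = {
    "own_landfill":  lambda: random.gauss(120.0, 25.0),
    "regional_site": lambda: 40.0 + random.gauss(70.0, 10.0),  # haul + tip fee
}

def expected_present_cost(draw, years=10, rate=0.08, trials=5000):
    """Monte Carlo estimate of expected discounted cost over the horizon."""
    def present_value():
        return sum(draw() / (1.0 + rate) ** t for t in range(1, years + 1))
    return sum(present_value() for _ in range(trials)) / trials

costs = {name: expected_present_cost(draw) for name, draw in options.items()}
best = min(costs, key=costs.get)
```
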
100

Development of a Technology Transfer Score for Evaluating Research Proposals: Case Study of Demand Response Technologies in the Pacific Northwest

Estep, Judith 13 February 2017 (has links)
Investment in Research and Development (R&D) is necessary for innovation, allowing an organization to maintain a competitive edge. The U.S. Federal Government invests billions of dollars, primarily in basic research, to help fill the pipeline for other organizations to take technologies into commercialization. However, success is not just a matter of investing in innovation; it is a matter of converting that research into application. A cursory review of research proposal evaluation criteria suggests that little to no emphasis is placed on the transfer of research results. This effort is motivated by the need to move research into application. One segment facing such technology challenges is the energy sector. Historically, the electric grid has been stable and predictable; therefore, there were no immediate drivers to innovate. However, an aging infrastructure, the integration of renewable energy, and aggressive energy efficiency targets are motivating the need for research and for putting promising results into application. Many technologies exist or are in development, but the rate at which they are being adopted is slow. The goal of this research is to develop a decision model that can be used to identify the technology transfer potential of a research proposal. An organization can use the model to select the proposals whose research outcomes are most likely to move into application. The model begins to close the chasm between research and application -- otherwise known as the "valley of death." A comprehensive literature review was conducted to understand when the idea of technology application or transfer should begin, and to identify the attributes necessary for successful technology transfer. Successful technology transfer hinges on a productive relationship between the researchers and the technology recipient.
A hierarchical decision model, along with desirability curves, was used to understand the complexities of the researcher and recipient relationship specific to technology transfer. In this research, the evaluation criteria of several research organizations were assessed to understand the extent to which the success attributes identified in the literature were considered when reviewing research proposals. While some of the organizations included a few of the success attributes, none considered all of them, and none quantified the value of the success attributes. The effectiveness of the model relies extensively on expert judgments to complete its validation and quantification. Subject matter experts, ranging from senior executives with extensive experience in technology transfer to principal research investigators from national labs, universities, utilities, and non-profit research organizations, were used to ensure a comprehensive and cross-functional validation and quantification of the decision model. The quantified model was validated using a case study involving demand response (DR) technology proposals in the Pacific Northwest. The DR technologies were selected based on their potential to solve some of the region's most prevalent issues. In addition, several sensitivity scenarios were developed to test the model's response to extreme cases, the impact of perturbations in expert responses, and whether the model can be applied to technologies other than demand response; in other words, whether the model is technology agnostic. The model's flexibility as a communication tool was also assessed: it can show which success attributes in a research proposal are deficient and need strengthening, and how improvements would increase the overall technology transfer score. The low-scoring success attributes in the case study proposals (e.g., project meetings) were clearly identified as the areas to improve for increasing the technology transfer score. As a communication tool, the model could help a research organization identify areas to bolster to improve its overall technology transfer score. Similarly, the technology recipient could use the results to identify areas that need to be reinforced while the research is ongoing. The research objective is to develop a decision model resulting in a technology transfer score that can be used to assess the technology transfer potential of a research proposal. The technology transfer score can be used by an organization in the development of a research portfolio. An organization's growth in a highly competitive global market hinges on superior R&D performance and the ability to apply the results. The energy sector is no different. While there is sufficient research being done to address the issues facing the utility industry, the rate at which technologies are adopted is lagging. The technology transfer score has the potential to increase the success of crossing the chasm to application by helping an organization make informed and deliberate decisions about its research portfolio.
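The scoring step of such a hierarchical model can be sketched as a weighted sum of desirability-transformed attribute levels. The attribute names, weights, and linear desirability curves below are invented placeholders, not Estep's expert-quantified values:

```python
def desirability(level, lo, hi):
    """Piecewise-linear desirability curve: maps a raw attribute level
    onto [0, 1], clipped at the endpoints."""
    return max(0.0, min(1.0, (level - lo) / (hi - lo)))

# Invented attributes and weights standing in for the expert-quantified model.
weights = {"recipient_involvement": 0.40,
           "project_meetings": 0.25,
           "dissemination_plan": 0.35}

def transfer_score(levels, ranges):
    """Technology transfer score: weight times desirability, summed."""
    return sum(w * desirability(levels[a], *ranges[a]) for a, w in weights.items())

ranges = {a: (0.0, 10.0) for a in weights}
score = transfer_score({"recipient_involvement": 8, "project_meetings": 2,
                        "dissemination_plan": 6}, ranges)
# A low-scoring attribute (project_meetings here) flags what to strengthen,
# which is the communication use of the model described above.
```
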
