About: The Global ETD Search service is a free service for researchers to find electronic theses and dissertations, provided by the Networked Digital Library of Theses and Dissertations (NDLTD). The abstracts below are drawn from its metadata, which is collected from universities around the world.

GIS model for the Land Use and Development Master Plan in Rwanda

Tims, Willem. January 2009.

This thesis aimed to develop a Geographical Information System (GIS) based model to support the Rwanda Land Use and Development Master Plan, whose main task is sustainable land management. Stakeholder involvement was of key importance: stakeholder demands should be analysed and visualised to support discussion and decision-making. Spatial multicriteria decision analysis (MCDA) is a proven method for land-use planning, but most applications focus on a single theme, such as urban development, and cover relatively small areas. This thesis instead developed a countrywide GIS model covering all land uses, grouped into three main categories: urban, agriculture and conservation. The model was largely based on the Land-Use Conflict Identification Strategy (LUCIS) model; many of the goals, objectives and subobjectives describing these land-use categories were adopted from the original, while a significant number were dropped and new ones created to suit the Rwandan situation. Stakeholder involvement was realized by assigning weights to the goals and preference maps, using the Analytic Hierarchy Process (AHP) as the weighting method. The model was implemented in ESRI's ArcGIS ModelBuilder. First, suitability maps were created for all elements in the model. These were weighted to produce preference maps, which were then collapsed into three classes: low, medium and high preference. Finally, the preference maps of the three land-use categories were combined to visualize conflict areas. Orthophotos proved useful as a reference for checking the suitability and preference maps. Despite a large number of missing datasets, the model was executed for demonstration purposes.
However, many of the results were unreliable because of the incomplete datasets and therefore cannot be used for decision-making. Due to the stage of the project it was not possible to obtain weights from the stakeholders; this should be done at a later stage, and Right Choice DSS, a user-friendly decision support application, was proposed for calculating the weights. In conclusion, the developed GIS model integrates countrywide land-use suitability mapping with stakeholders' wishes and can be used to support discussion and decision-making.
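The suitability-to-preference-to-conflict workflow described above can be sketched in a few lines. This is a toy illustration with invented numbers, not data from the thesis: per-category suitability scores in [0, 1] are weighted (in the thesis the weights would come from stakeholders via AHP), collapsed into low/medium/high preference classes, and cells where two or more categories score "high" are flagged as potential conflict areas.

```python
def preference_class(x):
    """Collapse a continuous preference value into 1=low, 2=medium, 3=high."""
    return 1 if x < 1/3 else (2 if x < 2/3 else 3)

# four grid cells, suitability per land-use category (toy values)
suitability = {
    "urban":        [0.9, 0.2, 0.9, 0.8],
    "agriculture":  [0.7, 0.9, 0.3, 0.2],
    "conservation": [0.1, 0.8, 0.9, 0.7],
}
weights = {"urban": 1.0, "agriculture": 0.9, "conservation": 0.8}  # assumed

# weight each suitability score, then collapse into preference classes
classes = {cat: [preference_class(weights[cat] * s) for s in scores]
           for cat, scores in suitability.items()}

# a cell is a conflict area when two or more categories rate it "high"
high_count = [sum(c[i] == 3 for c in classes.values()) for i in range(4)]
conflict = [n >= 2 for n in high_count]
print(conflict)
```

In a real run the four-element lists would be countrywide rasters, but the per-cell logic is the same.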

Comparison of fuzzy and crisp analytic hierarchy process (AHP) methods for spatial multicriteria decision analysis in GIS

Kordi, Maryam. January 2008.

In many decision-making problems a Geographical Information System (GIS) is employed to organize and facilitate the analysis. Such GIS-based decision problems, which typically involve a number of criteria and alternatives, are generally analysed by multicriteria decision analysis (MCDA). Different locations within a geographical area represent the alternatives by which the overall goal of the project is achieved, and the quality of achieving the goal is evaluated against a set of criteria. The Analytic Hierarchy Process (AHP), a powerful MCDA method, can structure spatial problems and identify which alternatives are most suitable. However, because of intrinsic uncertainty in the method, a number of authors have suggested fuzzifying the AHP, while others argue against fuzzification.

The debate over fuzzifying the AHP is ongoing, but it has been conducted mostly in theory; little, if any, practical comparison between the AHP and the fuzzy AHP has been done. This work presents a practical comparison of the AHP and the fuzzy AHP in a GIS-based case study on locating a dam in Costa Rica, considering several criteria. To run both methods on the GIS-based problem and calculate the criteria weights, computer code was written and developed in MATLAB.

The AHP and fuzzy AHP are compared both on the resulting weights and on the resulting final maps. The weight comparison is repeated at different levels of uncertainty in the fuzzy AHP, and all results are compared with those of the crisp AHP. The study also suggests the Chi-Square test as a suitable tool for checking the effect of fuzzification on the results.

Comparisons between the resulting weights of the two methods show some differences. Furthermore, the Chi-Square test shows that the higher the level of uncertainty in the fuzzy AHP, the greater the difference between the AHP and fuzzy AHP results.
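The crisp AHP weight derivation at the core of this comparison can be sketched as follows, using the row geometric mean method, a standard approximation to Saaty's principal-eigenvector weights. The 3x3 pairwise comparison matrix is invented for illustration; in the thesis the criteria and judgments come from the dam-siting case study.

```python
import math

A = [
    [1.0, 3.0, 5.0],   # criterion 1 vs criteria 1, 2, 3 (Saaty's 1-9 scale)
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]

n = len(A)
geo = [math.prod(row) ** (1 / n) for row in A]   # row geometric means
total = sum(geo)
w = [g / total for g in geo]                     # normalized priority weights

# consistency check: estimate lambda_max from A.w, then compute Saaty's
# consistency index (CI) and consistency ratio (CR); RI = 0.58 for n = 3
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / w[i] for i in range(n)) / n
CI = (lam - n) / (n - 1)
CR = CI / 0.58
print(w, CR)
```

A CR below 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent; the fuzzy AHP variants studied in the thesis replace the crisp entries of A with fuzzy numbers but keep the same overall structure.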

A Systemic Approach Framework for Operational Risk: SAFOR

Kessler, Anna-Maria. January 2007.

This thesis attempts to describe the essential systems features of a complex real-world domain, operational risk (OR) in banking, employing general systems theory (GST) as the guiding method. An implementational framework (SAFOR) is presented for operational risk management (ORM), the target of which is to manage and mitigate the causes of operational losses. Since reasoning about OR is often scenario based, the framework also includes methods for decision-making in addition to Value at Risk (VaR) and Conditional Value at Risk (CVaR); other computational models that yield prediction intervals are discussed as well. Because banking is one of the most mature sectors with respect to OR and has the most data points, the discussion in this thesis revolves around banking institutions. The present state of the art in OR management for banking is surveyed using a systemic-holistic approach, and the model framework is presented against this discussion. Tools and concepts from systems theory and systems thinking are employed to assess systems properties and to gain insight into the interaction of various components, which brings a number of advantages. This does not contradict current proposals such as those of the Basel Committee (Basel II), which does an excellent job of providing state-of-the-art best practice for banking institutions; rather, this thesis offers a complementary perspective, looking at essentially the same problems in a broader context and from a different view. OR data have been hard to come by in banking: confidentiality, difficulties in quantifying OR, and the short time over which data have been gathered in a consistent way are some of the reasons. Therefore, no case study has been done; instead, we examine a published bank application of an advanced OR model. The application shows that the technique holds as validation of the SAFOR modules.
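The two risk measures named above can be computed directly from a loss sample. The sketch below simulates operational losses from a lognormal severity distribution (the parameters are illustrative, not calibrated to any bank) and reads off VaR as an empirical quantile and CVaR as the mean loss in the tail beyond it.

```python
import random

# simulated operational-loss sample; lognormal parameters are invented
random.seed(7)
losses = sorted(random.lognormvariate(10.0, 1.2) for _ in range(100_000))

alpha = 0.99
cut = int(alpha * len(losses))
var = losses[cut]                     # VaR: loss exceeded in ~1% of cases
tail = losses[cut:]
cvar = sum(tail) / len(tail)          # CVaR: expected loss given VaR exceeded
print(f"VaR(99%) = {var:,.0f}  CVaR(99%) = {cvar:,.0f}")
```

CVaR is always at least as large as VaR at the same level, which is one reason it is often preferred for heavy-tailed operational losses: it accounts for how bad the tail is, not just where it starts.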

An applied approach to numerically imprecise decision making

Idefeldt, Jim. January 2007.

Despite the fact that unguided decision-making might lead to inefficient and non-optimal decisions, decisions made at organizational levels seldom utilise decision-analytical tools. Several gaps exist between decision-makers and computer-based decision tools, and a main problem in managerial decision-making is the lack of information and precise objective data, i.e. uncertainty and imprecision may be inherent in the decision situation. We believe this problem can be overcome by providing computer-based decision tools capable of handling the uncertainty inherent in real-life decision-making. At present, nearly all decision-analytic software handles only precise input, and no known software is capable of handling full-scale imprecision, i.e. imprecise probabilities, values and weights in the form of interval and comparative statements. There are theories that can handle certain kinds of uncertainty and that address computational and implementational issues, but if they are never operationalised they are of little real use to a decision-maker. A natural question is therefore how a reasonable decision-analytical framework can be built on prevailing interval methods, dealing with the problems of uncertain and imprecise input, and whether the interval approach will actually prove useful. The framework presented herein covers the theoretical foundations for, and implementations of, imprecise multi-level trees, multi-criteria analysis and risk analysis, together with several different evaluation options. It supports interval probabilities, values and criteria weights, as well as comparative statements, and allows probabilistic and multi-criteria decisions to be mixed. The framework has been field-tested in a number of studies, demonstrating the usefulness of the interval approach.

Supporting Public Health Policy Decision-making through Economic Evaluation: Applications and Methods

Sander, Beate. 11 January 2012.

Economic evaluations of public health programs in Ontario are currently conducted and used by decision-makers only to a very limited extent. This thesis supports public health decision-making through applied and methodological work. The applied work demonstrates different methods for evaluating the cost-effectiveness of public health interventions, using seasonal and pandemic influenza immunization programs as examples; the methodological component explores whether the choice of time horizon, a major methodological consideration in economic evaluations, introduces bias. The economic evaluation of Ontario's universal influenza immunization program (UIIP) uses primarily provincial health administrative databases to assess the impact of UIIP on health outcomes (quality-adjusted life years (QALYs), mortality), health care resource use (physician office visits, emergency department visits, and hospitalizations), and costs due to seasonal influenza; UIIP was found to be cost-effective compared with a targeted program. The economic evaluation of Ontario's H1N1 (2009) mass immunization program uses a mathematical model of the pandemic as observed in Ontario; by removing immunization from the simulation, the impact of the program was evaluated. Outcome measures include health outcomes (attack rate, deaths, QALYs), resource use, and costs (physician office visits, emergency department visits, hospitalizations). The analysis found the mass immunization program to be highly cost-effective despite high program cost. The methodological component demonstrates the existence, magnitude and direction of time horizon bias using a formal model; this work supports current guidelines in recommending a lifetime time horizon and provides a framework for discussing bias in economic evaluations.
This thesis thus demonstrates different approaches to evaluating the cost-effectiveness of public health interventions, informs decision-making, and lays the groundwork for future economic evaluations of public health interventions.
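The cost-effectiveness comparisons above rest on the incremental cost-effectiveness ratio (ICER): the extra cost of the new program divided by the extra QALYs it produces, judged against a willingness-to-pay threshold. A minimal sketch with invented placeholder figures, not results from the thesis:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per QALY gained by the new program over the old."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# hypothetical universal vs targeted program (all numbers invented)
ratio = icer(cost_new=5_200_000, qaly_new=1_250.0,
             cost_old=4_000_000, qaly_old=1_190.0)
threshold = 50_000   # a commonly cited willingness-to-pay per QALY
print(ratio, ratio <= threshold)
```

Here the new program costs an extra $1.2M for 60 extra QALYs, an ICER of $20,000 per QALY; since that is below the threshold, the new program would be deemed cost-effective under these assumed figures.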

An examination of the ethical decision-making processes used in decisions to fund, reduce or cease funding tailored health services

Evoy, Brian. 05 1900.

Health authority administrators were interviewed for their perspectives on what makes a good health care system, on tailored population-specific services as a way to address health inequities, and on how they perceive themselves to be making good funding decisions on the public's behalf. The qualitative descriptive research dataset comprises 24 ninety-minute interviews with administrators from four BC health authorities, health region documents, memos, and field notes. Participants support the continuation of a public health care system, and all acknowledge using tailored services as a route to reducing health inequities; however, these services have not been evaluated for their overall effectiveness. In decision-making, participants describe using a series of governance and bioethical principles that help them frame what issues can be considered and how. Decision situations are framed in a way that indicates whether formal or informal processes are needed; in both cases participants collect information that allows others to see that they have made wise decisions. The Recognition-Primed Decision Model accurately reflects the intuitive processes participants describe using during informal decision-making and parts of formal decision-making; for formal decision situations, however, there is less alignment with the existing decision-analysis literature. Seven recommendations for practice and future research are provided:

1. Increase health authority participation in intersectoral partnerships that address non-medical determinants of health.
2. Develop new strategies for addressing health inequities.
3. Evaluate the efficacy of tailored services beyond their ability to remove barriers to access, and increase focus on testing new strategies for reducing the inequities gap.
4. Enhance existing decision-making processes by including explicit review of decision tradeoffs, value weighting, and mechanisms for requesting revisions.
5. Focus future research on developing and evaluating formal decision-making tools in health authority structures and their relation to decision latitude.
6. Launch a longitudinal study of how health authority expert decision-makers use judgmental heuristics and avoid the negative effects of bias.
7. Commission public dialogue on shifting the current illness-based system to one that is wellness-based.

The Verification of Probabilistic Forecasts in Decision and Risk Analysis

Jose, Victor Richmond. January 2009.

Probability forecasts play an important role in many decision and risk analysis applications. Research and practice have shown that the shift towards distributional forecasts provides a more accurate and appropriate means of capturing risk in models for these applications. Mathematical tools for analyzing the quality of such forecasts, whether they come from experts, models or data, therefore become important to the decision-maker. In this regard, strictly proper scoring rules have been widely studied because they encourage assessors to provide truthful reports. This dissertation contributes to the scoring rule literature in two main areas of assessment: probability forecasts and quantile assessments.

In probability assessment, the scoring rules typically studied in the literature, and commonly used in practice, evaluate probability assessments relative to a default uniform measure. In many applications, a uniform baseline representing some notion of ignorance is inappropriate. This dissertation generalizes the power and pseudospherical families of scoring rules, two large parametric families of commonly used scoring rules, by incorporating a non-uniform baseline distribution for both the discrete and continuous cases. With an appropriate normalization and choice of parameters, these new families relate to various well-known divergence measures from information theory and to well-founded decision models when framed in an expected-utility-maximization context.

In applications where the probability space has an ordinal ranking of states, an important property is sensitivity to distance: scoring rules with this property give higher scores to assessments that allocate more probability mass to events "closer", in some notion of distance, to the one that occurs.
In this setting, we provide an approach for generating new sensitive-to-distance strictly proper scoring rules from well-known strictly proper binary scoring rules. Through the use of weighted scoring rules, these new scores can also incorporate a specified baseline distribution while remaining strictly proper and sensitive to distance.

In the inverse problem of quantile assessment, scoring rules have not yet been well studied or well developed. We examine the differences between scoring rules for probability and quantile assessments, and demonstrate why the tools developed for probability assessments no longer encourage truthful reporting when used for quantile assessments. In addition, we shed light on new properties and characterizations of some of these rules that could guide decision-makers choosing an appropriate scoring rule.
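Two classic strictly proper scoring rules make the ideas above concrete. The sketch below shows the quadratic (Brier) score and the logarithmic score, the latter also in a baseline-relative form of the kind the dissertation generalizes, where the forecast is scored against a non-uniform baseline q rather than the uniform measure. The forecast and baseline numbers are invented for illustration.

```python
import math

def quadratic_score(p, i):
    """Brier/quadratic score for forecast p when outcome i occurs."""
    return 2 * p[i] - sum(pj * pj for pj in p)

def log_score(p, i, q=None):
    """Log score; with a baseline q it scores p relative to q."""
    if q is None:
        return math.log(p[i])
    return math.log(p[i] / q[i])

p = [0.7, 0.2, 0.1]   # forecast
q = [0.5, 0.3, 0.2]   # non-uniform baseline ("climatology")
outcome = 0
print(quadratic_score(p, outcome), log_score(p, outcome, q))

# strict propriety: under true distribution p, reporting p maximizes the
# expected score compared with reporting any other distribution (here q)
truthful  = sum(p[i] * quadratic_score(p, i) for i in range(3))
dishonest = sum(p[i] * quadratic_score(q, i) for i in range(3))
print(truthful > dishonest)
```

A positive baseline-relative log score means the forecast beat the baseline on the realized outcome; the propriety check at the end is the incentive property the dissertation's families are constructed to preserve.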

Managing Uncertainty in Engineering Design Using Imprecise Probabilities and Principles of Information Economics

Aughenbaugh, Jason Matthew. 22 June 2006.

The engineering design community recognizes that decision making is an essential part of the design process. Because decisions are generally made under uncertainty, engineers need appropriate methods for modeling and managing that uncertainty. Two important characteristics of uncertainty in the context of engineering design are imprecision and irreducible uncertainty; to model both, it is valuable to use probabilities that are, in general, imprecise and subjective. Imprecise probabilities generalize traditional precise probabilities: when the available information is extensive, they reduce to precise probabilities. An approach for comparing the practical value of different uncertainty models is developed, based on the principles of information economics: value equals benefits minus costs. The benefits of a model are measured in terms of the quality of the product resulting from the design process; costs are measured not only as direct design costs, but also as the costs of creating and using the model. Using this approach, the practical value of an uncertainty model that explicitly recognizes both imprecision and irreducible uncertainty is demonstrated in a high-risk engineering design example in which the decision-maker has few statistical samples to support the decision. It is also shown that a particular imprecise probability model, probability bounds analysis, generalizes sensitivity analysis, a process for identifying whether a particular decision is robust given the decision-maker's lack of complete information. Finally, an approach for bounding the value of future statistical data samples collected to support design decisions is developed, and specific policies for making decisions in the presence of imprecise information are examined in an engineering context.
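The robustness idea above can be sketched in miniature: when a failure probability is known only to an interval, a design choice is robust to that imprecision only if the same alternative is preferred at both ends of the interval. All numbers below are invented, and this toy treats only a scalar interval rather than full probability boxes:

```python
def expected_cost(p_fail, cost_fail, cost_base):
    """Expected life-cycle cost given a failure probability."""
    return cost_base + p_fail * cost_fail

p_lo, p_hi = 0.01, 0.05   # imprecise failure probability (interval, assumed)

# design A: cheap but fragile; design B: costlier but failure-tolerant
design_a = [expected_cost(p, cost_fail=1_100_000, cost_base=50_000)
            for p in (p_lo, p_hi)]
design_b = [expected_cost(p, cost_fail=200_000, cost_base=90_000)
            for p in (p_lo, p_hi)]

# the decision is robust iff one design dominates at both probability bounds
a_wins = [a < b for a, b in zip(design_a, design_b)]
robust = all(a_wins) or not any(a_wins)
print(design_a, design_b, robust)
```

Here design A wins at the low bound but loses at the high bound, so the decision is not robust: the imprecision matters, and gathering more data (whose value the thesis shows how to bound) is worth considering before committing.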

Modeling and Analysis of Complex Technology Adoption Decisions: An Investigation in the Domain of Mobile ICT

Basole, Rahul C. 10 July 2006.

Mobile information and communication technologies (ICT) promise to significantly transform enterprises, their business processes and services; improve employee productivity, effectiveness and efficiency; and create new competitive advantages and business agility. Despite this plethora of potential benefits, widespread enterprise adoption of mobile ICT has not been as extensive as initially anticipated. Drawing on the extant information systems, technology management, and organizational innovation literature, this dissertation investigates the salient drivers and inhibitors of emerging ICT adoption in general, and mobile ICT in particular, and develops an integrative ICT adoption decision framework. From this synthesis we identify four broad elements that influence an enterprise's decision to adopt mobile ICT: (1) business value, (2) costs and economics, (3) strategic alignment, and (4) enterprise readiness. The last of these has received little theoretical or practical attention. To fill this gap, this dissertation explores the concept of enterprise readiness in further detail and identifies eight key dimensions and their associated assessment indicators. Using a two-stage expert study and an experimental design approach, we empirically validated these dimensions and determined their relative importance. Results indicated that leadership readiness, followed by technology, data and information, and resource readiness, contributed most to enterprise readiness for mobile ICT. The results are implemented in a web-based readiness diagnostic tool (RDT) that enables decision-makers to assess an enterprise's readiness for mobile ICT.
The benefits of the RDT are threefold: first, it guides the decision-maker through the complex readiness assessment space; second, it identifies potential organizational deficiencies and provides a means to assess sources of risk associated with adopting and implementing mobile ICT; and third, it enables decision-makers to benchmark their level of readiness against other organizations. The dissertation concludes by highlighting both theoretical and practical implications for emerging and mobile ICT adoption management and suggesting directions for future research.
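A readiness diagnostic of this kind reduces, at its core, to aggregating dimension scores by their empirically derived importance. The sketch below is hypothetical: it uses only the four dimensions named in the abstract (the study identifies eight), and the weights, scores and threshold are invented, not the study's results.

```python
# relative importance of readiness dimensions (assumed values, sum to 1.0);
# the ordering loosely mirrors the qualitative ranking reported above
weights = {
    "leadership": 0.30, "technology": 0.25,
    "data_information": 0.25, "resources": 0.20,
}
# self-assessed dimension scores (0-100) for one hypothetical enterprise
scores = {
    "leadership": 80, "technology": 60,
    "data_information": 70, "resources": 50,
}

readiness = sum(weights[d] * scores[d] for d in weights)
verdict = "ready" if readiness >= 70 else "gaps to address"
print(readiness, verdict)
```

Beyond the overall score, a tool like the RDT would flag the lowest-scoring dimensions (here resources and technology) as the deficiencies to address before adoption.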
