131

Utilizing agent based simulation and game theory techniques to optimize an individual’s survival decisions during an epidemic

James, Matthew King January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd Easton / History has shown that epidemics can occur at random and without warning, devastating the populations they impact. As a preventative measure, modern medicine has helped to reduce the number of diseases that can instigate such an event; nevertheless, natural and man-made disease mutations continue to place us at risk of an outbreak. As a second line of defense, extensive research has been conducted to better understand spread patterns and the efficacy of various containment and mitigation strategies. However, these simulation models have primarily focused on minimizing the impact to groups of people, from either an economic or a societal perspective, and little study has focused on determining the utility-maximizing strategy for an individual. This work therefore explores the decisions of individuals to determine emergent behaviors and characteristics that lead to an increased probability of survival during an epidemic. It does so by combining linear programming optimization techniques with agent-based simulation, which captures the complexity inherent in most real-world systems through the interactions of individual entities. This research builds on five years of study focused on rural epidemic simulation, resulting in a 4,000-line simulation package. This adaptable simulation can model the interactions of individuals to discern the impact of any general disease type, and can be applied to the population of any set of contiguous counties within Kansas. Furthermore, a computational study of the 17 counties of northwestern Kansas provides game-theoretic insights into which decisions increase the likelihood of survival. For example, statistically significant findings suggest that an individual is four times more likely to become infected if they rush stores for supplies after a government-issued warning instead of remaining at home. This work is a meaningful step toward understanding emergent phenomena during an epidemic and, subsequently, provides novel insight into an individual's utility-maximizing strategy. Understanding the main findings of this research could save your life.
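As a rough illustration of the agent-based approach described above (not the thesis's 4,000-line Kansas simulation), the sketch below compares infection rates under the two behaviours mentioned in the abstract, rushing stores after a warning versus staying home; all parameters and names are assumptions chosen for illustration.

```python
import random

# Minimal agent-based sketch (hypothetical parameters, not the thesis's model):
# agents either rush a shared store after a warning or shelter at home, and we
# compare the resulting infection rates for the two behaviours.

P_TRANSMIT = 0.03      # per-contact transmission probability (assumed)
STORE_CONTACTS = 20    # contacts made during a store run (assumed)
HOME_CONTACTS = 2      # household contacts while sheltering (assumed)
DAYS = 30
N_AGENTS = 1000
INIT_INFECTED = 10

def simulate(rush_stores: bool, seed: int = 0) -> float:
    """Return the fraction of agents infected after DAYS under one behaviour."""
    rng = random.Random(seed)
    infected = [i < INIT_INFECTED for i in range(N_AGENTS)]
    for _ in range(DAYS):
        prevalence = sum(infected) / N_AGENTS
        contacts = STORE_CONTACTS if rush_stores else HOME_CONTACTS
        # probability of at least one infectious contact today
        p_exposed = 1 - (1 - P_TRANSMIT * prevalence) ** contacts
        for i in range(N_AGENTS):
            if not infected[i] and rng.random() < p_exposed:
                infected[i] = True
    return sum(infected) / N_AGENTS

if __name__ == "__main__":
    print("rush stores :", simulate(rush_stores=True))
    print("stay home   :", simulate(rush_stores=False))
```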
132

Suns: a new class of facet defining structures for the node packing polyhedron

Irvine, Chelsea Nicole January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Graph theory is a widely researched topic. A graph contains a set of nodes and a set of edges. The nodes often represent resources such as machines, employees, or plant locations, and each edge represents a relationship between a pair of nodes, such as time, distance, or cost. Integer programs (IPs) are frequently used to solve graph problems. Unfortunately, IPs are NP-hard, so unless P = NP they require superpolynomial effort to solve. Much research has focused on reducing the time required to solve IPs through the use of valid inequalities, or cutting planes; the theoretically strongest cutting planes are facet defining. This research focuses on the node packing problem, also known as the independent set problem, a combinatorial optimization problem in which the goal is to select the maximum number of nodes such that no two selected nodes are adjacent. Node packing has been applied to airline traffic and radio frequency assignment. This thesis introduces a new class of graphical structures called suns. Suns produce previously undiscovered valid inequalities for the node packing polyhedron, and conditions are provided under which these valid inequalities are proven to be facet defining. Sun valid inequalities have the potential to solve node packing problems more quickly and could be extended to general integer programs through conflict graphs.
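For reference, the node packing (maximum independent set) problem whose polyhedron these inequalities strengthen is the standard integer program below; the sun inequalities themselves are defined in the thesis and are not reproduced here.

```latex
% Node packing (maximum independent set) IP on a graph G = (V, E),
% with x_v = 1 if node v is selected:
\begin{align*}
\max \quad & \sum_{v \in V} x_v \\
\text{s.t.} \quad & x_u + x_v \le 1 \qquad \forall \{u, v\} \in E \\
& x_v \in \{0, 1\} \qquad\;\, \forall v \in V
\end{align*}
```

Valid inequalities such as the sun inequalities are added to this formulation to tighten its linear relaxation.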
133

Octanary branching algorithm

Bailey, James Patrick January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Integer programs (IPs) are a class of discrete optimization problems that have been used commercially to improve various systems. IPs are often used to reach an optimal financial objective subject to constraints based on resources, operations, and other restrictions. While incredibly beneficial, IPs have been shown to be NP-complete, and many practical IPs remain unsolved. Traditionally, Branch and Bound (BB) has been used to solve IPs. BB is an iterative algorithm that implicitly enumerates all potential integer solutions for a given IP and can guarantee an optimal solution, if one exists, in finite time. However, BB can require an exponential number of nodes to be evaluated before terminating; as a result, it can exhaust a computer's memory or take excessively long to find a solution. This thesis introduces a modified BB scheme called the Octanary Branching Algorithm (OBA). OBA creates eight children at each iteration to partition the feasible region of the IP's linear relaxation more effectively, and it introduces equality constraints in four of the children to reduce the dimension of the remaining nodes. OBA can guarantee an optimal solution, if one exists, in finite time, and it has been shown to have some theoretical improvements over traditional BB. During computational tests, OBA found the first, second, and third integer solutions with 64.8%, 27.9%, and 29.3% fewer nodes evaluated, respectively, than CPLEX, and these solutions were 44.9%, 54.7%, and 58.2% closer to the optimal solution, respectively. It is recommended that commercial solvers incorporate OBA in the initialization and random diving phases of BB.
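For context, here is a minimal sketch of the classical LP-relaxation branch-and-bound loop that OBA modifies. It uses ordinary binary branching and an illustrative two-variable IP, not the octanary partition or equality constraints described in the thesis.

```python
import math
import numpy as np
from scipy.optimize import linprog

# Generic LP-based branch and bound for max c^T x, A x <= b, bounds on x, x integer.
# This is the classical binary-branching baseline, shown only as a sketch with
# made-up data; it is not the octanary scheme from the thesis.

def branch_and_bound(c, A, b, ub):
    best_val, best_x = -math.inf, None
    stack = [[(0, u) for u in ub]]           # each node is a list of variable bounds
    while stack:
        bounds = stack.pop()
        res = linprog(-np.asarray(c), A_ub=A, b_ub=b, bounds=bounds, method="highs")
        if not res.success or -res.fun <= best_val:
            continue                          # infeasible node, or pruned by bound
        frac = [i for i, v in enumerate(res.x) if abs(v - round(v)) > 1e-6]
        if not frac:                          # integer solution found
            best_val, best_x = -res.fun, np.round(res.x)
            continue
        i, v = frac[0], res.x[frac[0]]        # branch on the first fractional variable
        lo, hi = bounds[i]
        left = list(bounds); left[i] = (lo, math.floor(v))
        right = list(bounds); right[i] = (math.ceil(v), hi)
        stack.extend([left, right])
    return best_val, best_x

if __name__ == "__main__":
    # max 5x1 + 4x2  s.t.  6x1 + 4x2 <= 24,  x1 + 2x2 <= 6,  x integer
    print(branch_and_bound([5, 4], [[6, 4], [1, 2]], [24, 6], ub=[10, 10]))
```

OBA replaces the two-way split in the branching step with an eight-way partition in which four children fix variables through equality constraints.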
134

The Development and Evaluation of a Model of Time-of-arrival Uncertainty

Hooey, Becky 13 April 2010 (has links)
Uncertainty is inherent in complex socio-technical systems such as in aviation, military, and surface transportation domains. An improved understanding of how operators comprehend this uncertainty is critical to the development of operations and technology. Towards the development of a model of time of arrival (TOA) uncertainty, Experiment 1 was conducted to determine how air traffic controllers estimate TOA uncertainty and to identify sources of TOA uncertainty. The resulting model proposed that operators first develop a library of speed and TOA profiles through experience. As they encounter subsequent aircraft, they compare each vehicle’s speed profile to their personal library and apply the associated estimate of TOA uncertainty. To test this model, a normative model was adopted to compare inferences made by human observers to the corresponding inferences that would be made by an optimal observer who had knowledge of the underlying distribution. An experimental platform was developed and implemented in which subjects observed vehicles with variable speeds and then estimated the mean and interval that captured 95% of the speeds and TOAs. Experiments 2 and 3 were then conducted and revealed that subjects overestimated TOA intervals for fast stimuli and underestimated TOA intervals for slow stimuli, particularly when speed variability was high. Subjects underestimated the amount of positive skew of the TOA distribution, particularly in slow/high variability conditions. Experiment 3 also demonstrated that subjects overestimated TOA uncertainty for short distances and underestimated TOA uncertainty for long distances. It was shown that subjects applied a representative heuristic by selecting the trained speed profile that was most similar to the observed vehicle’s profile, and applying the TOA uncertainty estimate of that trained profile. Multiple regression analyses revealed that the task of TOA uncertainty estimation contributed the most to TOA uncertainty estimation error as compared to the tasks of building accurate speed models and identifying the appropriate speed model to apply to a stimulus. Two systematic biases that account for the observed TOA uncertainty estimation errors were revealed: Assumption of symmetry and aversion to extremes. Operational implications in terms of safety and efficiency for the aviation domain are discussed.
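A small numerical sketch of the normative benchmark described above: given an assumed speed distribution, the implied TOA distribution is right-skewed, so an optimal observer's 95% interval is asymmetric about the mean. The distance and speed parameters below are illustrative, not the experimental stimuli.

```python
import numpy as np

# Normative (optimal-observer) benchmark sketch: TOA = distance / speed, so a
# symmetric speed distribution implies a positively skewed TOA distribution.
# Distance and speed parameters are illustrative assumptions.

rng = np.random.default_rng(0)
distance_km = 10.0
speeds = rng.normal(loc=60.0, scale=12.0, size=100_000)   # km/h, high-variability case
speeds = speeds[speeds > 5.0]                              # discard implausible speeds
toa_min = 60.0 * distance_km / speeds                      # minutes to arrival

mean_toa = toa_min.mean()
lo, hi = np.percentile(toa_min, [2.5, 97.5])
print(f"mean TOA     : {mean_toa:5.1f} min")
print(f"95% interval : [{lo:5.1f}, {hi:5.1f}] min")
print(f"upper tail {hi - mean_toa:4.1f} min vs lower tail {mean_toa - lo:4.1f} min (positive skew)")
```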
135

Validating Integrated Human Performance Models Involving Time-critical Complex Systems

Gore, Brian 29 April 2010 (has links)
The current research sets out to demonstrate a comprehensive approach to validating complex human performance models as applied to time-sensitive tasks. This document is divided into four sections. Section 1 (Chapters 1-3) outlines previous efforts in the literature to validate complex human performance models, with an emphasis on manual control models, task network models, cognitive models, and integrated architectures. Section 2 (Chapters 4-7) elaborates on a validation approach and applies it to a baseline model of a complex task in the air traffic control domain. Section 3 (Chapters 7-12) outlines the importance of adopting an iterative model development and validation process and reports on three model iterations undertaken to improve the validity of the baseline model; each model augmentation was validated using the same approach and measures defined in Section 2. Section 4 (Chapters 13-14) provides a discussion and interpretation of the model results and highlights contributions to the fields of model validation and human performance modelling of complex systems.
136

Prostate Cancer Websites: One Size Does Not Fit All

Witteman, Holly 05 September 2012 (has links)
A North American man has approximately a one in six chance of being diagnosed with prostate cancer in his lifetime. In most cases, there is no clearly optimal treatment, so he may be invited to participate in a treatment decision between several medically reasonable options, each with potential short- and long-term side effects. Information needs are high at diagnosis and can continue to be elevated for years or decades. Many men and their families seek information online, where, due partly to the array of websites available and high variation in information preferences, it can be difficult to find personally relevant and useful websites. This research sought to address this issue by developing methods to categorize prostate cancer websites and exploring quantitative and qualitative relationships between websites, information-seekers, and individuals’ assessments of websites. The research involved a series of three studies. In the first study, 29 men with prostate cancer participated in a needs assessment involving questionnaires, an interview, and interaction with a prototype website. In the second study, a detailed classification system was developed and applied to a set of forty websites selected to be representative of the variety of prostate cancer websites available. The third (online) study collected clinical, cognitive, and psychosocial details from 65 participants along with their ratings of websites from study two. A number of hypotheses were tested. One finding was that, compared to men with greater trust, men with lower trust in their physician tended to judge commercial websites as less relevant and useful, and found websites with descriptions of personal experiences more relevant and useful. Analyses also addressed a number of exploratory questions, including whether website and individual attributes might predict preferences for websites. Using discriminant analysis on 80% of the data, two functions were identified that predicted ratings significantly better than chance. These relationships were then validated with 20% of the data held back for testing. The results are discussed in terms of their implications for information tailoring and recommender systems for prostate cancer patients searching for information online. Limitations of the current research and recommendations for future research are also presented.
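A sketch of the 80/20 hold-out validation of a discriminant analysis like the one described above; the features, labels, and sample sizes below are synthetic placeholders rather than the study's measures.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Sketch: discriminant analysis predicting website ratings from website and reader
# attributes, trained on 80% of the data and checked on the held-back 20%.
# Features and labels are synthetic placeholders, not the study's variables.

rng = np.random.default_rng(1)
n = 400
X = np.column_stack([
    rng.integers(0, 2, n),        # e.g., commercial website? (0/1, placeholder)
    rng.integers(0, 2, n),        # e.g., contains personal experiences? (placeholder)
    rng.normal(0, 1, n),          # e.g., reader's trust in physician (standardized)
])
# Placeholder rating classes (low / medium / high relevance), loosely tied to the features
score = 0.8 * X[:, 2] - 0.6 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n)
y = np.digitize(score, [-0.5, 0.5])   # 0, 1, 2

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("hold-out accuracy:", round(lda.score(X_test, y_test), 3))
print("chance level     :", round(max(np.bincount(y_test)) / len(y_test), 3))
```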
137

Interventions to Mitigate the Effects of Interruptions During High-risk Medication Administration

Prakash, Varuna 13 January 2011 (has links)
Research suggests that interruptions are ubiquitous in healthcare settings and have a negative impact on patient safety. However, there is a lack of solutions to reduce harm arising from interruptions. Therefore, this research aimed to design and test the effectiveness of interventions to mitigate the effects of interruptions during medication administration. A three-phased study was conducted. First, direct observation was conducted to quantify the state of interruptions in an ambulatory unit where nurses routinely administered high-risk medications. Secondly, a user-centred approach was used to design interventions targeting errors arising from these interruptions. Finally, the effectiveness of these interventions was evaluated through a high-fidelity simulation experiment. Results showed that medication administration error rates decreased significantly on 4 of 7 measures with the use of interventions, compared to the control condition. Results of this work will help guide the implementation of interventions in nursing environments to reduce medication errors caused by interruptions.
138

Analysis of Make(Repair)-to-stock Queues with State-dependent Arrival Rates

Liang, William Kun 14 December 2011 (has links)
In this thesis, we study the repair shop scheduling problem (repair-to-stock) and the production/inventory system pricing and production scheduling problem (make-to-stock). For both types of problems, we compare the performance of different scheduling policies; for the make-to-stock problem, we also study the performance of different pricing strategies. The optimal repair/production scheduling policy for both problems is difficult to characterize, so each problem is formulated as a Markov Decision Process only to numerically compute the optimal cost or profit. As an alternative, we propose a dynamic Myopic policy, which is easy to implement. Our numerical study demonstrates that, for the repair-to-stock problem, the Myopic policy outperforms the alternative policies and yields costs very close to optimal. For the make-to-stock problems, however, the Myopic policy does not outperform the alternative policies when a dynamic pricing strategy is implemented.
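A minimal sketch of the kind of Markov Decision Process benchmark described above: a single-item make-to-stock queue in which the action at each inventory level is whether to produce, solved by value iteration after uniformization. All rates and costs are illustrative assumptions; the thesis's models (state-dependent arrivals, pricing decisions) are richer than this.

```python
import numpy as np

# Discounted-cost MDP sketch for a single-item make-to-stock queue with lost sales,
# solved by value iteration after uniformization. Rates and costs are assumptions.

lam, mu = 0.8, 1.0          # demand rate, production rate (assumed)
h, p = 1.0, 20.0            # holding cost per unit-time, lost-sale penalty (assumed)
beta = 0.1                  # continuous-time discount rate
S_MAX = 20                  # inventory cap (large enough not to bind here)
Lambda = lam + mu           # uniformization rate
states = np.arange(S_MAX + 1)

V = np.zeros(S_MAX + 1)
for _ in range(2000):       # value iteration
    cost = (h * states + lam * p * (states == 0)) / (beta + Lambda)
    V_demand = V[np.maximum(states - 1, 0)]
    V_prod = V[np.minimum(states + 1, S_MAX)]
    produce = cost + (lam * V_demand + mu * V_prod) / (beta + Lambda)
    idle = cost + (lam * V_demand + mu * V) / (beta + Lambda)
    V_new = np.minimum(produce, idle)
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new

policy = np.where(produce <= idle, "produce", "idle")
base_stock = int(np.argmax(policy == "idle"))   # first level at which idling is optimal
print("optimal policy is base-stock with level", base_stock)
```

With holding and lost-sale costs of this form, the optimal policy comes out as a base-stock (produce-up-to) rule, which is the kind of structured benchmark a Myopic policy is typically compared against.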
139

The Effects of Distractions and Driver's Age on the Type of Crash and the Injury Severity Sustained by Occupants Involved in a Crash

Zishu, Liu 31 July 2012 (has links)
This thesis investigates the associations between crash outcomes, the existence and type of driver distraction, and driver's age. The crash outcomes considered are the type of crash and the injury severity sustained by occupants involved in the crash. An ordered logit model was built to predict the likelihood of severe injuries, and a multinomial model was developed to predict the likelihood that a driver will be involved in one of three common crash types: singular, angular, and rear-end. In these models, various factors (e.g., weather, driver's gender, and speeding) were statistically controlled for, but the main focus was on the interaction of driver's age and distraction type. The findings of this thesis have implications for policy making and for prioritizing the capabilities of distraction-related safety systems.
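A sketch of how the two model types can be fitted with statsmodels (OrderedModel for ordered injury severity, MNLogit for crash type), including an age-by-distraction interaction term; all data below are synthetic placeholders, not the thesis's crash records.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Sketch: ordered logit for injury severity and multinomial logit for crash type,
# each with an age-by-distraction interaction. All data are synthetic placeholders.

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "distracted": rng.integers(0, 2, n),       # any distraction present (0/1)
    "older_driver": rng.integers(0, 2, n),     # older driver indicator (placeholder cutoff)
    "speeding": rng.integers(0, 2, n),
})
df["age_x_distraction"] = df["older_driver"] * df["distracted"]

# Synthetic outcomes loosely tied to the predictors
latent = (0.6 * df["distracted"] + 0.4 * df["speeding"]
          + 0.5 * df["age_x_distraction"] + rng.logistic(size=n))
df["severity"] = pd.cut(latent, [-np.inf, 0.5, 2.0, np.inf],
                        labels=["minor", "major", "fatal"])
df["crash_type"] = rng.choice(["single-vehicle", "angle", "rear-end"], size=n)

predictors = df[["distracted", "older_driver", "speeding", "age_x_distraction"]]

# Ordered logit: likelihood of more severe injury (OrderedModel fits its own thresholds,
# so no constant column is added)
ordered = OrderedModel(df["severity"], predictors, distr="logit").fit(method="bfgs", disp=False)
print(ordered.summary())

# Multinomial logit: likelihood of each crash type
mnl = sm.MNLogit(pd.Categorical(df["crash_type"]).codes, sm.add_constant(predictors)).fit(disp=False)
print(mnl.summary())
```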
