101

System Performance Analysis Considering Human-related Factors

Kiassat, Ashkan Corey 08 August 2013 (has links)
All individuals are unique in their characteristics. As such, their positive and negative contributions to system performance differ. In any system that is not fully automated, the effect of the human participants has to be considered when optimizing system performance. Humans are intelligent and adaptive, and they learn over time. At the same time, humans are error-prone. Therefore, in situations where human and hardware have to interact and complement each other, the system faces both advantages and disadvantages from the role the humans play. It is this role and its effect on performance that is the focus of this dissertation. When analyzing the role of people, one can focus on providing resources that enable the human participants to produce more. Alternatively, one can strive to reduce the frequency and impact of errors. The focus of the analysis in this dissertation is the latter. Our analysis can be categorized into two parts. In the first part, we consider a short-term planning horizon and focus directly on failure risk analysis. What can be done about the risk stemming from the human participant? Any proactive steps that can be taken have the advantage of reducing risk, but also carry a cost. We develop a cost-benefit analysis to enable a decision-maker to choose the optimal course of action for revenue maximization. We proceed to use this model to calculate the minimum acceptable level of risk, and the associated skill level, that ensure system profitability. The models developed are applied to a case study from a manufacturing company in Ontario, Canada. In the second part of our analysis, we consider a longer planning horizon and focus on output maximization. Human learning, and its effect on output, is considered.
In the first model we develop, we use learning curves and production forecasting models to optimally assign operators in order to maximize system output. In the second model, we combine failure risk analysis with learning curves to forecast the total production of operators. As in the first part of our analysis, we apply the output maximization models to the aforementioned case study to better demonstrate the concepts.
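The cost-benefit logic described above can be sketched in a few lines: compare candidate risk-mitigation actions by the expected revenue each leaves after its cost and residual failure risk. This is a hedged illustration with invented numbers and a hypothetical `expected_revenue` form, not the dissertation's actual model:

```python
# Hedged sketch (not the dissertation's model): choose among candidate
# risk-mitigation actions by expected-revenue maximization.

def expected_revenue(price, output, failure_risk, failure_cost, action_cost):
    """Expected revenue when a human-related failure occurs with
    probability `failure_risk`; all parameters are illustrative."""
    return price * output - failure_risk * failure_cost - action_cost

# Candidate actions: (label, residual risk after action, cost of action)
actions = [
    ("do nothing",     0.20, 0.0),
    ("extra training", 0.08, 500.0),
    ("add inspection", 0.05, 900.0),
]

best = max(actions, key=lambda a: expected_revenue(10.0, 1000, a[1], 8000.0, a[2]))
print(best[0])  # → extra training
```

With these (invented) figures, the mid-cost action wins: the risk reduction it buys is worth more than its cost, while the stronger action over-pays for the last few points of risk.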
102

A Generic Simulation-based Perioperative Decision Support Tool for Tactical Decisions

Sniekers, Daphne 13 August 2013 (has links)
In Canada and around the world, there has been an increased focus on the efficiency, cost and access to health care services. One area of particular focus is surgical procedures, often with government funding and policies focused on reducing wait times through pay-for-performance and volume target initiatives. In Ontario, an expert panel was assembled to evaluate the current state of surgical processes and provide recommendations to improve access, efficiency and quality. This thesis addresses the panel's recommendation for a simulation-based decision tool to help hospitals inform decisions that can lead to improved access and efficiency. A generalised, simulation-based perioperative decision tool is presented that can be used to test a variety of tactical decisions. The generic model has been applied to six hospitals of varying sizes, ranging from large academic centres to small rural community hospitals. The model remains in use at some of the hospitals to regularly inform decisions, and is also being applied to additional hospital sites. During application of the generic model, challenges in design decisions and validation were encountered. As a result, a series of principles are proposed to guide future generic model design and to achieve user acceptance. These principles add to the generic simulation modelling and healthcare modelling research fields by laying some groundwork for a formalised approach to designing effective generic simulation models and achieving confidence in results. Finally, the research demonstrates two uses of the generic model: as a decision tool and as a demonstrative tool. As a decision tool, the model is used to compare numerous potential tactical decision options under consideration. As a demonstrative tool, the model is used to quantify the effect of poor practices on hospital performance. The design of the generic model only considers efficient processes and best practices.
When model results are compared to historical performance, decision makers are able to quantify the effect of existing poor practices on their performance and decision making. The tool enables users to base their tactical level decisions on the assumption that good practices and procedures are followed.
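A toy stand-in for the kind of perioperative capacity question such a tool answers: assign surgical cases to the earliest-free operating room and read off when each case starts. All numbers are invented, and a real simulation-based tool would model far more (stochastic durations, recovery beds, staffing); this is only a sketch of the scheduling core:

```python
import heapq

def simulate_or_schedule(durations, n_rooms):
    """Assign each surgery (in list order) to the earliest-free room and
    return per-case start times. A deterministic toy stand-in for a full
    discrete-event perioperative simulation."""
    rooms = [0] * n_rooms            # time at which each room becomes free
    heapq.heapify(rooms)
    starts = []
    for d in durations:
        free_at = heapq.heappop(rooms)
        starts.append(free_at)       # case starts when a room frees up
        heapq.heappush(rooms, free_at + d)
    return starts

# Six cases (hours) in two operating rooms: later cases wait for capacity.
print(simulate_or_schedule([2, 3, 1, 4, 2, 2], 2))  # → [0, 0, 2, 3, 3, 5]
```

Comparing such model output against historical start times is the kind of gap-to-best-practice quantification the abstract describes, in miniature.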
103

Interface Design for an Automated Combat Identification System: Displaying Reliability Information

Neyedli, Heather 15 February 2010 (has links)
Users have difficulty relying appropriately on automated combat identification aids; however, verbally informing users of the automation's reliability has helped them rely on the automation more appropriately. A number of interfaces that display automation reliability information in real time were developed and tested. In Experiment I, participants used the interfaces in the IMMERSIVE simulation, a first-person shooter game. The results showed that the form of the interface affected both reliance on the automation and sensitivity in discriminating hostile from friendly targets. The differences in sensitivity and reliance may be attributed to how participants allocated their attention among the displays. In Experiment II, still combat scenes were presented to participants for 400 or 800 milliseconds (as opposed to 10 seconds in Experiment I) to place additional time stress on attention resources. The results replicated those of Experiment I, but sensitivity measures showed a dependence on the reliability of the automation.
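Sensitivity in discriminating hostile from friendly targets is conventionally measured with d' from signal detection theory: the difference between the z-scores of the hit rate and the false-alarm rate. A sketch of that standard computation, with invented counts (the abstract does not report raw data):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' = z(hit rate) - z(false-alarm rate), with a standard
    log-linear correction so rates of 0 or 1 don't yield infinite z-scores."""
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hr) - z(far)

# Invented counts: 40 hits / 10 misses on hostiles, 5 false alarms /
# 45 correct rejections on friendlies.
print(d_prime(40, 10, 5, 45))
```

Higher d' means better discrimination regardless of response bias, which is why it separates cleanly from the reliance measures the experiments also tracked.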
104

Semantic Integration of Time Ontologies

Ong, Darren 15 December 2011 (has links)
Here we consider the verification and semantic integration of the set of first-order time ontologies by Allen-Hayes, Ladkin, and van Benthem that axiomatize time as points, intervals, or a combination of both, within an ontology repository environment. Semantic integration of the set of time ontologies is explored via the notion of theory interpretations, using an automated reasoner as part of the methodology. We use the notion of representation theorems for verification by characterizing the models of each ontology up to isomorphism and proving that they are equivalent to the intended structures for the ontology. We provide a complete account of the meta-theoretic relationships between the ontologies, along with corrections to their axioms, translation definitions, proofs of representation theorems, and a discussion of issues such as class-quantified interpretations, the impact of namespacing support in Common Logic, and ontology repository support for semantic integration as it relates to the time ontologies examined.
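As a concrete anchor for the interval-based ontologies mentioned above, the sketch below checks by exhaustive enumeration over a small point set that Allen's 13 basic interval relations are jointly exhaustive and pairwise disjoint. This is an illustration of the intended point/interval structures, not the thesis's actual methodology (which uses theory interpretations and an automated reasoner):

```python
def allen_relation(a, b):
    """Classify the ordered pair of intervals (a, b) into exactly one of
    Allen's 13 basic relations; intervals are (start, end) with start < end."""
    (s1, e1), (s2, e2) = a, b
    if e1 < s2: return "before"
    if e2 < s1: return "after"
    if e1 == s2: return "meets"
    if e2 == s1: return "met-by"
    if s1 == s2 and e1 == e2: return "equals"
    if s1 == s2: return "starts" if e1 < e2 else "started-by"
    if e1 == e2: return "finishes" if s1 > s2 else "finished-by"
    if s2 < s1 and e1 < e2: return "during"
    if s1 < s2 and e2 < e1: return "contains"
    if s1 < s2: return "overlaps"
    return "overlapped-by"

# All intervals over the linearly ordered points 0..4; every ordered pair
# falls under exactly one relation, and all 13 relations occur.
intervals = [(s, e) for s in range(5) for e in range(s + 1, 5)]
relations = {allen_relation(a, b) for a in intervals for b in intervals}
print(len(relations))  # → 13
```

The if-chain makes the relations pairwise disjoint by construction; enumeration over a finite linear order shows each one is realized, the finite analogue of the intended interval structures.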
105

Quantifying and mitigating decentralized decision making in humanitarian logistics systems

Muggy, Timothy Luke January 1900 (has links)
Doctor of Philosophy / Department of Industrial & Manufacturing Systems Engineering / Jessica L. Heier Stamm / Humanitarian and public health logistics systems are often characterized by decentralized decision makers, in the form of the response agencies who establish supply chains and the beneficiaries who access them. While classical models assume a single decision maker with a global objective and authority, decentralized systems consist of multiple decision makers, each with its own objective and scope of control. The literature demonstrates that decentralized systems often perform poorly when compared to their hypothetical centralized counterparts. However, there exist few models in the literature to quantify the impact of decentralization, and mechanisms for its mitigation are lacking. This research advances knowledge of decentralized systems through new game theory and optimization models, solution methodologies, and theoretical characterizations of system performance. First, the author presents a literature review that synthesizes research regarding the facets of humanitarian operations that can benefit from the application of game theory. The author finds that models of decentralized behavior lack realism, neglecting sources of uncertainty, dynamism, and the personal preferences that influence individuals' decisions. These findings motivate the remaining components of the thesis. Next, the author focuses on decentralization on the part of response agencies who open service facilities. Decentralization can adversely impact patient access and equity, both critical factors in humanitarian contexts. A dynamic, robust facility location model is introduced to enable a comparison between a given decentralized response and a hypothetical coordinated response using identical resources. The value of the model is demonstrated through a computational study of the response to a recent cholera epidemic.
Finally, the author introduces game theory models that represent the decisions of beneficiaries seeking relief. The models account for distance, congestion, and the relative importance an individual places on the two. The author constructs an algorithm that computes a decentralized solution in polynomial time. The author quantifies decentralized system performance in comparison to centralized control, bounding the cost of decentralized decision making for the least and most costly outcomes. The author identifies coordination mechanisms that encourage centrally optimal decisions within decentralized systems.
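The beneficiary model described above (cost as distance plus weighted congestion) can be illustrated with best-response dynamics on a singleton congestion game, which converge because such games admit an exact potential function. The instance, the weight, and the loop are all invented for illustration; this is not the author's polynomial-time algorithm:

```python
def best_response_equilibrium(dist, weight):
    """dist[i][f]: distance from beneficiary i to facility f; i's cost at f
    is dist[i][f] + weight * (occupancy of f including i). Iterate best
    responses until no one wants to switch, i.e. a pure Nash equilibrium
    (guaranteed to exist here via an exact potential function)."""
    n_agents, n_fac = len(dist), len(dist[0])
    choice = [0] * n_agents
    load = [0] * n_fac
    load[0] = n_agents               # everyone starts at facility 0
    changed = True
    while changed:
        changed = False
        for i in range(n_agents):
            j = choice[i]
            load[j] -= 1             # agent steps out before re-evaluating
            cost = lambda f: dist[i][f] + weight * (load[f] + 1)
            best = min(range(n_fac), key=cost)
            if cost(best) < cost(j):
                choice[i], changed, j = best, True, best
            load[j] += 1
    return choice

# Three beneficiaries, two facilities: congestion spreads them out even
# though agents 0 and 1 both prefer facility 0 by distance.
print(best_response_equilibrium([[1, 5], [1, 5], [4, 1]], weight=3))  # → [0, 0, 1]
```

Comparing the equilibrium's total cost against a centrally optimized assignment of the same agents is the "cost of decentralization" comparison the abstract bounds.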
106

A study of multiple attributes decision making methods facing uncertain attributes

Amini, Mohammadhossein January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Shing I. Chang / Many decision-making methods have been developed to help decision makers (DMs) make efficient decisions. One class of decision-making methods involves selecting the best choice among alternatives based on a set of criteria. Multiple Attribute Decision-Making (MADM) methods determine the optimal alternative based on multiple attributes. This research aims to overcome two concerns in current MADM methods: uncertainty of attributes and sensitivity of ranking results. Depending on the availability of information about the attributes, a DM may be certain or uncertain in his or her judgment of the alternatives. Researchers have introduced the use of linguistic terms or uncertain intervals to tackle such uncertainty. This study provides an integrated approach to modeling uncertainty in one of the most popular MADM methods: TOPSIS (Technique for Order Preference by Similarity to Ideal Solution). Current MADM methods also provide a final ranking of alternatives under consideration, and the final solution is based on a calculated score assigned to each alternative. Results have shown that the final scores of alternatives may be close to each other under uncertain attributes, yet current methods rank alternatives strictly by those scores. This creates a sensitivity issue in forming the ranking list. The proposed method addresses this problem by simulating random numbers within the uncertain intervals of the decision matrix; the outcome is a ranking distribution for the alternatives. The proposed method is based on TOPSIS, which defines the best and worst solution for each attribute and defines the best alternative as the one closest to the best and farthest from the worst solution. Random number distributions were studied under the proposed simulation approach; results showed that a triangular distribution provides better ranking results than a uniform distribution.
A case study of building design selection considering resiliency and sustainability attributes is presented to demonstrate use of the proposed method. The study showed that the proposed method can provide better decision options for designers due to its ability to consider uncertain attributes. In addition, using the proposed method, a DM can observe the final ranking distribution resulting from uncertain attribute values.
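A minimal sketch of the approach as described: classical TOPSIS applied to decision matrices sampled from uncertain intervals (triangular sampling, which the abstract found superior to uniform), yielding a win-count distribution rather than a single ranking. All intervals and weights below are invented:

```python
import random

def topsis_best(matrix, weights):
    """One TOPSIS pass: return the index of the alternative closest to the
    ideal and farthest from the anti-ideal. All attributes are treated as
    benefit criteria for simplicity."""
    n_alt, n_att = len(matrix), len(matrix[0])
    norms = [sum(matrix[i][j] ** 2 for i in range(n_alt)) ** 0.5
             for j in range(n_att)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_att)]
         for i in range(n_alt)]
    best = [max(col) for col in zip(*v)]
    worst = [min(col) for col in zip(*v)]
    def closeness(row):
        d_best = sum((x - b) ** 2 for x, b in zip(row, best)) ** 0.5
        d_worst = sum((x - w) ** 2 for x, w in zip(row, worst)) ** 0.5
        return d_worst / (d_best + d_worst)
    scores = [closeness(r) for r in v]
    return max(range(n_alt), key=scores.__getitem__)

# Uncertain attributes as (low, high) intervals; sample triangular values
# (mode at the midpoint) and tally how often each alternative ranks first.
random.seed(1)
intervals = [[(7, 9), (5, 8)],   # alternative 0
             [(6, 10), (6, 7)]]  # alternative 1
wins = [0, 0]
for _ in range(1000):
    m = [[random.triangular(lo, hi) for lo, hi in row] for row in intervals]
    wins[topsis_best(m, weights=[0.6, 0.4])] += 1
print(wins)
```

With overlapping intervals neither alternative wins every draw, so the output is a ranking distribution, exactly the kind of sensitivity information a single-score ranking hides.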
107

Consensus in group decision making under linguistic assessments

Chen, Zhifeng January 1900 (has links)
Doctor of Philosophy / Department of Industrial and Manufacturing Systems Engineering / David Ben-Arieh / Group decision-making is an essential activity in many domains, such as the financial, engineering, and medical fields. Group decision-making solicits opinions from experts and combines these judgments into a coherent group decision. Experts typically express their opinions in many different formats belonging to two categories: quantitative evaluations and qualitative ones. Often experts cannot express judgments in accurate numerical terms and instead use linguistic labels or fuzzy preferences. The use of linguistic labels makes expert judgment more reliable and informative for decision-making. In this research, a new linguistic label fusion operator has been developed. The operator helps map one set of linguistic labels into another, giving decision makers more freedom to choose their own linguistic preference labels with different granularities and/or associated membership functions. Three new consensus measure methods have been developed for group decision-making problems in this research: one is a Markov-chain-based consensus measure method, another is order based, and the last is a similarity-based consensus measure approach. Also in this research, the author extended the concept of the Ordered Weighted Average (OWA) into a fuzzy linguistic OWA (FLOWA). This aggregation operator is more detailed and includes more information about the aggregate than existing direct methods. After measuring the current consensus, we provide a method for experts to modify their evaluations to improve the consensus level. A cost-based analysis gives the least-cost suggestion for this modification and generates a least cost of group consensus. In addition, the author developed an optimization method to maximize two types of consensus under a budget constraint.
Finally, an analysis of consensus utilization provides a practical recommendation for the desired level of consensus in light of its cost benefits.
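The crisp Ordered Weighted Average that FLOWA extends is easy to state: weights attach to rank positions of the sorted arguments, not to fixed sources. A minimal non-fuzzy sketch with invented expert scores (the fuzzy linguistic extension in the thesis operates on labels rather than crisp numbers):

```python
def owa(values, weights):
    """Ordered Weighted Average: weights apply to the values sorted in
    descending order, not to fixed argument positions."""
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# With weights (0.5, 0.3, 0.2) the largest value counts most, regardless
# of which expert supplied it: 0.5*0.9 + 0.3*0.6 + 0.2*0.4
print(owa([0.4, 0.9, 0.6], [0.5, 0.3, 0.2]))
```

Choosing the weight vector tunes the operator between pure max (1, 0, 0), plain average (1/3 each), and pure min (0, 0, 1), which is what makes OWA a flexible aggregation family for combining expert judgments.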
108

Modeling, analysis, and experimental investigations of grinding processes

Li, Zhichao January 1900 (has links)
Doctor of Philosophy / Department of Industrial & Manufacturing Systems Engineering / Zhijian Pei / Grinding is one of the important operations employed in modern manufacturing industry to remove material and achieve desired geometry and surface finish. Simultaneous double side grinding (SDSG) and ultrasonic vibration assisted grinding (UVAG) are two typical cost-effective grinding processes, utilized to grind semiconductor materials and high-performance ceramic materials, respectively. The objective of this research is to investigate several technical issues in modern grinding processes using theoretical, numerical, and experimental research approaches. These technical issues relate to SDSG and UVAG, which have been chosen as two representative grinding processes for this research. This thesis reviews the literature on SDSG (covering process applications, modeling of grinding marks, and modeling of wafer shapes) and UVAG (covering process applications, edge chipping, and coolant effects). The theoretical work of this thesis develops mathematical models for grinding marks and wafer shapes in SDSG of silicon wafers. These models are then used to study the effects of SDSG parameters on the curvature of the grinding marks, the distance between adjacent grinding marks, and the wafer shapes. The numerical work of this thesis is a three-dimensional (3-D) finite element analysis (FEA) of the UVAG process. A 3-D FEA model is developed to study the edge chipping commonly observed in UVAG of ceramics. Edge chipping not only compromises geometric accuracy but may also increase machining cost. A solution to reduce edge chipping is proposed based upon the FEA simulations and validated by pilot experiments. Several experimental studies are conducted to provide new knowledge for the UVAG process. Firstly, a novel coolant delivery system is explored for the UVAG machine system.
Secondly, UVAG is introduced into machining of fiber-reinforced ceramic matrix composites (CMC). Results of a feasibility study and a designed experimental investigation show that UVAG is a promising process for CMC machining. Finally, an experimental study on cutting forces during UVAG of zirconia/alumina composites is conducted. The feasibility to machine different zirconia/alumina composites using UVAG is also investigated and discussed. The findings in this thesis will provide theoretical and practical guidance for modern grinding processes especially for SDSG and UVAG.
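Grinding marks in SDSG arise from the composed rotations of wheel and wafer: a grit's path in the wafer frame is a cycloid-like curve whose shape (and hence mark curvature and spacing) depends on the speed ratio and axis offset. The parametrization below is a generic, hypothetical one for intuition only, not the thesis's model:

```python
import math

def grinding_mark(t, r_grit, omega_wheel, omega_wafer, offset):
    """Position, in the wafer frame, of a grit at radius r_grit on a wheel
    spinning at omega_wheel, whose axis is `offset` from the wafer axis,
    while the wafer spins at omega_wafer. Hypothetical parametrization."""
    # grit position in the fixed machine frame
    x = offset + r_grit * math.cos(omega_wheel * t)
    y = r_grit * math.sin(omega_wheel * t)
    # rotate into the wafer frame (wafer has turned by omega_wafer * t)
    a = -omega_wafer * t
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

# Sampling t traces one mark; varying the speed ratio changes its curvature.
print(grinding_mark(0.0, 1.0, 10.0, 1.0, 5.0))
```

Sweeping `omega_wheel / omega_wafer` and `offset` in such a parametrization is the qualitative analogue of the parameter studies the thesis performs with its actual SDSG models.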
109

Applying ergonomics to dental scalers

Ahern, Stacey January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Malgorzata J. Rys / The current state of the dental industry shows an increasing number of dentists and dental hygienists reducing hours or retiring early due to injuries sustained while working. These injuries, or cumulative trauma disorders, can be reduced by applying ergonomics to dental tool design. The goal of ergonomics is not only to reduce current injuries but also to prevent future ones. In addition, population demographics have shown an increasing proportion of female dentists; with this shift from a male-dominated field, design for different anthropometric measurements needs to be investigated. To pinpoint sources of pain, a survey was designed and distributed to dentists in Kansas, Missouri, and Texas. Even with a small sample size (n=24), results confirmed past studies in the dental industry of pain originating in the neck, shoulder, lower back, and wrist/hand region, stemming from the repetitive motions and forces applied during dental procedures. Responses also indicated that ergonomic principles need to be applied to the handle and grip portion of dental scaler design. Dental scaling is the procedure, most commonly performed by dental hygienists, of removing deposits such as plaque and calculus from teeth. First, the history of dental tools, angulation, tool weight, and the materials currently utilized were researched before looking into specific design factors for modification. Currently, the handle grip area on dental tools ranges in size, but a 10 mm grip has been shown to be optimal. The optimal tool weight has yet to be determined; 15 grams is the lowest weight tested to date. Most tools are made of stainless steel and resins, which are not compressible. An experiment was designed to test a new dental scaler (A) made of a titanium rod with added compressibility in the precision grip area.
The aim was to help reduce pressure on the fingers and hand muscles and increase comfort during scaling. The experiment utilized a Hu-Friedy sickle scaler (B) and a Practicon Montana Jack scaler (C) as controls representing two design spectrums, weight and material. The subjects (n=23) were taught the basics of scaling and required to scale using a typodont. The change in grip strength (Δ GS), the change in pinch strength (Δ PS), and the steadiness of the subjects' hands were tested. An absolute and relative rating technique was utilized, showing that the new dental scaler was preferred, with eigenvector weights (A=0.8615, B=0.1279, C=0.0106). Statistical analysis confirmed this tool preference and also found the interaction of gender and tool, and Δ GS for Tool A versus Tool B among males, to be significant.
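The eigenvector weights reported above are the kind produced by a pairwise-comparison (AHP-style) relative rating: the priority vector is the principal eigenvector of the comparison matrix. A sketch via power iteration; the comparison matrix below is invented for illustration and is not the study's data:

```python
def priority_vector(pairwise, iters=50):
    """Principal-eigenvector weights of a (reciprocal) pairwise-comparison
    matrix, computed by power iteration and normalized to sum to 1."""
    n = len(pairwise)
    v = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(pairwise[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        v = [x / s for x in v]
    return v

# Hypothetical judgments: tool A strongly preferred to B, B somewhat to C.
m = [[1,     7, 9],
     [1 / 7, 1, 3],
     [1 / 9, 1 / 3, 1]]
w = priority_vector(m)
print([round(x, 3) for x in w])
```

The resulting vector orders the tools by preference strength, analogous to the (A, B, C) eigenvector the study reports from its own comparisons.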
110

Generating an original Cutting-plane Algorithm in Three Sets

Harris, Andrew William January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd W. Easton / Integer programs (IPs) are a commonly researched class of problems used by governments and businesses to improve decision making through optimal resource allocation and scheduling. However, integer programs require an exponential amount of effort to solve, and in some instances a feasible solution is unknown even with the most powerful computers. There are several methods commonly used to reduce the solution time of IPs. One such approach is to generate new valid inequalities through lifting. Lifting strengthens a valid inequality by changing the coefficients of the variables in the inequality, and can result in facet defining inequalities, which are the theoretically strongest inequalities. This thesis introduces the Cutting-plane Algorithm in Three Sets (CATS), which can help reduce the solution time of integer programs. CATS uses synchronized simultaneous lifting to generate a new class of previously undiscovered valid inequalities. These inequalities are based upon three sets of indices from a binary knapsack integer program, a commonly studied integer program. CATS requires quartic effort times the number of inequalities generated. Some theoretical results describe easily verifiable conditions under which CATS inequalities are facet defining. A small computational study shows CATS obtains about an 8.9% runtime improvement over commercial IP software. CATS preprocessing is fast, requiring an average of approximately 0.032 seconds. With this exciting new class of inequalities produced quickly relative to the solution time, CATS is advantageous and should be implemented to reduce the solution time of many integer programs.
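For context on valid inequalities for the binary knapsack polytope: the classical single-set cover inequality is far simpler than the three-set synchronized lifting CATS performs, but it shows the basic idea of a cut derived from the knapsack structure. A hedged sketch (a textbook construction, not the CATS algorithm):

```python
def minimal_cover(weights, capacity):
    """Greedily build a cover C (items whose weights sum past the capacity),
    then shrink it to a minimal cover. The inequality
        sum(x_j for j in C) <= |C| - 1
    is then valid for the binary knapsack polytope: not all of C can be
    packed at once."""
    order = sorted(range(len(weights)), key=lambda j: -weights[j])
    cover, total = [], 0
    for j in order:
        cover.append(j)
        total += weights[j]
        if total > capacity:
            break
    # drop items not needed to preserve the cover property (minimality)
    for j in sorted(cover, key=lambda j: weights[j]):
        if total - weights[j] > capacity:
            cover.remove(j)
            total -= weights[j]
    return sorted(cover)

# Knapsack 8x0 + 6x1 + 5x2 + 3x3 <= 12: items {0, 1} already overflow,
# so x0 + x1 <= 1 is a valid cut.
print(minimal_cover([8, 6, 5, 3], 12))  # → [0, 1]
```

Lifting then strengthens such a cut by computing the largest valid coefficients for the variables outside the cover; CATS generalizes this by lifting over three index sets simultaneously.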
