201

The optimal exercising problem from American options: a comparison of solution methods

DeHaven, Sara January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Chih-Hang Wu / Rapid advances in computer technology in recent years have made it possible to use simulation to estimate stock/equity performance and pricing; however, determining the optimal exercise time and price of American options using Monte-Carlo simulation is still a computationally challenging task due to its memory and computational complexity requirements. At each time step, the investor must decide whether to exercise the option for the immediate payoff or hold on to the option until a later time. Traditionally, stock options are simulated using Monte-Carlo methods, all stock prices along each path are stored, and the optimal exercise time is then determined starting at the final time period and working backward in time. Moreover, as the number of simulated paths increases, the number of simultaneous equations that must be solved at each time step grows proportionally. Two theoretical methods have emerged for the optimal exercise problem. The first uses a least-squares linear regression approach to estimate the value of continuing to hold the option from a set of randomly generated future stock prices. The value of continuing can then be compared with the payoff from exercising the option at the current time, and whichever choice gives the investor the higher value can be selected. The second method uses a finite difference approach to establish an exercise boundary for the American option on an artificially generated mesh over possible stock prices and decision times; the stock price is then simulated and checked against the exercise boundary. In this research, these two solution approaches are evaluated and compared using discrete event simulation, which allows complex methods to be simulated with minimal coding effort.
Finally, the results from each method are compared. Although neither method can be judged more conservative, the least-squares method is faster, more concise, easier to implement, and requires less memory than the mesh method. The motivation for this research stems from an interest in simulating and evaluating complicated solution methods to the optimal exercise problem while requiring little programming effort to produce accurate and efficient estimates.
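The first (least-squares) approach described above can be sketched in the Longstaff-Schwartz style of least-squares Monte-Carlo. The following is a minimal illustration, not the thesis's own implementation; the option parameters (S0, K, r, sigma, T) and the quadratic regression basis are assumptions chosen only for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the thesis): an American put on a
# non-dividend-paying stock, valued by least-squares Monte-Carlo.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
steps, paths = 50, 20_000
dt = T / steps
disc = np.exp(-r * dt)

# Simulate geometric Brownian motion price paths.
z = rng.standard_normal((paths, steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1))

payoff = np.maximum(K - S[:, -1], 0.0)       # value if held to expiry
for t in range(steps - 2, -1, -1):
    payoff *= disc                           # discount one step back
    itm = S[:, t] < K                        # regress only in-the-money paths
    if itm.sum() >= 3:
        # Least-squares estimate of the continuation value using a
        # quadratic polynomial in the current stock price.
        coeff = np.polyfit(S[itm, t], payoff[itm], 2)
        cont = np.polyval(coeff, S[itm, t])
        exercise = K - S[itm, t]
        ex_now = exercise > cont             # exercise beats continuing?
        idx = np.where(itm)[0][ex_now]
        payoff[idx] = exercise[ex_now]

price = disc * payoff.mean()                 # discount the first step to time 0
print(f"LSM American put price: {price:.2f}")
```

Restricting the regression to in-the-money paths keeps the continuation-value estimate concentrated where the exercise decision actually matters, which is the usual practice for this method.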
202

Experimental and numerical investigation of laser assisted milling of silicon nitride ceramics

Yang, Budong January 1900 (has links)
Doctor of Philosophy / Department of Industrial & Manufacturing Systems Engineering / Shuting Lei / This study experimentally and numerically investigates laser assisted milling (LAMill) of silicon nitride ceramics. Experiments are conducted to study the machinability of Si3N4 under LAMill. The effects of temperature on cutting forces, tool wear, surface integrity, edge chipping and material removal mechanisms are investigated. It is shown that as temperature increases, cutting force and tool wear decrease significantly, surface integrity improves, chip size increases and material removal demonstrates more plastic characteristics. The mechanisms of edge chipping at elevated temperature are investigated theoretically and experimentally. When the temperature is above the softening point but below the brittle/ductile transition temperature, the mechanism is mainly softening; above the brittle/ductile transition temperature, a toughening mechanism contributes significantly to the reduced edge chipping. The coupled effect of the softening and toughening mechanisms indicates that the temperature range of 1200 to 1400°C is most effective in reducing edge chipping. The distinct element method (DEM) is applied to simulate the micro-mechanical behavior of Si3N4. First, quantitative relationships between particle-level parameters and macro-properties of the bonded particle specimens are obtained, which builds a foundation for simulating Si3N4. Then, extensive DEM simulations are conducted to model material removal in machining Si3N4. The simulation results demonstrate that DEM can reproduce the conceptual material removal model summarized from experimental observations, including the initiation and propagation of cracks, the chip formation process and the material removal mechanisms. It is shown that material removal in machining of silicon nitride is realized mainly by the propagation of lateral cracks.
At the elevated temperatures of laser assisted machining, lateral cracks propagate more easily to form larger machined chips; there are fewer and smaller median cracks, and therefore less surface/subsurface damage; and crushing-type material removal is reduced. Material removal at elevated temperature demonstrates more plastic characteristics. The numerical results agree very well with the experimental observations, showing that DEM is a promising method for modeling the micro-mechanical process of machining Si3N4.
203

Rotary ultrasonic machining of hard-to-machine materials

Churi, Nikhil January 1900 (has links)
Doctor of Philosophy / Department of Industrial & Manufacturing Systems Engineering / Zhijian Pei / Titanium alloy is one of the most important materials used in major industry segments such as aerospace, automotive, sporting goods, medical and chemical. Market surveys indicate that titanium shipments in the USA have increased significantly in the last two decades, reflecting its increased usage. Industries are under tremendous pressure to meet the ever-increasing demand to lower cost and improve quality of the products manufactured from titanium alloy. Like titanium alloys, silicon carbide and dental ceramics are two important materials used in many applications. Rotary ultrasonic machining (RUM) is a non-traditional machining process that combines the material removal mechanisms of diamond grinding and ultrasonic machining. It comprises a tool mounted on a rotary spindle attached to a piezo-electric transducer that produces the rotary and ultrasonic motion. No study has been reported on RUM of titanium alloy, silicon carbide or dental ceramics. The goal of this research was to provide new knowledge of machining these hard-to-machine materials with RUM for further improvements in machining cost and surface quality. Thorough research has been conducted on the feasibility of RUM of titanium alloy, the effects of tool and machining variables, and the wheel wear mechanisms. The effects of machining variables (such as spindle speed, feed rate and ultrasonic vibration power) and tool variables (grit size, diamond grain concentration, bond type) on the output variables (such as cutting force, material removal rate, surface roughness and chipping size) and on the wheel wear mechanisms have been studied for titanium alloy. A feasibility study of machining silicon carbide and dental ceramics is also conducted, along with a designed experimental study.
204

Modeling the effect of resident learning curve in the emergency department

Richards, Robert Michael January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Chih-Hang John Wu / The University of Kansas Medical Center’s Emergency Department is adopting a new residency program. In the past, generalized residents supported attending physicians during a required three-month rotation in the Emergency Department. As of July 2010, the University of Kansas Medical Center’s Emergency Department has switched to a dedicated Emergency Medicine residency program that allows recently graduated physicians the opportunity to enter the field of Emergency Medicine. This thesis shows that, although not immediately, a dedicated residency program provides an advantage to the Emergency Department. Discrete event simulations have been used to predict changes in processes, policies and practices in many different fields. The models run quickly and can provide a basis for future actions without the cost of actually implementing changes in policies or procedures. This thesis applies a learning curve in a simulation model in order to provide data that the University of Kansas Medical Center’s Emergency Department can use to make decisions about its new residency program. A generalized learning curve was used for the base model and compared with all alternatives. When it was compared with an alternative curve following a sigmoid (logistic) function, there were no significant differences. Ultimately, a Gompertz curve is suggested for hospitals attempting to develop or improve their residency programs using learning curves, because it is easily fitted to the desired shape. This thesis shows the effect that residents have on the performance of the Emergency Department as a whole. The two major components examined for the generalized learning curve were the initial position of first-year residents, determined by the variable α, and the shape of the curve, determined by the variable β. Individual changes to the value of α had little effect.
Varying β showed that smaller values elongate the curve, prolonging the time it takes a resident to perform at the level of the attending physician. Each resident’s personal value of β can be used to evaluate performance in the emergency department; residents whose β value is smaller than the emergency department’s expected value may have trouble performing.
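The Gompertz and logistic curves discussed above can be sketched as follows. The parameter values here are hypothetical, chosen only to show the shapes and how a smaller growth parameter elongates the curve; they are not fitted to any hospital's data:

```python
import math

def gompertz(t, alpha, beta, plateau=1.0):
    """Gompertz learning curve: performance as a fraction of an attending
    physician's level. alpha sets the initial displacement and beta the
    growth rate (a smaller beta means a slower rise to the plateau)."""
    return plateau * math.exp(-alpha * math.exp(-beta * t))

def logistic(t, k, t0, plateau=1.0):
    """Logistic (sigmoid) learning curve, for comparison."""
    return plateau / (1.0 + math.exp(-k * (t - t0)))

# Hypothetical horizon: months of residency.
for month in (0, 6, 12, 24, 36):
    g = gompertz(month, alpha=3.0, beta=0.15)
    s = logistic(month, k=0.25, t0=12.0)
    print(f"month {month:2d}: gompertz={g:.2f} logistic={s:.2f}")
```

Both curves rise toward the same plateau; the Gompertz form is asymmetric, which is one reason it can be easier to fit to a desired shape.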
205

Impact of decentralized decision making on access to cholera treatment in Haiti

Moore, Brian D. January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Jessica L. Heier Stamm / In many humanitarian and public health settings, multiple organizations act independently to locate facilities to serve an affected population. As a result of this decentralized decision-making environment, individuals’ access to facility resources may suffer in comparison to a hypothetical system in which a single planner locates the facilities to optimize access for all. Furthermore, due to the unanticipated nature of humanitarian events and the urgency of the need, responders often must cope with a high level of uncertainty regarding the future supply of resources and demand for relief. The contributions of this thesis address the challenges that arise due to the decentralized and dynamic nature of humanitarian response. The first goal of this research is to quantify the difference between decentralized system performance and that possible with a centralized planner. The second goal is to demonstrate the value and feasibility of using a dynamic, rolling-horizon framework to optimize facility location decisions over time. This work compares individuals’ access to health facilities resulting from location decisions made by decentralized decision-makers to the access achieved by a centralized model that optimizes access for all. Access is measured using a special case of the gravity model, the Enhanced Two-Step Floating Catchment Area (E2SFCA) method, which is a distance-weighted ratio of capacity to demand. The E2SFCA method is integrated with integer programming to optimize public access to health facilities. This method is applied to the location of cholera treatment facilities in Haiti, which has been afflicted with a cholera epidemic since October 2010. 
This research finds that access varied significantly across Haiti: in February 2011, 37 of the 570 sections, representing 474,286 persons (4.8 percent of the population), did not have adequate access to cholera treatment facilities. Centralized models that optimize accessibility can improve performance, but no single model is dominant. This paper recommends using an efficiency-oriented model in conjunction with an equity constraint to make facility location decisions in future responses. Finally, this work integrates measures of access and equity into a rolling-horizon facility location model and demonstrates that these measures can be incorporated in a full-scale implementation to provide dynamic decision support to planners. This paper advocates greater awareness of the impact of decentralization in humanitarian response and recommends future work to discover incentives and strategies that mitigate that impact.
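The two steps of the E2SFCA access measure described above can be sketched with toy data. The facility capacities, section populations, travel times and three-zone decay weights below are invented for illustration and are not taken from the thesis:

```python
# Hypothetical toy data: 3 treatment facilities, 4 population sections,
# travel times in minutes (travel[i][j]: section i -> facility j).
capacity = [50, 30, 20]                      # treatment capacity per facility
population = [10_000, 8_000, 5_000, 2_000]   # persons per section
travel = [
    [10, 45, 80],
    [30, 15, 60],
    [70, 25, 20],
    [90, 75, 35],
]

def decay(t):
    """Stepwise distance-decay weights for three travel-time zones, as in
    the enhanced two-step floating catchment area (E2SFCA) method."""
    if t <= 30: return 1.0
    if t <= 60: return 0.42
    if t <= 90: return 0.09
    return 0.0

# Step 1: distance-weighted capacity-to-demand ratio for each facility.
ratio = []
for j, cap in enumerate(capacity):
    demand = sum(population[i] * decay(travel[i][j])
                 for i in range(len(population)))
    ratio.append(cap / demand if demand else 0.0)

# Step 2: each section's accessibility sums the reachable facility ratios.
access = [sum(ratio[j] * decay(travel[i][j]) for j in range(len(capacity)))
          for i in range(len(population))]
print([round(a * 1000, 3) for a in access])   # capacity per 1,000 persons
```

A section near a high-capacity, lightly loaded facility scores high; a remote section reachable only through the outermost zone scores low, which is the "inadequate access" condition the thesis counts.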
206

Axiomatized Relationships between Ontologies

Chui, Carmen 21 November 2013 (has links)
This work focuses on the axiomatized relationships between different ontologies of varying levels of expressivity. Motivated by experiences in the decomposition of first-order logic ontologies, we partially decompose the Descriptive Ontology for Linguistic and Cognitive Engineering (DOLCE) into modules. By leveraging automated reasoning tools to semi-automatically verify the modules, we provide an account of the meta-theoretic relationships found between DOLCE and other existing ontologies. As well, we examine the composition process required to determine relationships between DOLCE modules and the Process Specification Language (PSL) ontology. Then, we propose an ontology based on the semantically-weak Computer Integrated Manufacturing Open System Architecture (CIMOSA) framework by augmenting its constructs with terminology found in PSL. Finally, we attempt to map two semantically-weak product ontologies together to analyze the applications of ontology mappings in e-commerce.
208

Optimal Control and Estimation of Stochastic Systems with Costly Partial Information

Kim, Michael J. 31 August 2012 (has links)
Stochastic control problems that arise in sequential decision making applications typically assume that information used for decision-making is obtained according to a predetermined sampling schedule. In many real applications however, there is a high sampling cost associated with collecting such data. It is therefore of equal importance to determine when information should be collected as it is to decide how this information should be utilized for optimal decision-making. This type of joint optimization has been a long-standing problem in the operations research literature, and very few results regarding the structure of the optimal sampling and control policy have been published. In this thesis, the joint optimization of sampling and control is studied in the context of maintenance optimization. New theoretical results characterizing the structure of the optimal policy are established, which have practical interpretation and give new insight into the value of condition-based maintenance programs in life-cycle asset management. Applications in other areas such as healthcare decision-making and statistical process control are discussed. Statistical parameter estimation results are also developed with illustrative real-world numerical examples.
209

Evaluating Customer Service Representative Staff Allocation and Meeting Customer Satisfaction Benchmarks: DEA Bank Branch Analysis

Min, Elizabeth Jeeyoung 14 December 2011 (has links)
This research employs a non-parametric fractional linear programming method, Data Envelopment Analysis (DEA), to examine the Customer Service Representative (CSR) resource allocation efficiency of a major Canadian bank. Two DEA models are proposed: (1) to evaluate the Bank’s national branch network in the context of employment only, by minimizing Full Time Equivalent (FTE) staff while maximizing over-the-counter (OTC) transaction volume; and (2) to evaluate the efficacy of the Bank’s own model in meeting desired customer satisfaction benchmarks, by maximizing the fraction of transactions completed under management’s target time. Non-controllable constant-returns-to-scale and variable-returns-to-scale model results are presented and further broken down by branch size segment and geographical region for analysis. A comparison between the DEA results and the Bank’s performance ratios and benchmarks validates the use of the proposed DEA models for resource allocation efficiency analysis in the banking industry.
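A generic input-oriented, constant-returns-to-scale (CCR) envelopment program of the kind underlying such a DEA study can be sketched as follows. The branch data (one FTE input, one OTC-volume output) are hypothetical, and this is the textbook formulation rather than either of the thesis's two models:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical branch data: one input (FTE staff) and one output
# (OTC transaction volume, thousands per month).
fte = np.array([8.0, 12.0, 6.0, 10.0])
otc = np.array([40.0, 54.0, 33.0, 36.0])
n = len(fte)

def ccr_efficiency(o):
    """Input-oriented CRS (CCR) efficiency of branch o via the envelopment LP:
        minimize theta
        s.t.  sum_j lam_j * fte_j <= theta * fte_o   (input can be contracted)
              sum_j lam_j * otc_j >= otc_o           (output is maintained)
              lam_j >= 0
    Decision vector: [theta, lam_1, ..., lam_n]."""
    c = np.zeros(n + 1)
    c[0] = 1.0                                        # minimize theta
    a_in = np.concatenate(([-fte[o]], fte))           # fte.lam - theta*fte_o <= 0
    a_out = np.concatenate(([0.0], -otc))             # -otc.lam <= -otc_o
    res = linprog(c, A_ub=np.vstack([a_in, a_out]), b_ub=[0.0, -otc[o]],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

scores = [ccr_efficiency(o) for o in range(n)]
for o, s in enumerate(scores):
    print(f"branch {o}: efficiency = {s:.3f}")
```

With a single input and output, the efficient branch is simply the one with the best OTC-per-FTE ratio; the LP formulation generalizes directly to the multi-input, multi-output case a real branch study needs.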
210

The Use of Simulation Methods to Understand and Control Pandemic Influenza

Beeler, Michael 20 November 2012 (has links)
This thesis investigates several uses of simulation methods to understand and control pandemic influenza in urban settings. An agent-based simulation, which models pandemic spread in a large metropolitan area, is used for two main purposes: to identify the shape of the distribution of pandemic outcomes, and to test for the presence of complex relationships between public health policy responses and underlying pandemic characteristics. The usefulness of pandemic simulation as a tool for assessing the cost-effectiveness of vaccination programs is critically evaluated through a rigorous comparison of three recent H1N1 vaccine cost-effectiveness studies. The potential for simulation methods to improve vaccine deployment is then demonstrated through a discrete-event simulation study of a mass immunization clinic.
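A far simpler stand-in for the agent-based model described above is a stochastic chain-binomial SIR simulation; replicating it many times shows how a distribution of outbreak sizes arises. All parameters here (population, transmission and recovery rates) are invented for illustration:

```python
import random

random.seed(1)

def run_sir(n=500, beta=0.3, gamma=0.1, i0=5, days=150):
    """One replication of a discrete-time stochastic SIR epidemic.
    Each day, every susceptible is infected with probability
    1 - (1 - beta/n)**i, and every infective recovers w.p. gamma.
    Returns the total number ever infected."""
    s, i, r = n - i0, i0, 0
    for _ in range(days):
        if i == 0:
            break
        p_inf = 1 - (1 - beta / n) ** i
        new_i = sum(random.random() < p_inf for _ in range(s))
        new_r = sum(random.random() < gamma for _ in range(i))
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
    return r + i

# Replications give the distribution of outcomes, not a single forecast.
sizes = sorted(run_sir() for _ in range(30))
print(f"median outbreak size: {sizes[len(sizes) // 2]}, "
      f"range: {sizes[0]}-{sizes[-1]}")
```

Even this toy model exhibits the key point about pandemic outcome distributions: with the same parameters, chance alone separates early die-outs from large outbreaks.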
