61

Size reduction of cellulosic biomass for biofuel manufacturing

Zhang, Meng January 1900 (has links)
Doctor of Philosophy / Department of Industrial & Manufacturing Systems Engineering / Zhijian Pei and Donghai Wang / Currently, transportation is almost entirely dependent on petroleum-based fuels (e.g. gasoline, diesel fuel, and jet fuel). Increasing demands for sustainable sources of liquid transportation fuels make it imperative to develop alternatives to petroleum-based fuels. Biofuels derived from cellulosic biomass (forest and agricultural residues and dedicated energy crops) have been recognized as promising alternatives to petroleum-based liquid fuels. Cellulosic biofuels not only reduce the nation’s dependence on foreign petroleum but also improve the environment through reduction of greenhouse gas emissions. In order to convert cellulosic biomass into biofuels, the biomass must first go through a size reduction step, because large-size cellulosic biomass (whole stems of herbaceous biomass or chunks of woody biomass) cannot be converted to biofuels efficiently with current conversion technologies. Native cellulosic biomass has limited accessibility to enzymes due to its structural complexity. Size reduction can reduce particle size and disrupt cellulose crystallinity, rendering the substrate more amenable to enzymatic hydrolysis. The purpose of this research is to provide knowledge of how size reduction alters biomass structural features, and to understand the relationships between these structural features and enzymatic hydrolysis sugar yield. This research also aims to investigate the impacts of process parameters in biomass size reduction on the conversion of cellulosic biomass to biofuels, to help realize cost-effective manufacturing of cellulosic biofuels. This dissertation consists of twelve chapters. First, an introduction to this research is given in Chapter 1. Second, Chapter 2 presents a literature review on cellulosic biomass size reduction. Third, a preliminary experimental study is included in Chapter 3. 
Chapters 4 to 6 present a three-phase study on confounding effects of two important biomass structural features: particle size and biomass crystallinity. Chapters 7 and 8 investigate effects of sieve size used in size reduction of woody and herbaceous biomass, respectively. Chapters 9 and 10 focus on the relationship between particle size and sugar yield. Chapter 11 studies effects of cutting orientation in size reduction of woody biomass. Finally, conclusions and contributions are given in Chapter 12.
62

A review of KDOT overhead guide sign lighting policy

Gund, Adity January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Malgorzata J. Rys / Before 2012, US states must implement, and continue using, an assessment or management method to maintain traffic signs’ retro-reflectivity at or above the minimum levels adopted by the FHWA. Overhead guide signs are a particular problem because past research has shown that, with available sheeting types, external illumination may still be necessary. Newer sheeting types and more energy-efficient, cost-effective lighting systems are currently available. Over the next several years there is potential for savings of tens of thousands of dollars if states do not have to use external illumination for their overhead guide signs, or if more energy-efficient, low-cost lighting systems are used when necessary. A detailed literature review was conducted on minimum retro-reflectivity values for overhead guide signs, the legibility distance under reduced sign luminance, and the lighting systems available for external illumination. No agreement was found in previous research studies as to the optimum or minimum luminance of guide signs. In addition, a survey was sent to all state DOTs. The survey focused on finding states’ policies regarding the use of sheeting types and external illumination requirements, and the methods and lighting systems used for maintaining minimum overhead guide sign luminance values. The total response rate was 56% (28 out of 50 state DOTs responded). Of the 28 respondents, 19 said that they have a usage policy for the type of sheeting material used for overhead guide signs, and 18 of them provided details about their policy. Equipment to control the illumination levels from vehicle headlamps was built, and pilot tests were conducted in the laboratory. 
A pulse-width-modulation dimming circuit designed around an Atmel ARM-based microcontroller board and power MOSFETs was used to control the brightness of the headlights for the high and low beams. Field tests were conducted in the later stages during nighttime to evaluate three different types of sheeting materials using 10 human subjects from the 18-34 age group. The DG3 and Type IV materials were found to perform better than the Type I material for nighttime visibility, but they were not significantly different from each other.
63

Statistically monitoring inventory accuracy in large warehouse and retail environments

Huschka, Andrew January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / John English / This research builds upon previous efforts to explore the use of Statistical Process Control (SPC) in lieu of cycle counting. Specifically, a three-pronged effort is developed. First, in the work of Huschka (2009) and Miller (2008), a mixture distribution is proposed to model the complexities of multiple Stock Keeping Units (SKUs) within an operating department. We have gained access to a data set from a large retailer and have analyzed the data in an effort to validate the core models. Second, we develop a recursive relationship that enables large samples of SKUs to be evaluated appropriately with the SPC approach. Finally, we present a comprehensive set of type I and type II error rates for the SPC approach to inventory accuracy monitoring.
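As a rough illustration of the kind of mixture model this abstract describes, the sketch below treats each SKU's inventory adjustment as a zero-inflated normal (accurate records contribute an adjustment of exactly zero) and derives 3-sigma limits for the mean adjustment of a counted sample. All function names and parameter values are hypothetical, not taken from Huschka (2009) or Miller (2008).

```python
import math
import random

def mixture_stats(p_error, mu_err, sigma_err):
    """Mean and variance of a per-SKU inventory adjustment modelled as a
    zero-inflated normal: with probability 1 - p_error the record is
    accurate (adjustment = 0); otherwise the adjustment is
    N(mu_err, sigma_err^2). Illustrative stand-in for the mixture model."""
    mean = p_error * mu_err
    # E[X^2] = p * (sigma^2 + mu^2); Var[X] = E[X^2] - mean^2
    var = p_error * (sigma_err ** 2 + mu_err ** 2) - mean ** 2
    return mean, var

def control_limits(p_error, mu_err, sigma_err, n, z=3.0):
    """z-sigma limits for the mean adjustment of a counted sample of n SKUs."""
    mean, var = mixture_stats(p_error, mu_err, sigma_err)
    half = z * math.sqrt(var / n)
    return mean - half, mean + half

def type_i_error(p_error, mu_err, sigma_err, n, trials=20000, seed=1):
    """Estimate the false-alarm (type I) rate by simulating in-control samples."""
    lcl, ucl = control_limits(p_error, mu_err, sigma_err, n)
    rng = random.Random(seed)
    alarms = 0
    for _ in range(trials):
        xbar = sum(
            rng.gauss(mu_err, sigma_err) if rng.random() < p_error else 0.0
            for _ in range(n)
        ) / n
        alarms += not (lcl <= xbar <= ucl)
    return alarms / trials
```

Type II rates could be estimated the same way by simulating samples drawn from a shifted mixture and counting the runs that stay inside the limits.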
64

Real-time detection of wave profile changes

Tavakkol, Behnam January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Shing I. Chang / This research studies several methodologies for real-time detection of wave profile changes. In regular profile monitoring, change detection takes place at the end of the time period, when a complete profile is available. In real-time change detection of profiles, a potential profile change takes place between the beginning and the end of the time period. The decision involves identifying whether a process is in control or out of control before the entire profile is generated. In this regard, five proposed methodologies were developed and tested in this thesis. Earthquake waves, manufacturing processes, and heart beat rates are a few examples of profiles of different natures to which the proposed methodologies can be applied. Water temperature profiles generated during a curing process are considered as an example in this study. Successful implementation of the proposed work on these profiles would save great amounts of time and money. Five methods are studied for monitoring the water temperature control process of a curing process. The first four proposed methodologies are based on a univariate approach, where the statistic used for process monitoring is the enclosed area between the profiles and their fitted cutting lines. A multivariate approach is also proposed. A simulation study is also conducted, in which the best method is chosen based on its performance and simplicity of operation. Various types of acceptable and unacceptable profiles are simulated for the best proposed method identified in the preliminary study. The best method has a satisfactory performance in detecting the changes in the unacceptable profiles. In addition, the false alarm rate in identifying acceptable profiles as bad profiles is lower than 10%.
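The univariate enclosed-area idea can be sketched as follows: fit a line to the partial profile observed so far, integrate the absolute deviation between profile and line, and signal a change as soon as that area crosses a threshold. This is an illustrative reconstruction under assumed choices (least-squares line fit, trapezoidal integration, a fixed threshold), not the thesis's exact procedure.

```python
def fitted_line(xs, ys):
    """Least-squares slope and intercept (the fitted 'cutting line')."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b  # intercept, slope

def enclosed_area(xs, ys):
    """Trapezoidal integral of |profile - fitted line| over the observed span."""
    a, b = fitted_line(xs, ys)
    resid = [abs(y - (a + b * x)) for x, y in zip(xs, ys)]
    return sum(
        0.5 * (resid[i] + resid[i + 1]) * (xs[i + 1] - xs[i])
        for i in range(len(xs) - 1)
    )

def monitor(xs, ys, threshold):
    """Re-evaluate the enclosed area each time a new point arrives;
    return the index at which a change is signalled, or None."""
    for k in range(3, len(xs) + 1):
        if enclosed_area(xs[:k], ys[:k]) > threshold:
            return k
    return None
```

On an in-control (near-linear) profile the area stays near zero, so the monitor stays quiet; a level shift partway through the profile inflates the residuals and triggers a signal before the profile is complete.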
65

Supporting Communication Between Nurses and Physicians

Vogwill, Vanessa 26 February 2009 (has links)
Nurses and physicians in General Internal Medicine (GIM) work in a complex environment where patients present with complex (co)morbidities; management of such patients requires input from a full range of medical disciplines. In addition, there is regular resident physician changeover every 8-9 weeks in this teaching environment, and patient “flow” problems caused by overcrowding and placement issues. This complexity causes difficulties in the information exchange between nurses and physicians necessary to manage patient care. Multidisciplinary team meetings have been suggested as helpful to interprofessional communication, and in General Internal Medicine these take place in the form of daily “Bullet Rounds”. More recently the use of process engineering approaches has been suggested as a way to increase efficiency in healthcare; this dissertation evaluates their impact on communication between nurses and physicians. The initial observational field study showed that information exchange was the main focus of dialogue in Bullet Rounds, and identified information gaps between nurses and physicians. Script Theory (Schank and Abelson 1977) was used to propose that information gaps in Bullet Rounds are caused by different knowledge and goals, which result in inconsistent scripts. A process engineering intervention took place in General Internal Medicine. Process engineering methods have been proposed as being helpful in process design and improvement in healthcare but have not been systematically evaluated. The researcher conducted a pre- and post-intervention study of Bullet Rounds in order to identify and analyse the impacts of a process engineering intervention on information exchange between nurses and physicians. The results showed that information loss decreased after the intervention but that resident physicians were not satisfied with the nurse-physician information exchange. 
Staff and resident physicians appeared to have distinct information needs and perspectives. The nurses felt that the Bullet Rounds process had improved but needed revisiting, and they were not aligned with staff physicians on their respective roles and responsibilities. The overall results suggest that even after the process engineering intervention, there was still misalignment of goals and scripts between the two groups of physicians and between the physicians and nurses; strategies for addressing these gaps are proposed.
66

A Stochastic Inventory Model with Price Quotation

Liu, Jun 24 September 2009 (has links)
This thesis studies a single-item periodic review inventory problem with stochastic demand, random price, and quotation cost. It differs from the traditional inventory model in that, at the beginning of each period, a decision is made whether to pay the quotation cost to get the price information. If it is decided to request a price quote, then the next decision is how many units to order; otherwise, there will be no order. An (r, S1, S2) policy with r < S2, S1 <= S2 is proposed for the problem with two prices. It prescribes that when the inventory is less than or equal to r, a price quotation is requested; if the higher price is quoted, then order up to S1, otherwise up to S2. There are two cases, r < S1 or S1 <= r. In the first case, every time the price is quoted, an order is placed. It is a single-reorder-point, two-order-up-to-levels policy that can be considered an extension of the (s, S) policy. In the second case, S1 <= r, it is possible to “request a quote but not buy” if the quoted price is not favorable when the inventory is between S1 and r. Two total cost functions are derived for the cases r < S1 <= S2 and S1 <= r < S2, respectively. Optimization algorithms are then devised based on the properties of the cost functions and tested in a numerical study. The algorithms successfully find the optimal policies in all of the 135 test cases. Compared to exhaustive search, the running time of the optimization algorithm is reduced significantly. The numerical study shows that the optimal (r, S1, S2) policy can save up to 50% by ordering up to different levels for different prices, compared to the optimal (s, S) policy. It also reveals that in some cases it is optimal to search for the price speculatively, that is, with S1 < r: to request a quote but place an order only when the lower price is realized and the inventory level is between S1 and r.
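The single-period decision rule of the (r, S1, S2) policy can be sketched directly from the description above. The hypothetical `quote_price_fn` stands in for paying the quotation cost and observing the period's price; note how the S1 <= r case allows a "quote but not buy" outcome.

```python
def decide(inventory, r, s1, s2, quote_price_fn, high_price):
    """One period of the (r, S1, S2) policy (r < S2, S1 <= S2).
    Returns (quoted, order_qty).

    If inventory > r: do not pay the quotation cost, do not order.
    Otherwise: pay the quotation cost, observe the price, and order up
    to S1 if the high price is quoted, up to S2 if the low price is."""
    if inventory > r:
        return False, 0
    price = quote_price_fn()            # pay quotation cost, learn the price
    target = s1 if price >= high_price else s2
    qty = max(0, target - inventory)    # may be 0: "request a quote but not buy"
    return True, qty
```

With S1 <= r, an inventory level between S1 and r combined with a high quoted price yields a zero order quantity, which is exactly the speculative "quote but not buy" behaviour the numerical study found to be optimal in some cases.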
67

Advances in Portfolio Selection Under Discrete Choice Constraints: A Mixed-integer Programming Approach and Heuristics

Stoyan, Stephen J. 03 March 2010 (has links)
Over the last year or so, we have witnessed the global effects and repercussions related to the field of finance. Supposed blue-chip stocks and well-established companies have folded and filed for bankruptcy, an event that might have been thought absurd two years ago. In addition, finance and investment science has grown over the past few decades to include a plethora of investment options and regulations. Now more than ever, developments in the field are carefully examined and researched by potential investors. This thesis involves an investigation and quantitative analysis of key money management problems. The primary area of interest is Portfolio Selection, where we develop advanced financial models that are designed for investment problems of the 21st century. Portfolio selection is the process involved in making large investment decisions to generate a collection of assets. Over the years the selection process has evolved dramatically. Current portfolio problems involve a complex, yet realistic set of managing constraints that are coupled to general historic risk and return models. We identify three well-known portfolio problems and add an array of practical managing constraints that form three different types of Mixed-Integer Programs. The product is a set of advanced mathematical models related to risk-return portfolios, index-tracking portfolios, and an integrated stock-bond portfolio selection model. The numerous sources of uncertainty are captured in a Stochastic Programming framework, and Goal Programming techniques are used to facilitate various portfolio goals. The designs require the consideration of modelling elements and variables with respect to problem solvability. We minimize trade-offs between modelling and solvability issues found in the literature by developing problem-specific algorithms. The algorithms are tailored to each portfolio design and involve decompositions and heuristics that improve solution speed and quality. 
The result is the generation of portfolios that have intriguing financial outcomes and perform well with respect to the market. Portfolio selection is as dynamic and complex as the recent economic situation. In this thesis we present and further develop the mathematical concepts related to portfolio construction. We investigate the key financial problems mentioned above, and through quantitative financial modelling and computational implementations we introduce current approaches and advancements in the field of Portfolio Optimization.
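As a toy stand-in for the mixed-integer programs described above (which the thesis solves with problem-specific decompositions and heuristics), the sketch below brute-forces a cardinality-constrained, equal-weight risk-return trade-off over a small asset universe. The discrete choice of which k assets to hold is the element that makes the real problem a Mixed-Integer Program; the independence assumption on asset risks and all numbers here are illustrative.

```python
from itertools import combinations

def best_portfolio(returns, risks, k, lam):
    """Pick exactly k of n assets, equal weights, maximizing
    expected return minus lam * portfolio variance. Assumes independent
    assets, so portfolio variance = sum of w^2 * var_i. A brute-force
    sketch of the discrete-choice portfolio problem, feasible only
    for tiny universes."""
    n = len(returns)
    best, best_val = None, float("-inf")
    for subset in combinations(range(n), k):
        w = 1.0 / k
        ret = sum(returns[i] * w for i in subset)
        risk = sum((w ** 2) * risks[i] for i in subset)
        val = ret - lam * risk
        if val > best_val:
            best, best_val = subset, val
    return best, best_val
```

The number of subsets grows combinatorially, which is precisely why realistic instances need MIP solvers or the decomposition heuristics the thesis develops rather than enumeration.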
68

An Integrated Two-stage Innovation Planning Model with Market Segmented Learning and Network Dynamics

Ferreira, Kevin D. 28 February 2013 (has links)
Innovation diffusion models have been studied extensively to forecast and explain the adoption process for new products or services. These models are often formulated using one of two approaches. The first, and most common, is a macro-level approach that aggregates much of the market behaviour. An advantage of this method is that forecasts and other analyses may be performed while estimating only a few parameters. The second is a micro-level approach that aims to utilize microeconomic information pertaining to the potential market and the innovation. The advantage of this methodology is that analyses allow for a direct understanding of how potential customers view the innovation. Nevertheless, when individuals are making adoption decisions, the process in reality consists of at least two stages: first, a potential adopter must become aware of the innovation; and second, the aware individual must decide to adopt. Researchers have studied multi-stage diffusion processes in the past; however, a majority of these works employ a macro-level approach to model market flows. As a result, a direct understanding of how individuals value the innovation is lacking, making it impossible to utilize this information to model realistic word-of-mouth behaviour and other network dynamics. Thus, we propose a two-stage integrated model that utilizes the benefits of both the macro- and micro-level approaches. In the first stage, potential customers become aware of the innovation, which requires no decision making by the individual. As a result, we employ a macro-level diffusion process to describe the first stage. In the second stage, however, potential customers decide whether or not to adopt the innovation, and we utilize a micro-level methodology to model this. 
We further extend the application to include forward looking behaviour, heterogeneous adopters and segmented Bayesian learning, and utilize the adopter's satisfaction levels to describe biasing and word-of-mouth behaviour. We apply the proposed model to Canadian colour-TV data, and cross-validation results suggest that the new model has excellent predictive capabilities. We also apply the two-stage model to early U.S. hybrid-electric vehicle data and results provide insightful managerial observations.
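A minimal numerical sketch of the two-stage idea, assuming a Bass-type macro process for awareness and, as a drastic simplification of the micro stage described above, a constant per-period adoption probability for aware non-adopters (the functional form of the second stage and all parameter values are illustrative, not the thesis's model):

```python
def simulate(M, p, q, adopt_prob, periods):
    """Two-stage diffusion sketch.

    Stage 1 (macro): awareness spreads via a Bass-type flow with
    external influence p and internal (word-of-mouth) influence q,
    over a potential market of size M.
    Stage 2 (micro, simplified): each period, a fraction adopt_prob of
    aware-but-not-yet-adopted individuals decides to adopt.
    Returns the cumulative-adoption path."""
    aware, adopted = 0.0, 0.0
    path = []
    for _ in range(periods):
        # Bass-type awareness flow into the aware pool
        new_aware = (p + q * aware / M) * (M - aware)
        aware += new_aware
        # micro stage: aware non-adopters decide to adopt
        adopted += adopt_prob * (aware - adopted)
        path.append(adopted)
    return path
```

Because awareness requires no decision by the individual, only the second stage would carry the forward-looking behaviour, heterogeneity, and segmented Bayesian learning described in the abstract; here it is collapsed to a single probability purely for illustration.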
69

Reverse Engineering of Content as a Task for Finding Usability Problems: An Evaluative Case Study using the Wikibreathe Tool for Online Creation of Asthma Action Plans

Wan, Flora 17 February 2010 (has links)
After the problem was formulated and the relevant research literature reviewed, a study was performed that compared reverse engineering with traditional scenario-based techniques in usability evaluation. In this case study, an online tool for creating asthma action plans was created and evaluated through questionnaires and focus groups. The tool was then tested in a controlled study using both a traditional scenario-based approach and the reverse engineering method. A group of twelve users built asthma action plans using each method in a randomized order. Results concerning usability, efficiency, and the types of usability problems found were reported, along with recommendations for further research on the use of reverse engineering as a method of usability evaluation.
70

Task Re-allocation Methodologies for Teams of Autonomous Agents in Dynamic Environments

Sheridan, Patricia Kristine 25 August 2011 (has links)
Two on-line task re-allocation methodologies capable of re-allocating agents to tasks on-line for minimum task completion time in dynamic environments are presented herein. The first methodology, the Dynamic Nearest Neighbour (DNN) Policy, is proposed for the operation of a fleet of vehicles in a city-like application of the dial-a-ride problem. The second methodology, the Dynamic Re-Pairing Methodology (DRPM) is proposed for the interception of a group of mobile targets by a dynamic team of robotic pursuers, where the targets are assumed to be highly maneuverable with a priori unknown, but real-time trackable, motion trajectories. Extensive simulations and experiments have verified the DNN policy to be tangibly superior to the first-come-first-served and nearest neighbour policies in minimizing customer mean system time, and the DRPM to be tangibly efficient in the optimal dynamic re-pairing of multiple mobile pursuers to multiple mobile targets for minimum total interception time.
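A greedy sketch of the nearest-neighbour assignment idea underlying the baseline policies the DNN policy is compared against. This is not the DNN policy itself, which re-allocates agents on-line as the environment changes; the vehicle positions and Euclidean distances here are illustrative.

```python
import math

def nearest_neighbour_assign(vehicles, requests):
    """Greedy nearest-neighbour assignment sketch: each vehicle, in
    turn, takes the closest outstanding request.

    vehicles: dict of vehicle id -> (x, y) position
    requests: list of (x, y) request locations
    Returns a dict of vehicle id -> assigned request."""
    assignments = {}
    pending = list(requests)
    for vid, pos in vehicles.items():
        if not pending:
            break  # more vehicles than outstanding requests
        best = min(pending, key=lambda r: math.dist(pos, r))
        assignments[vid] = best
        pending.remove(best)
    return assignments
```

A dynamic policy in the spirit of the DNN would re-run an assignment like this whenever a new request arrives or a vehicle frees up, rather than fixing assignments once.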
