121

Mitigating the impact of gifts-in-kind: an approach to strategic humanitarian response planning using robust facility location

Ingram, Elijah E. January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Jessica L. Heier Stamm / Gifts-in-kind (GIK) donations negatively affect the humanitarian supply chain at the point of receipt near the disaster site. In any disaster, as much as 50 percent of GIK donations are irrelevant to the relief efforts. This is a significant issue for humanitarian organizations because the quantity and type of future GIK are uncertain, making it difficult to account for GIK donations at the strategic planning level. The result is GIK consuming critical warehouse space and manpower. Additionally, improper treatment of GIK can alienate donors and cost the humanitarian organization donations (both cash and GIK) and support. This thesis proposes a robust facility location approach that mitigates the impact of GIK by providing storage space for GIK and pre-positioning supplies to meet initial demand. The setting of the problem is strategic planning for hurricane relief along the Gulf and Atlantic Coasts of the United States. The approach uses a robust, scenario-based method to account for uncertainty in both demand and GIK donations. The model determines the location and number of warehouses in the network, the amount of pre-positioned supplies to meet demand, and the amount of space in each warehouse to alleviate the impact of GIK. The basis of the model is a variant of the covering facility location model that must satisfy all demand and GIK space requirements. A computational study with multiple cost-minimizing objective functions illustrates how the model performs with realistic data. The results show that strategic planning in the preparedness phase of the disaster management cycle can significantly mitigate the impact of GIK.
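
A minimal sketch of the kind of scenario-based covering location model with reserved GIK storage described above, written in Python with the open-source PuLP library; the sites, scenarios, demands, costs, and capacities are hypothetical placeholders, not the thesis's actual formulation or data.

```python
# Sketch: scenario-based covering facility location with reserved GIK space
# (illustrative data only; not the thesis's actual model or parameters).
import pulp

sites = ["Houston", "Mobile", "Jacksonville"]            # candidate warehouses
demand_pts = ["NOLA", "Tampa", "Miami"]
scenarios = ["S1", "S2"]                                  # hurricane scenarios
fixed_cost = {"Houston": 900, "Mobile": 700, "Jacksonville": 800}
demand = {("NOLA", "S1"): 40, ("Tampa", "S1"): 10, ("Miami", "S1"): 5,
          ("NOLA", "S2"): 10, ("Tampa", "S2"): 30, ("Miami", "S2"): 35}
gik_space = {"S1": 25, "S2": 40}                          # pallets of GIK to absorb
covers = {("Houston", "NOLA"), ("Mobile", "NOLA"), ("Mobile", "Tampa"),
          ("Jacksonville", "Tampa"), ("Jacksonville", "Miami")}
capacity = 60                                             # pallets per warehouse

m = pulp.LpProblem("robust_gik_location", pulp.LpMinimize)
open_ = pulp.LpVariable.dicts("open", sites, cat="Binary")
ship = pulp.LpVariable.dicts(
    "ship", [(i, j, s) for i in sites for j in demand_pts for s in scenarios
             if (i, j) in covers], lowBound=0)
gik = pulp.LpVariable.dicts("gik", [(i, s) for i in sites for s in scenarios],
                            lowBound=0)

m += pulp.lpSum(fixed_cost[i] * open_[i] for i in sites)  # minimize fixed cost

for s in scenarios:
    for j in demand_pts:                                  # cover all demand
        m += pulp.lpSum(ship[i, j, s] for i in sites if (i, j) in covers) \
             >= demand[j, s]
    m += pulp.lpSum(gik[i, s] for i in sites) >= gik_space[s]   # absorb GIK
    for i in sites:                                       # capacity only if open
        m += pulp.lpSum(ship[i, j, s] for j in demand_pts if (i, j) in covers) \
             + gik[i, s] <= capacity * open_[i]

m.solve(pulp.PULP_CBC_CMD(msg=0))
print("open warehouses:", [i for i in sites if open_[i].value() > 0.5])
```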
122

Investigations on power consumption, pelleting temperature, pellet quality, and sugar yield in pelleting of cellulosic biomass

Zhang, Qi January 1900 (has links)
Doctor of Philosophy / Department of Industrial and Manufacturing Systems Engineering / Zhijian Pei / Donghai Wang / The U.S. economy has long depended on petroleum-based liquid transportation fuels (such as gasoline, diesel, and jet fuels). Currently, about 50% of the petroleum used in the U.S. is imported. Petroleum is a finite, non-renewable energy source, and its use emits greenhouse gases. Therefore, it is extremely important to develop domestic, sustainable alternatives to petroleum-based liquid transportation fuels. Ethanol produced from cellulosic biomass can be such an alternative. However, several technical barriers have hindered large-scale, cost-effective manufacturing of cellulosic ethanol. One such barrier is the low density of cellulosic feedstocks, which causes high costs in their transportation and storage. Another is low efficiency in the conversion of cellulose to fermentable sugar (pretreatment and enzymatic hydrolysis are the two major conversion processes), which causes high costs in pretreatment and enzymatic hydrolysis of cellulosic biomass. Ultrasonic vibration-assisted (UV-A) pelleting increases both the density and the sugar yield of cellulosic feedstocks. Incorporating UV-A pelleting into cellulosic ethanol manufacturing may therefore help realize cost-effective manufacturing of cellulosic ethanol. This PhD dissertation consists of 13 chapters. An introduction is given in Chapter 1. Chapter 2 presents a literature review on related topics. Experimental studies on the effects of input parameters (such as particle size, pressure, and ultrasonic power) on output parameters (density, durability, stability, and sugar yield) are presented in Chapters 3–4. In Chapters 5–6, comparisons are made between UV-A pelleting and ring-die pelleting (a traditional pelleting method) in terms of pellet properties (density and durability), power consumption, and sugar yield under different conditions. Next, effects of input parameters (such as biomass type, particle size, moisture content, pelleting pressure, and ultrasonic power) on power consumption are studied in Chapters 7–9. Chapter 10 presents an investigation of biomass temperature in UV-A pelleting. Chapter 11 investigates the effects of UV-A pelleting on sugar yield and chemical composition of cellulosic biomass. Chapter 12 investigates the influence of UV-A pelleting on biomass characteristics (such as crystallinity index, thermal properties, and morphological structure). Finally, conclusions are presented in Chapter 13.
123

The BCS algorithm: optimizing crane schedules on multiple bays in conjunction with continuous time simulation

Strieby, James January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd Easton / This thesis introduces the Bay Crane Scheduling (BCS) problem and the related BCS algorithm. The purpose of this algorithm is to optimize the assignment of jobs to overhead cranes as well as the sequence in which each crane performs its assigned jobs. This problem differs from other Overhead Crane Scheduling (OCS) problems in its increased complexity. Previous OCS problems involve a fixed number of cranes operating in a single common area, referred to as a bay, and the cranes are unable to pass over each other. The BCS problem involves a varying number of active cranes operating in multiple bays. Each crane is allowed to move from one bay to the next through specific locations called bridges. This is crucial to completing certain “special” jobs that require two cranes operating in unison to transport an item. The BCS algorithm employs two continuous time simulations in conjunction with an initial job-assignment algorithm and a Simulated Annealing (SA) improvement heuristic to minimize non-productive crane time while avoiding overloading any crane. To the best of the author’s knowledge, this is the first time a continuous time simulation has been used to model an overhead crane system. The BCS algorithm was originally developed for a large manufacturing facility, and when tested against the facility’s current scheduling methods, it showed a 20% improvement in the overall active crane time required to complete an equivalent set of jobs. This improved efficiency is crucial to the facility’s ability to increase its production rate without the addition of new cranes. In addition, BCS is statistically shown to be superior to the current strategy. The results from BCS are substantial, and practitioners are encouraged to utilize BCS’s methodologies to improve other overhead crane systems.
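
A toy sketch of the simulated annealing ingredient, in pure Python; the real BCS algorithm evaluates candidate schedules with a continuous-time simulation, whereas here a simple maximum-workload cost function stands in for that evaluation, and the job times are invented.

```python
# Toy simulated annealing: assign jobs to two cranes to balance workload
# (a stand-in cost; the thesis scores schedules via continuous-time simulation).
import math
import random

random.seed(7)
job_times = [12, 7, 30, 9, 14, 22, 5, 18]            # minutes per job (illustrative)

def cost(assign):
    """Maximum crane workload, a proxy for non-productive crane time."""
    loads = [0.0, 0.0]
    for job, crane in enumerate(assign):
        loads[crane] += job_times[job]
    return max(loads)

assign = [random.randrange(2) for _ in job_times]     # random initial assignment
best, best_cost = assign[:], cost(assign)
temp = 50.0

while temp > 0.1:
    cand = assign[:]
    cand[random.randrange(len(cand))] ^= 1            # move one job to the other crane
    delta = cost(cand) - cost(assign)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        assign = cand                                 # accept improving or uphill move
        if cost(assign) < best_cost:
            best, best_cost = assign[:], cost(assign)
    temp *= 0.95                                      # geometric cooling schedule

print("best split:", best, "max crane load:", best_cost)
```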
124

Accounting for individual choice in public health emergency response planning

Martin, Christopher A. January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Jessica L. Heier Stamm / During public health emergencies, organizations in charge require an immediate and efficient method of distributing supplies over a large geographic area. Because it is uncertain where individuals will choose to receive supplies, these distribution strategies have to account for the unknown demand at each facility. Current techniques rely on population ratios or requests by health care providers, which can increase disparities in individuals' access to medical supplies. This research proposes a mathematical programming model, along with a solution methodology, to inform distribution system planning for public health emergency response. The problem is motivated by distribution planning for pandemic influenza vaccines or countermeasures. The model uses an individual choice constraint to determine which facility each individual will choose to receive supplies from, and it determines where to allocate supplies in order to meet the demand at each facility. The model is solved using a decomposition method, which allows large problems to be solved quickly without losing equity in the solution. In the absence of publicly-available data on actual distribution plans from previous pandemic response efforts, the method is applied to a representative data set. A computational study of equity and the number of people served shows how the model performs compared to the actual data. The results show that implementing an individual choice constraint improves the effectiveness of a public health emergency response campaign without losing equity. The thesis provides several contributions to prior research. The first is an optimization model that implements individual choice in a constraint, determining where individuals will choose to receive their supplies so that improved decisions can be made about where to allocate resources. Another contribution is a solution methodology for large problems based on a decomposition method, which provides a faster response to a public health emergency by splitting the problem into smaller subproblems. This research also provides a computational study using a large data set and an assessment of the impact of using a model that accounts for individual choice in a distribution campaign.
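
One common way to encode an individual choice constraint is to force each population zone to use the closest open facility. The sketch below shows that idea on a hypothetical three-facility instance with PuLP; the zones, distances, costs, and supply figure are invented, and the thesis's equity objective and decomposition scheme are not reproduced.

```python
# Sketch: "closest open facility" individual-choice constraint with supply allocation
# (hypothetical small instance; not the thesis's actual model or data).
import pulp

facilities = ["F1", "F2", "F3"]
zones = {"Z1": 300, "Z2": 500, "Z3": 200}              # population per zone
dist = {("Z1", "F1"): 2, ("Z1", "F2"): 6, ("Z1", "F3"): 9,
        ("Z2", "F1"): 7, ("Z2", "F2"): 3, ("Z2", "F3"): 4,
        ("Z3", "F1"): 8, ("Z3", "F2"): 5, ("Z3", "F3"): 1}
open_cost = {"F1": 100, "F2": 120, "F3": 90}
supply_total = 1000                                     # doses available

m = pulp.LpProblem("choice_allocation", pulp.LpMinimize)
x = pulp.LpVariable.dicts("open", facilities, cat="Binary")
y = pulp.LpVariable.dicts("assign", [(z, f) for z in zones for f in facilities],
                          cat="Binary")
alloc = pulp.LpVariable.dicts("alloc", facilities, lowBound=0)

m += pulp.lpSum(open_cost[f] * x[f] for f in facilities)

for z in zones:
    m += pulp.lpSum(y[z, f] for f in facilities) == 1   # everyone goes somewhere
    for f in facilities:
        m += y[z, f] <= x[f]                             # only to an open site
        for g in facilities:                             # individual choice:
            if dist[z, g] < dist[z, f]:                  # cannot use f if a closer
                m += y[z, f] <= 1 - x[g]                 # site g is open

for f in facilities:                                     # allocate to meet demand
    m += alloc[f] >= pulp.lpSum(zones[z] * y[z, f] for z in zones)
m += pulp.lpSum(alloc[f] for f in facilities) <= supply_total

m.solve(pulp.PULP_CBC_CMD(msg=0))
print({f: (x[f].value(), alloc[f].value()) for f in facilities})
```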
125

Quality engineering applications on single and multiple nonlinear profiles

Chou, Shih-Hsiung January 1900 (has links)
Doctor of Philosophy / Department of Industrial and Manufacturing Systems Engineering / Shing I. Chang / Profile analysis has drawn attention in quality engineering applications due to the growing use of sensors and information technologies. Unlike conventional quality characteristics of interest, a profile is a functional relationship between a response and one or more explanatory variables, and a single profile may contain hundreds or thousands of data points. Conventional charting tools cannot handle such high-dimensional datasets. In this dissertation, six unsolved issues are investigated. First, Chang and Yadama's method (2010) shows competitive results in nonlinear profile monitoring; however, the effectiveness of removing noise from a given nonlinear profile using B-spline fitting with and without wavelet transformation is unclear. Second, many studies address profile analysis by considering only profile shape changes or only variance changes; those methods cannot identify whether the process is out of control due to a mean shift or a variance shift. Third, methods for detecting profile shape changes typically assume that a gold-standard profile exists, which makes them hard to implement directly. Fourth, multiple nonlinear profiles may arise in real-world applications, so conventional single-profile analysis methods may produce high false alarm rates in multiple-profile scenarios. Fifth, multiple nonlinear profiles may also occur in designed experiments; a conventional experimental design treats the response variable as a single value or a vector and cannot handle a response in the form of multiple nonlinear profiles. Finally, profile fault diagnosis is an important step after an out-of-control signal is detected; however, current approaches lead to a large number of combinations when the number of profile sections is large. The organization of this dissertation is as follows. Chapter 1 introduces profile analysis, current solutions, and challenges; Chapters 2 through 4 explore the unsolved challenges in single-profile analysis; Chapters 5 and 6 investigate multiple-profile issues in profile monitoring and experimental design; and Chapter 7 proposes a novel high-dimensional diagnosis control chart that diagnoses the cause of an out-of-control signal with visualization aids. Finally, Chapter 8 summarizes the achievements and contributions of this research.
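
A minimal sketch of the B-spline smoothing step mentioned above, using SciPy on a synthetic nonlinear profile; the profile shape, noise level, smoothing factor, and the crude deviation statistic are all illustrative assumptions, and the wavelet-based variant is not shown.

```python
# Sketch: smoothing one noisy nonlinear profile with a penalized B-spline fit
# (synthetic data; the dissertation also studies wavelet-based denoising).
import numpy as np
from scipy.interpolate import splev, splrep

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)                        # explanatory variable
true_profile = np.sin(2 * np.pi * x) + 0.3 * x        # underlying nonlinear shape
observed = true_profile + rng.normal(0.0, 0.1, x.size)  # noisy sensor profile

# Penalized B-spline fit; the smoothing factor s trades fidelity for smoothness.
tck = splrep(x, observed, s=len(x) * 0.1 ** 2)
fitted = splev(x, tck)

# A crude monitoring statistic: mean squared deviation of the fitted profile
# from a reference curve (here the true shape plays the role of the in-control mean).
deviation = float(np.mean((fitted - true_profile) ** 2))
print("residual noise std:", round(float(np.std(observed - fitted)), 3))
print("deviation statistic:", round(deviation, 5))
```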
126

Generating cutting planes through inequality merging for integer programming problems

Hickman, Randal Edward January 1900 (has links)
Doctor of Philosophy / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Integer Programming (IP) problems are a common type of optimization problem used to solve numerous real-world problems. IPs can require exponential computational effort to solve with the branch and bound technique. A popular method to improve solution times is to generate valid inequalities that serve as cutting planes. This dissertation introduces a new category of cutting planes for general IPs called inequality merging. The inequality merging technique combines two or more low-dimensional inequalities, yielding valid inequalities of potentially higher dimension. The dissertation presents several theoretical results on merged inequalities. The research applies inequality merging to a frequently used class of IPs called multiple knapsack (MK) problems. Theoretical results related to merging cover inequalities are presented, including conditions for validity, conditions for facet-defining inequalities, merging simultaneously over multiple cover inequalities, sequentially merging several cover inequalities on multiple variables, and algorithms that facilitate the development of merged inequalities. Examples demonstrate each of the theoretical discoveries. A computational study experiments with inequality merging techniques using benchmark MK instances. This study provides recommendations for implementing merged inequalities, which result in an average decrease of about 9% in computational time for both small and large MK instances. The research validates the effectiveness of using merged inequalities for MK problems and motivates substantial theoretical and computational extensions as future research.
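
The merging procedure itself is the dissertation's contribution and is not reproduced here; the sketch below only illustrates the standard building block it starts from, a minimal cover inequality derived from a single knapsack constraint, on invented coefficients.

```python
# Sketch: a minimal cover inequality for one knapsack constraint
#   sum_j a[j] * x[j] <= b,  x binary.
# A cover C has total weight > b, so the cut  sum_{j in C} x_j <= |C| - 1  is valid.

def greedy_minimal_cover(a, b):
    """Build a cover greedily, then drop items while it remains a cover."""
    order = sorted(range(len(a)), key=lambda j: a[j], reverse=True)
    cover, weight = [], 0
    for j in order:                       # add heaviest items until weight exceeds b
        cover.append(j)
        weight += a[j]
        if weight > b:
            break
    for j in list(cover):                 # remove items that are not needed
        if weight - a[j] > b:
            cover.remove(j)
            weight -= a[j]
    return cover

a = [8, 6, 5, 4, 3]                       # knapsack coefficients (illustrative)
b = 12
C = greedy_minimal_cover(a, b)
print("cover:", C, "-> cut: sum of x over the cover <=", len(C) - 1)
```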
127

The existence and usefulness of equality cuts in the multi-demand multidimensional knapsack problem

DeLissa, Levi January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Integer programming (IP) is a class of mathematical models useful for modeling and optimizing many theoretical and industrial problems. Unfortunately, IPs are NP-complete, and many integer programs cannot currently be solved. Valid inequalities and their respective cuts are commonly used to reduce the effort required to solve IPs. This thesis poses the questions: do valid equality cuts exist, and can they be useful for solving IPs? Several theoretical results related to valid equalities are presented in this thesis. It is shown that equality cuts exist if and only if the convex hull is not full dimensional. Furthermore, the addition of an equality cut can arbitrarily reduce the dimension of the linear relaxation. In addition to the theory on equality cuts, the idea of infeasibility conditions is presented. Infeasibility conditions introduce a set of valid inequalities whose intersection is the empty set and can be used to rapidly terminate a branch and cut algorithm. Applying the idea of equality cuts to the multi-demand multidimensional knapsack problem (MDMKP) resulted in a new class of cutting planes named anticover cover equality (ACE) cuts. A simple algorithm, FACEBT, is presented for finding ACE cuts in a branching tree with complexity O(m n log n). A brief computational study shows that ACE cuts exist frequently in the MDMKP instances studied: every instance had at least one equality cut, while one instance had over 500,000. Additionally, computationally challenging instances saw an 11% improvement in computational effort. Therefore, equality cuts are a new research topic in IP that can be beneficial for solving some IP instances.
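
A tiny illustration of the dimension argument behind equality cuts: when a knapsack (≤) row and a demand (≥) row pinch the feasible set, every feasible integer point satisfies the same equality, so the convex hull is not full dimensional. The two rows below are a made-up example, not the thesis's ACE construction.

```python
# Sketch: verifying a valid equality cut on a tiny made-up MDMKP-style instance.
from itertools import product

# Hypothetical rows over binary x1, x2, x3:
#   knapsack:  x1 + x2 + x3 <= 1
#   demand:    x1 + x2 + x3 >= 1
feasible = [x for x in product((0, 1), repeat=3) if 1 <= sum(x) <= 1]

holds = all(sum(x) == 1 for x in feasible)
print("feasible points:", feasible)
print("x1 + x2 + x3 = 1 is a valid equality cut:", holds)
```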
128

The NFL true fan problem

Whittle, Scott January 1900 (has links)
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Todd Easton / Throughout an NFL season, 512 games are played in 17 weeks. For a given fan who follows one team, only 16 of those games usually matter, and the rest carry little significance. The goal of this research is to provide substantial reasons for fans to watch other games. This research finds the easiest path to a division championship for each team, where the easiest path requires winning the fewest games. Due to the NFL's complicated tiebreaker rules, games not involving the fan's team can have major implications for that team. The research calls these games critical, because if the wrong team wins, then the fan's team must win additional games to become the division champion. To identify both the easiest path and the critical games, integer programming is used. Given the number of two-team, three-team, and four-team division tie scenarios that can occur, 31 separate integer programs are solved for each team to identify the easiest path to the division championship. A new algorithm, Shortest Path of Remaining Teams (SPORT), iteratively searches through every game of the upcoming week to determine critical games. These integer programs and the SPORT algorithm were applied to data from the previous two NFL seasons. Across those seasons, the earliest a team was eliminated from contention for a division championship was week 12, which occurred in both 2012 and 2013. There was an average of 65 critical games per season, with more critical games occurring in the 2013-2014 season. Additionally, the 2012 season's flex-scheduled games were compared with the critical games for those weeks, and it was found that in three weeks the NFL missed the opportunity to schedule a critical game.
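
A simplified sketch of the "easiest path" idea in PuLP: choose outcomes of the remaining games so the fan's team clinches the division with as few of its own wins as possible. The tiebreaker rules that the thesis handles with 31 separate IPs are ignored here and replaced by a strict one-game lead, and the standings and schedule are invented.

```python
# Sketch: minimum-wins division clinch IP, ignoring NFL tiebreakers (illustrative data).
import pulp

division = ["FAN", "B", "C", "D"]
current_wins = {"FAN": 6, "B": 8, "C": 7, "D": 5}
# Remaining games as (team1, team2); "OUT" marks a non-division opponent.
games = [("FAN", "B"), ("FAN", "C"), ("FAN", "OUT"), ("B", "C"),
         ("B", "OUT"), ("C", "D"), ("D", "OUT"), ("FAN", "D")]

m = pulp.LpProblem("easiest_path", pulp.LpMinimize)
z = pulp.LpVariable.dicts("team1_wins", range(len(games)), cat="Binary")

def final_wins(team):
    """Current wins plus wins chosen for this team over the remaining games."""
    return current_wins[team] + pulp.lpSum(
        z[g] if t1 == team else (1 - z[g])
        for g, (t1, t2) in enumerate(games) if team in (t1, t2))

m += final_wins("FAN")                        # minimize the fan team's season wins
for rival in division:
    if rival != "FAN":                        # strict lead over every division rival
        m += final_wins("FAN") >= final_wins(rival) + 1

m.solve(pulp.PULP_CBC_CMD(msg=0))
print("season wins needed:", int(pulp.value(m.objective)))
for g, (t1, t2) in enumerate(games):
    print(t1, "beats" if z[g].value() > 0.5 else "loses to", t2)
```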
129

Data envelopment analysis with sparse data

Gullipalli, Deep Kumar January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / David H. Ben-Arieh / The quest for continuous improvement among organizations and the issue of missing data in data analysis are never ending. This thesis brings these two topics under one roof: evaluating the productivity of organizations with sparse data. The study uses Data Envelopment Analysis (DEA) to determine the efficiency of 41 member clinics of the Kansas Association of Medically Underserved (KAMU) with missing data. The primary focus of this thesis is to develop new, reliable methods to determine the missing values and to execute DEA. DEA is a linear programming methodology for evaluating the relative technical efficiency of homogeneous decision-making units using multiple inputs and outputs. The effectiveness of DEA depends on the quality and quantity of the data being used. DEA outcomes are susceptible to missing data, creating a need to supplement sparse data in a reliable manner; determining missing values more precisely improves the robustness of the DEA methodology. Three methods for determining the missing values are proposed in this thesis, based on three different platforms. The first method, the Average Ratio Method (ARM), uses the average of all the ratios between two variables. The second method is based on a modified Fuzzy C-Means clustering algorithm that can handle missing data; the issues associated with this clustering algorithm are resolved to improve its effectiveness. The third method is based on an interval approach, in which missing values are replaced by interval ranges estimated by experts, and crisp efficiency scores are identified along the same lines as DEA's selection of the best set of weights. Because there is no unique way to evaluate the effectiveness of these methods, they are tested by choosing a complete dataset and treating varying levels of the data as missing. The best set of recovered missing values, based on the above methods, serves as the input for DEA. Results show that the DEA efficiency scores generated with recovered values are in close proximity to the efficiency scores that would be generated with the complete data. In summary, this thesis provides an effective and practical approach for replacing missing values needed for DEA.
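
For reference, a minimal sketch of the standard input-oriented CCR envelopment model that underlies DEA, solved per decision-making unit with PuLP; the three-unit dataset is invented, and the thesis's clinic data and missing-value imputation steps are not reproduced.

```python
# Sketch: input-oriented CCR DEA efficiency for each decision-making unit
# (illustrative inputs/outputs; not the thesis's clinic dataset).
import pulp

inputs = {"A": [20.0, 5.0], "B": [30.0, 8.0], "C": [25.0, 7.0]}   # e.g. staff, cost
outputs = {"A": [100.0], "B": [160.0], "C": [120.0]}               # e.g. patient visits

def ccr_efficiency(target):
    dmus = list(inputs)
    m = pulp.LpProblem("dea_ccr", pulp.LpMinimize)
    theta = pulp.LpVariable("theta", lowBound=0)
    lam = pulp.LpVariable.dicts("lambda", dmus, lowBound=0)
    m += theta
    for i in range(len(inputs[target])):       # composite unit uses no more input
        m += pulp.lpSum(lam[j] * inputs[j][i] for j in dmus) \
             <= theta * inputs[target][i]
    for r in range(len(outputs[target])):      # ...while matching each output
        m += pulp.lpSum(lam[j] * outputs[j][r] for j in dmus) \
             >= outputs[target][r]
    m.solve(pulp.PULP_CBC_CMD(msg=0))
    return theta.value()

for dmu in inputs:
    print(dmu, "efficiency:", round(ccr_efficiency(dmu), 3))
```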
130

Effects of centerline rumble strips on safety, exterior noise, and operational use of the travel lane

Karkle, Daniel Edgard January 1900 (has links)
Doctor of Philosophy / Department of Industrial & Manufacturing Systems Engineering / Malgorzata J. Rys / Centerline rumble strips (CLRS) are effective in preventing cross-over crashes and are promoted in the United States (U.S.) as a low-cost safety measure. However, there are concerns that question their use under certain road conditions. This dissertation studies these issues and concerns to provide guidance to policy makers on future installations of CLRS, based on current good practices and on the results of specific investigations of exterior noise, safety effectiveness, economics, and drivers’ behavior, including their interaction with shoulders and shoulder rumble strips (SRS). From a survey conducted, good practices in the U.S. were summarized. A before-and-after study of CLRS safety effectiveness showed that total correctable crashes were reduced by 29.21%, crashes involving fatalities and injuries were reduced by 34.05%, cross-over crashes were reduced by 67.19%, and run-off-the-road crashes were reduced by 19.19%. Both Naïve and Empirical Bayes methods were applied and showed statistically similar results, and there was no statistical difference between football-shaped and rectangular-shaped CLRS. The external noise study found that external noise depends on vehicle speed, type of vehicle, and distance. Both football and rectangular CLRS substantially increased the levels of external noise at distances up to 45 m (150 ft); therefore, before installing CLRS, the distance from houses or businesses should be considered, and a distance of 60 m (200 ft) was recommended as the limit of the potential exterior noise problem area. In the study of drivers’ behavior, the analyzed configurations of rumble strips and shoulder width levels affected vehicular lateral position and speed, although the speed deviations were not practically significant. The study of safety performance function models provided technical and economic recommendations for installation of CLRS. Overall, this study recommends the installation of CLRS on two-lane, undivided rural roads in Kansas. Both patterns currently installed in Kansas, rectangular and football, have provided crash reductions, which have been reflected in economic benefits for society. Shoulder width and traffic volume should be considered as crash predictors to enhance the benefits. Guidelines were recommended for better future applications of CLRS.
