  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Hierarchy Generation for Designing Assembly System for Product with Complex Liaison and Sub-Assembly Branches

Unknown Date (has links)
Manufacturers need to deploy their assembly systems in a timely manner to cope with expedited product development. The design of such responsive assembly systems consists of the generation of assembly/subassembly operations and their hierarchies, operation-machine assignment, the selection of machine types and quantities, and the material flow among machines. Exploring all feasible solutions to the assembly operations and their hierarchical relationships is vital to optimizing system designs. This research developed a theoretical framework based on a recursive algorithm to efficiently generate all feasible and non-redundant assembly hierarchies automatically, and investigated its impact on assembly system designs. The research then discussed potential applications of the recursive framework in system optimization, including the joint determination of optimal assembly operations, operation-machine assignment, machine types and quantities, and the material flows among machines. The work was also extended to the optimization of assembly systems for products with complex liaison relations and product families. / A Thesis submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Fall Semester 2015. / November 16, 2015. / assembly hierarchy, assembly system, generation algorithm, product family, system optimization / Includes bibliographical references. / Hui Wang, Professor Directing Thesis; Okenwa Okoli, Committee Member; Arda Vanli, Committee Member.
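The core combinatorial step this abstract describes, recursively enumerating all feasible, non-redundant assembly hierarchies, can be sketched as follows. This is a minimal illustration assuming binary subassembly splits and no liaison constraints; the function and part names are hypothetical, not taken from the thesis:

```python
from itertools import combinations

def hierarchies(parts):
    """Recursively enumerate all binary assembly hierarchies over a set of
    parts. A hierarchy is a nested tuple; because subassembly branches are
    unordered, each split is generated exactly once (non-redundant)."""
    parts = frozenset(parts)
    if len(parts) == 1:
        yield next(iter(parts))
        return
    anchor = min(parts)  # fix one part on the left branch to avoid mirror duplicates
    rest = parts - {anchor}
    for r in range(len(rest) + 1):
        for extra in combinations(sorted(rest), r):
            left = frozenset((anchor,) + extra)
            right = parts - left
            if not right:
                continue  # the split must create two nonempty subassemblies
            for lt in hierarchies(left):
                for rt in hierarchies(right):
                    yield (lt, rt)

trees = list(hierarchies({"A", "B", "C"}))
print(len(trees))  # 3 distinct hierarchies for three parts
```

Fixing one part on the left branch is what removes mirror-image duplicates; for n parts this yields (2n-3)!! distinct hierarchies (3 for three parts, 15 for four), which is why pruning infeasible branches early matters for real products.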
12

Image Segmentation for Extracting Nanoparticles

Unknown Date (has links)
With the advent of nanotechnology, nanomaterials have drastically improved our lives in a very short span of time. The more we can tap into this resource, the more we can change our lives for the better. All applications of nanomaterials depend on how well we can synthesize nanoparticles to a desired shape and size, as these determine the properties, and thereby the functionality, of the nanomaterials. This report therefore focuses on extracting the shapes of nanoparticles from electron microscope images more accurately and more efficiently using image segmentation. By developing an automated image segmentation procedure, we can systematically determine the contours of an assortment of nanoparticles from electron microscope images, reducing data examination and interpretation time substantially. As a result, defects in nanomaterials can be reduced drastically by providing an automated update to the parameters controlling the production of nanomaterials. The report proposes new image segmentation techniques that work very effectively in extracting nanoparticles from electron microscope images. These techniques are realized by imparting new features to the Sliding Band Filter (SBF) method, yielding the Gradient Band Filter (GBF), and by amalgamating the GBF with the Active Contour Without Edges method, followed by fine-tuning of μ (a positive parameter in the Mumford-Shah functional). The incremental improvement in performance (in terms of computation time, accuracy, and false positives) in extracting nanoparticles is portrayed by comparing image segmentation by SBF versus GBF, followed by comparing Active Contour Without Edges versus Active Contour Without Edges fused with the Gradient Band Filter (ACGBF).
In addition, we compare the performance of a new technique, called the Variance Method, for fine-tuning the value of μ against fine-tuning μ based on ground truth, and then gauge the improvement in the performance of image segmentation by ACGBF with a fine-tuned value of μ over ACGBF with an arbitrary value of μ. / A Thesis submitted to the Department of Industrial & Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Fall Semester 2015. / November 09, 2015. / Active Contours, Image Segmentation, Nanoparticles, Sliding Band Filter / Includes bibliographical references. / Chiwoo Park, Professor Directing Thesis; Abhishek Shrivastava, Committee Member; Tao Liu, Committee Member; Adrian Barbu, Committee Member.
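The band-filter idea behind SBF/GBF, scoring each candidate particle center by the strongest edge response found inside a radial band, can be sketched roughly as below. This is a simplified illustration on a synthetic image; the function names, band limits, and the exact edge measure are assumptions for the sketch, not the thesis's GBF definition:

```python
import math

def make_disk(n, cx, cy, r):
    # synthetic grayscale image: a bright disk stands in for a nanoparticle
    return [[1.0 if (x - cx)**2 + (y - cy)**2 <= r * r else 0.0
             for x in range(n)] for y in range(n)]

def radial_gradient(img, cx, cy, angle, radius):
    # finite-difference intensity drop along the outward ray (edge strength)
    def at(x, y):
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < len(img) and 0 <= xi < len(img[0]):
            return img[yi][xi]
        return 0.0
    inner = at(cx + (radius - 1) * math.cos(angle), cy + (radius - 1) * math.sin(angle))
    outer = at(cx + radius * math.cos(angle), cy + radius * math.sin(angle))
    return inner - outer  # positive when intensity falls moving outward

def band_score(img, cx, cy, rmin, rmax, n_angles=32):
    """Sliding-band-style score: for each ray, take the strongest outward
    edge anywhere inside the band [rmin, rmax], then average over rays."""
    total = 0.0
    for k in range(n_angles):
        a = 2 * math.pi * k / n_angles
        total += max(radial_gradient(img, cx, cy, a, r)
                     for r in range(rmin, rmax + 1))
    return total / n_angles

img = make_disk(40, 20, 20, 8)
print(band_score(img, 20, 20, 5, 12) > band_score(img, 5, 5, 5, 12))  # True: the true center scores higher
```

Letting the edge radius "slide" independently per ray is what makes this family of filters tolerant of irregular, non-circular particle outlines.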
13

Supply Prepositioning for Disaster Management

Unknown Date (has links)
This thesis studies two-stage stochastic optimization methods for supply prepositioning in hurricane relief logistics. The first stage determines where to preposition supplies and how much to preposition at each location. The second stage decides the amount of supplies distributed from supply centers to demand centers. The proposed methods are (I) a method that minimizes the expected total cost and (II) a method that minimizes the variance of the total cost while accounting for the uncertainties in the parameters of the expected cost model. For method II, a Bayesian model and a robust stochastic programming solution approach are proposed, in which the cost function parameters are treated as uncertain random variables. We propose a Mixed Integer Programming model that can be solved efficiently; the linear and nonlinear integer programming problems are solved using the CPLEX and FILMINT solvers, respectively. A computational case study comprising real-world hurricane scenarios is conducted to illustrate how the proposed methods work on a practical problem. A buffer zone is specified so that commodities are sent within a certain distance. Estimation of hurricane landfall probabilities and the effect of cost uncertainty on prepositioning decisions are also considered. / A Thesis submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Spring Semester 2018. / April 18, 2018. / bayesian analysis, disaster relief, inventory management, optimization, stochastic programming / Includes bibliographical references. / Arda Vanli, Professor Directing Thesis; Hui Wang, Committee Member; Chiwoo Park, Committee Member; Eren Erman Ozguven, Committee Member.
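A deliberately tiny sketch of the two-stage structure of method I: the first stage fixes prepositioned stock before the hurricane scenario is revealed, and the second stage pays for any shortfall. All numbers are illustrative, and brute-force enumeration stands in for the MIP the thesis solves with CPLEX/FILMINT:

```python
from itertools import product

# toy two-stage prepositioning model (illustrative numbers, not from the thesis):
# first stage  - how many units to preposition at each of two candidate sites
# second stage - serve the realized scenario demand; unmet demand pays a penalty
sites = ["A", "B"]
hold_cost = {"A": 1.0, "B": 1.5}   # per unit prepositioned
penalty = 10.0                     # per unit of unmet demand
scenarios = [                      # (probability, total demand)
    (0.5, 3),
    (0.3, 6),
    (0.2, 9),
]

def expected_cost(stock):
    first = sum(hold_cost[s] * q for s, q in zip(sites, stock))
    second = 0.0
    for p, demand in scenarios:
        shortage = max(0, demand - sum(stock))
        second += p * penalty * shortage   # scenario-weighted recourse cost
    return first + second

# enumerate all first-stage decisions (a solver replaces this at scale)
best = min(product(range(10), repeat=2), key=expected_cost)
print(best, round(expected_cost(best), 2))  # (9, 0) 9.0
```

The key feature the sketch preserves is that the first-stage decision is shared across all scenarios while the recourse cost is averaged over them; a variance-minimizing objective (method II) would replace `expected_cost` with a variance term over the same scenario set.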
14

Sparsity-Regularized Learning for Nano-Metrology

Unknown Date (has links)
The key objective of nanomaterial metrology is to extract relevant information on nano-structure for quantitatively correlating structure-property with functionality. Historic improvements in instrumentation platforms have enabled comprehensive capture of the information stream both globally and locally. For example, an impressive advance in scanning transmission electron microscopy (STEM) is access to vibrational spectroscopic signals such as atomically resolved electron energy loss spectroscopy (EELS) and, most recently, ptychography. This is particularly pertinent in the scanning probe microscopy (SPM) community, which has seen a rapidly growing trend towards simultaneous capture of multiple imaging channels and increasing data sizes. Meanwhile, signal processing analysis has remained the same, depending on simple physics models. This approach by definition ignores the material behaviors associated with deviations from simple physics models, which hence require more complex dynamic models. Introduction of such models, in turn, can lead to spurious growth of free parameters, potential overfitting, etc. To derive the signal analysis pathways necessitated by the large, complex datasets generated by progress in instrumentation hardware, here we propose data-physics inference driven approaches for high-veracity and information-rich nanomaterial metrology. Mathematically, we found structural sparsity regularizations extremely useful; they are explained in the corresponding applications in later chapters. In a nutshell, we overview the following contributions: 1. We proposed a physics-infused semi-parametric regression approach for estimating the size distribution of nanoparticles from DLS measurements, yielding more details of the size distribution than the traditional methodology. Our methodology expands the capability of DLS to characterize heterogeneously shaped nanoparticles. 2. We proposed a two-level structural sparsity regularized regression model and correspondingly developed a variant of the group orthogonal matching pursuit algorithm for simultaneously estimating global periodic structure and detecting local outlier structures in noisy STEM images. We believe this is an important step toward automatic phase. 3. We developed and implemented a universal real-time image reconstruction algorithm from rapid and sparse STEM scans for non-invasive and high-dynamic-range imaging. We built and open-sourced a systematic platform that fundamentally pushes the evolution of STEM for both imaging and e-beam based atom-by-atom fabrication, forming a marriage between the imaging and manipulation modes via intelligent and adaptive responses to real-time material evolution. / A Dissertation submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Summer Semester 2018. / July 6, 2018. / Includes bibliographical references. / Chiwoo Park, Professor Directing Dissertation; Anuj Srivastava, University Representative; Zhiyong Liang, Committee Member; Arda Vanli, Committee Member.
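The matching-pursuit step underlying contribution 2 can be sketched in plain (ungrouped) form: greedily add the dictionary column most correlated with the residual, then refit all selected coefficients by least squares. This sketch omits the group structure and the two-level sparsity of the dissertation's variant, and the dictionary below is a made-up example:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def col(A, j):
    return [row[j] for row in A]

def solve(M, b):
    # Gaussian elimination with partial pivoting, for small dense systems
    n = len(M)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def omp(A, y, k):
    """Orthogonal matching pursuit: pick the column most correlated with the
    residual, then refit every selected coefficient by least squares."""
    m, n = len(A), len(A[0])
    support, residual = [], y[:]
    for _ in range(k):
        j = max((j for j in range(n) if j not in support),
                key=lambda j: abs(dot(col(A, j), residual)))
        support.append(j)
        cols = [col(A, j) for j in support]
        G = [[dot(ci, cj) for cj in cols] for ci in cols]  # Gram matrix
        coef = solve(G, [dot(ci, y) for ci in cols])       # normal equations
        fit = [sum(c * cj[i] for c, cj in zip(coef, cols)) for i in range(m)]
        residual = [y[i] - fit[i] for i in range(m)]
    return dict(zip(support, coef))

# sparse recovery demo: y = 2*col0 - 3*col2 in a 4x4 dictionary
A = [[1, 0, 0, 1],
     [0, 1, 0, 1],
     [0, 0, 1, 1],
     [1, 1, 1, 0]]
y = [2, 0, -3, -1]
print(omp(A, y, 2))  # {2: -3.0, 0: 2.0}
```

The group variant replaces "one column at a time" with "one predefined block of columns at a time", which is what lets the dissertation's model treat a local outlier structure (several coefficients) as a single unit.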
15

Application of Experimental Design for Efficient Wind Tunnel Testing: The Tandem Wing Mav Case

Unknown Date (has links)
Micro air vehicles (MAVs) are small-scale unmanned aerial vehicles (UAVs) that are used for reconnaissance, intelligence gathering, and battle damage assessment. The U.S. Air Force Research Lab Munitions Directorate develops MAVs for various defense missions. The case involves a tandem wing MAV that is designed to have retractable wings for transport, control surfaces on the aft wing, and two different vertical tail configurations. Wind tunnel testing is one of the vital steps in MAV development for evaluating and ensuring that stability and control requirements are met for sustained flight. Traditionally, wind tunnel tests have been performed using a one-factor-at-a-time (OFAT) approach. Wind tunnel OFAT involves testing at many levels of one particular factor, usually angle of attack (AoA), while holding all other input factors constant; this technique is then repeated for various input factor configurations. This classic approach can be useful in determining the effect that each input alone has on the desired response. However, OFAT is not capable of identifying the influence on the response of inputs interacting with one another, which commonly affects aircraft performance. Furthermore, OFAT is not capable of characterizing the uncertainty that is present in experimentation. The research objective is to develop a testing strategy that provides an efficient number of test points to run in the wind tunnel, effectively characterizing the aerodynamic behavior of MAVs as a function of design changes, changes in attitude, and control inputs, while reducing costs and resources using design of experiments (DOE) and response surface methods (RSM). The research involves one of the first applications of second-order split-plot designs, as well as the traditional completely randomized design. The DOE/RSM approach will be directly compared to the traditional OFAT wind tunnel testing that is performed during the same test period.
The analyses resulting from the DOE/RSM approach will highlight its capabilities in identifying factor interactions, characterizing system uncertainty, and providing stability and control analyses, the common objectives of wind tunnel testing. The outcome of the study will demonstrate the effectiveness of DOE/RSM techniques when tailored to meet the specifications of wind tunnel testing. Some characteristics of the wind tunnel environment are low noise, qualitative factors, hard-to-change factors, and second-order models. Experimental design techniques adapted to traditional wind tunnel testing will provide a powerful approach to characterizing and optimizing aerodynamic systems. / A Thesis submitted to the Department of Industrial Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Degree Awarded: Spring Semester, 2007. / Date of Defense: April 5, 2007. / Design of Experiments, Completely Randomized Design, Response Surface Methods, Split-plot Design, Micro Air Vehicle (MAV) / Includes bibliographical references. / James R. Simpson, Professor Directing Thesis; Drew Landman, Committee Member; Okenwa I. Okoli, Committee Member; Joseph J. Pignatiello, Committee Member.
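The DOE-versus-OFAT point made in this abstract can be made concrete with a toy 2² example. With a hypothetical response containing a genuine A×B interaction, the four factorial runs recover all three coefficients, while OFAT's slope estimate for A is biased by the interaction (the numbers below are invented for illustration, not wind tunnel data):

```python
from itertools import product

# hypothetical response surface with a real interaction term:
# y = 10 + 2*A + 1*B + 3*A*B, with A and B coded as -1/+1
def response(a, b):
    return 10 + 2 * a + 1 * b + 3 * a * b

# full 2^2 factorial design: all four coded factor combinations
runs = [(a, b, response(a, b)) for a, b in product([-1, 1], repeat=2)]

# regression coefficients for +/-1 coding: contrast divided by N = 4 runs
main_A      = sum(a * y for a, b, y in runs) / 4
main_B      = sum(b * y for a, b, y in runs) / 4
interaction = sum(a * b * y for a, b, y in runs) / 4
print(main_A, main_B, interaction)  # 2.0 1.0 3.0 -- interaction recovered

# OFAT: vary A while holding B fixed at -1; the apparent slope of A
# is 2 + 3*(-1) = -1, confounded with the interaction it cannot see
ofat_A = (response(1, -1) - response(-1, -1)) / 2
print(ofat_A)  # -1.0
```

The same contrast arithmetic extends to split-plot designs; what changes there is the error structure used to test each effect, not the effect estimates themselves.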
16

Fault Diagnosis in Multivariate Manufacturing Processes

Unknown Date (has links)
As manufacturing systems become more complex, the use of multivariate fault detection and diagnosis methods is increasingly important. Effective fault detection and diagnosis methods can minimize the costs of rework, plant downtime, and maintenance time, and improve reliability and safety. This thesis proposes a Principal Components Analysis (PCA)-based root cause identification approach for quality improvement in complex manufacturing processes. Simulation studies are presented to demonstrate the improved diagnosability of the proposed approach compared to existing methods. / A Thesis submitted to the Department of Industrial and Manufacturing Engineering in partial fulfillment of the requirements for the degree of Master of Science. / Degree Awarded: Summer Semester, 2010. / Date of Defense: June 11, 2010. / Multivariate Analysis, Principal Components Analysis (PCA), Contribution Plot, Root Cause Identification / Includes bibliographical references. / Arda Vanli, Professor Directing Thesis; Ben Wang, Committee Member; Chuck Zhang, Committee Member; Joseph J. Pignatiello, Jr., Committee Member.
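A minimal sketch of the PCA-plus-contribution idea behind such root cause identification, using two variables so the leading eigenvector has a closed form. The data and the contribution definition here are illustrative, not the thesis's exact method:

```python
import math

# small bivariate in-control dataset (illustrative numbers, roughly y = 2x)
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.9), (5.0, 10.1)]

n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)

# leading eigenpair of the 2x2 covariance matrix (closed form)
lam = (sxx + syy) / 2 + math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
v = (sxy, lam - sxx)
norm = math.hypot(*v)
v = (v[0] / norm, v[1] / norm)

def contributions(obs):
    """Per-variable contributions to the first-PC score of an observation:
    the variable with the large entry is the likely root cause."""
    dx, dy = obs[0] - mx, obs[1] - my
    return (v[0] * dx, v[1] * dy)

fault = (3.0, 1.0)  # y is far below its usual relation with x
cx, cy = contributions(fault)
print(abs(cy) > abs(cx))  # True: variable 2 is flagged
```

Full contribution plots do this across all retained components (and often against a T² or SPE statistic); the decomposition of a flagged score into per-variable terms is the common core.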
17

Cutting Planes for Convex Objective Nonconvex Optimization

Michalka, Alexander January 2013 (has links)
This thesis studies methods for tightening relaxations of optimization problems with convex objective values over a nonconvex domain. A class of linear inequalities obtained by lifting easily obtained valid inequalities is introduced, and it is shown that this class of inequalities is sufficient to describe the epigraph of a convex and differentiable function over a general domain. In the special case where the objective is a positive definite quadratic function, polynomial time separation procedures using the new class of lifted inequalities are developed for the cases when the domain is the complement of the interior of a polyhedron, a union of polyhedra, or the complement of the interior of an ellipsoid. Extensions for positive semidefinite and indefinite quadratic objectives are also studied. Applications and computational considerations are discussed, and the results from a series of numerical experiments are presented.
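A schematic of the lifting idea, in illustrative notation (the symbols and the single-halfspace setting below are simplifying assumptions for exposition, not the thesis's exact construction). For a convex differentiable objective f with epigraph {(x, z) : z ≥ f(x), x ∈ S}, the ordinary gradient cut at a point x̄ is valid everywhere by convexity; when every feasible x additionally satisfies aᵀx ≥ b (for instance, on one side of an excluded polyhedron's interior), the cut can be strengthened:

```latex
% gradient cut: valid for all x by convexity of f
z \;\ge\; f(\bar{x}) + \nabla f(\bar{x})^{\top}(x - \bar{x})

% lifted cut: valid only over the nonconvex domain S \subseteq \{x : a^{\top}x \ge b\}
z \;\ge\; f(\bar{x}) + \nabla f(\bar{x})^{\top}(x - \bar{x}) + \alpha\,(a^{\top}x - b),
\quad
\alpha = \max\bigl\{\alpha \ge 0 \;:\;
  f(x) \ge f(\bar{x}) + \nabla f(\bar{x})^{\top}(x - \bar{x}) + \alpha\,(a^{\top}x - b)
  \;\;\forall x \in S \bigr\}
```

The larger the lifting coefficient α, the tighter the cut wherever aᵀx > b; the polynomial-time separation results concern finding such violated lifted cuts efficiently for the structured domains the thesis studies (polyhedral complements, unions of polyhedra, ellipsoid complements).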
18

Novel Statistical Learning Methods for Multi-Modality Heterogeneous Data Fusion in Health Care Applications

January 2019 (has links)
abstract: With the development of computer and sensing technology, rich datasets have become available in many fields such as health care, manufacturing, and transportation, just to name a few. Moreover, data often come from multiple heterogeneous sources or modalities; this is a common phenomenon in health care systems. While multi-modality data fusion is a promising research area, there are several special challenges in health care applications. (1) The integration of biological and statistical models is a big challenge. (2) It is commonplace that data from various modalities are not available for every patient due to cost, accessibility, and other reasons. This results in a special missing-data structure in which different modalities may be missing in “blocks”. How to train a predictive model using such a dataset therefore poses a significant challenge to statistical learning. (3) It is well known that different modality data may contain different aspects of information about the response; current studies do not adequately address this problem. My dissertation includes new statistical learning model development to address each of the aforementioned challenges, as well as application case studies using real health care datasets, included in three chapters (Chapters 2, 3, and 4), respectively. Collectively, my dissertation is expected to provide a new set of statistical learning models, algorithms, and theory contributing to multi-modality heterogeneous data fusion, driven by the unique challenges in this area. Application of these new methods to important medical problems using real-world datasets is expected to provide solutions to these problems, thereby contributing to the application domains. / Dissertation/Thesis / Doctoral Dissertation Industrial Engineering 2019
19

Extensions of the dual-resource constrained flexible job-shop scheduling problem

January 2019 (has links)
abstract: The shift in focus of manufacturing systems to high-mix and low-volume production poses a challenge to both efficient scheduling of manufacturing operations and effective assessment of production capacity. This thesis considers the problem of scheduling a set of jobs that require machine and worker resources to complete their manufacturing operations. Although planners in manufacturing contexts typically focus solely on machines, schedules that only consider machining requirements may be problematic during implementation because machines need skilled workers and cannot run unsupervised. The model used in this research will be beneficial to these environments as planners would be able to determine more realistic assignments and operation sequences to minimize the total time required to complete all jobs. This thesis presents a mathematical formulation for concurrent scheduling of machines and workers that can optimally schedule a set of jobs while accounting for changeover times between operations. The mathematical formulation is based on disjunctive constraints that capture the conflict between operations when trying to schedule them to be performed by the same machine or worker. An additional formulation extends the previous one to consider how cross-training may impact the production capacity and, for a given budget, provide training recommendations for specific workers and operations to reduce the makespan. If training a worker is advantageous to increase production capacity, the model recommends the best time window to complete it such that overlaps with work assignments are avoided. It is assumed that workers can perform tasks involving the recently acquired skills as soon as training is complete. 
As an alternative to the mixed-integer programming formulations, this thesis provides a math-heuristic approach that fixes the order of some operations based on Largest Processing Time (LPT) and Shortest Processing Time (SPT) procedures, while allowing the exact formulation to find the optimal schedule for the remaining operations. Computational experiments include the use of the solution for the no-training problem as a starting feasible solution to the training problem. Although the models provided are general, the manufacturing of Printed Circuit Boards is used as a case study. / Dissertation/Thesis / Masters Thesis Industrial Engineering 2019
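The LPT procedure used to fix operation orders in the math-heuristic can be sketched for the simpler parallel-machine case (worker constraints and changeovers omitted; the job durations are illustrative):

```python
import heapq

def lpt_schedule(durations, n_machines):
    """Largest-Processing-Time list scheduling: sort jobs by decreasing
    duration, then always give the next job to the least-loaded machine.
    In the math-heuristic this kind of rule fixes a sensible operation
    order, leaving the exact model to optimize what remains."""
    loads = [(0.0, m) for m in range(n_machines)]  # (load, machine) min-heap
    heapq.heapify(loads)
    assignment = {}
    for job, d in sorted(enumerate(durations), key=lambda kv: -kv[1]):
        load, m = heapq.heappop(loads)   # least-loaded machine so far
        assignment[job] = m
        heapq.heappush(loads, (load + d, m))
    makespan = max(load for load, _ in loads)
    return assignment, makespan

durations = [7, 5, 4, 3, 2, 2]
assignment, makespan = lpt_schedule(durations, 2)
print(makespan)  # 12.0 with two machines
```

Placing the long jobs first is what makes LPT a good order-fixing rule: the short jobs left at the end act as filler that the exact formulation (or the heap itself, in this sketch) can balance across machines.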
20

Decision making under uncertainty in the emergency department: studying the effects of cognitive biases in the diagnosis of sepsis

Noonan, Thomas Zachary 01 May 2018 (has links)
This was a retrospective study analyzing the diagnosis of sepsis, a severe systemic reaction to infection, in the emergency department. Sepsis is one of the leading causes of hospital mortality, and despite an increased focus on sepsis awareness in recent years, the rates of sepsis are increasing. Both the root causes and the bodily effects of sepsis are varied, which makes screening (the identification of potentially septic patients) and diagnosis (the identification of sepsis by a medical professional) extremely difficult. In the face of this uncertainty, several attempts have been made to formalize the definition of sepsis, including the systemic inflammatory response syndrome (SIRS) criteria. These well-defined criteria can be used to design screens for identifying septic patients via their electronic health records (EHRs), but such alerts tend not to be very selective and as such produce many false alarms. The aim of this study was to determine how these alerts affect the decision making of physicians in the emergency department with regard to sepsis diagnosis. More specifically, the goal was to determine whether any of a number of well-known cognitive biases (sequential contrast effects, confirmation bias, and representativeness) could be detected in relation to sepsis diagnosis. Using a retrospective dataset of patients for whom SIRS alerts were triggered, a set of behavioral criteria was designed using standard sepsis treatment procedures to determine the physicians' diagnoses of those patients. The distribution of these diagnoses and the relationship between past alerts and diagnosis rates were analyzed. The patterns found in these analyses were consistent with what would be expected of decisions made under the influence of the identified biases. Additionally, a correlation was found between past alerts and the amount of information physicians use to make diagnoses, lending further evidence to this conclusion.
These results could be used to help design better alerts in the future or to improve the way medical information is presented to physicians to prevent biases from occurring in sepsis diagnosis.
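For concreteness, an EHR screen built on the SIRS criteria discussed above looks roughly like this. The thresholds are the standard published SIRS cut-offs; the function names and the two-or-more alert rule form an illustrative sketch, not the study's actual implementation:

```python
def sirs_criteria(temp_c, heart_rate, resp_rate, wbc_k, paco2=None, bands_pct=None):
    """Count how many of the four standard SIRS criteria are met:
    temperature >38 or <36 C; heart rate >90/min; respiratory rate >20/min
    or PaCO2 <32 mmHg; WBC >12 or <4 (x10^3/mm^3) or >10% bands."""
    met = 0
    met += (temp_c > 38.0) or (temp_c < 36.0)
    met += heart_rate > 90
    met += (resp_rate > 20) or (paco2 is not None and paco2 < 32)
    met += ((wbc_k > 12.0) or (wbc_k < 4.0)
            or (bands_pct is not None and bands_pct > 10))
    return met

def sirs_alert(**vitals):
    # SIRS is defined as meeting two or more criteria, which is why
    # screens built on it fire often and are not very selective
    return sirs_criteria(**vitals) >= 2

print(sirs_alert(temp_c=38.6, heart_rate=104, resp_rate=18, wbc_k=9.0))  # True
print(sirs_alert(temp_c=37.0, heart_rate=80, resp_rate=16, wbc_k=7.5))   # False
```

Because each criterion is common in non-septic ED patients (fever, tachycardia), the conjunction-of-two rule trades specificity for sensitivity, which is the false-alarm behavior the study connects to physician bias.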
