1

THE RELATIONSHIP BETWEEN COMPONENT AND PRODUCT QUALITY IN MANUFACTURING, WITH EMPHASIS ON COMPETITIVENESS

Yue Wang (10710720) 27 April 2021
The capability to continuously produce good-quality products with high productivity and low cost is critical for manufacturers. Products are generally made up of components, which enable the product to perform its purpose. A complex product may be assembled from many components through multiple assembly stages, and any quality defect in a component may build up in the product. A good understanding of how component quality impacts product quality in a complex manufacturing system is therefore essential for maintaining a manufacturer's competitiveness.

In this research, a series of quality management models is proposed based on studying the relationship between component quality and product quality. Optimal quality control increases a manufacturer's competitiveness, since it helps reduce cost, increase production, and limit environmental impact. The research begins with the tolerance allocation problem, which is fundamental to managing the tradeoff between quality, productivity, cost, and waste. First, a tolerance allocation method that minimizes cost is proposed. This model jointly considers process variation and tolerance specifications, and the relations among manufacturer, user, design, and processing are embedded in the cost model. To address tolerance allocation at its root cause, i.e., the variations in production processes, a second tolerance allocation model is then provided. This model considers both product design (tolerance selection) and operation planning (production rate selection), accounting for the relations among production rate, production cost, processing precision, and waste. Furthermore, a new process control model that extends traditional statistical process control techniques is proposed. Data acquired from a manufacturing system usually take the form of time series, and anomalies in the time series are generally related to quality defects. A new method is developed that can detect anomalies in time series data of long length and high dimensionality. This model is based on recurrent neural networks whose parameters can be trained on data acquired during routine operation of a manufacturing system. This is very beneficial because few data are labeled as anomalies, since anomalies should be rare events in a well-managed system. Last, quality control of remanufacturing is studied. A component-oriented reassembly model is proposed to manage the varied quality of returned components and the varied needs of customers. In this model, returned components are inspected, assigned scores according to their quality and function, and categorized in a reassembly inventory. Based on the reassembly inventory, components are paired under the control of a reassembly strategy, and a reassembly-score iteration algorithm is developed to identify the optimal strategy. The proposed model can reassemble products to meet a larger variety of customer needs while simultaneously producing better remanufactured products.

In summary, this dissertation presents a series of novel quality management models to maintain manufacturers' competitiveness. These models are based on studying the factors that impact component and product quality at multiple stages of the product life cycle. Analyzing the relationship between component and product quality proves to be an effective way of improving product quality, saving cost, and reducing the environmental impact of manufacturing.
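The tolerance allocation tradeoff described above can be illustrated as a small optimization. The reciprocal cost model, the candidate tolerance values, and the root-sum-square stack-up constraint below are illustrative assumptions, not the dissertation's actual models:

```python
import itertools
import math

def allocate_tolerances(cost_coeffs, assembly_tol, candidates):
    """Brute-force search for the cheapest per-component tolerance
    assignment whose root-sum-square stack-up meets the assembly spec.
    cost_coeffs: one coefficient per component for a reciprocal cost
    model cost_i = a_i / t_i (tighter tolerance -> higher cost)."""
    best = None
    for tols in itertools.product(candidates, repeat=len(cost_coeffs)):
        # Root-sum-square stack-up of independent component tolerances.
        stack = math.sqrt(sum(t * t for t in tols))
        if stack > assembly_tol:
            continue
        cost = sum(a / t for a, t in zip(cost_coeffs, tols))
        if best is None or cost < best[0]:
            best = (cost, tols)
    return best

# Three components, assembly tolerance 0.10 mm, candidate tolerances in mm.
# The third component is most expensive to tighten, so it gets the loosest tolerance.
cost, tols = allocate_tolerances([1.0, 1.0, 4.0], 0.10, [0.02, 0.04, 0.06, 0.08])
print(tols, round(cost, 1))  # -> (0.04, 0.04, 0.08) 100.0
```

The same search structure extends to the dissertation's richer settings (production rate, waste) by adding terms to the objective.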
2

The Sustainable Manufacturing System Design Decomposition

Onkar V Sonur (9726050) 16 December 2020
With the growing importance of the manufacturing sector, there is tremendous demand for innovative ways to design manufacturing systems. Although several methodologies are available for designing manufacturing systems, most changes do not persist over the long term. Numerous elements contribute to the issues that impede sustainability in manufacturing industries, such as the common design approach of applying solutions without understanding system requirements and the appropriate thinking processes.

With a Sustainable Manufacturing System Design Decomposition (SMSDD), the precise pitfalls and areas of improvement can be well understood. The SMSDD enables members of an organization to collectively map the customer's needs, identifying the requirements of the system design and the associated solutions. In this thesis, the SMSDD is developed to design manufacturing systems that maximize the potential of an enterprise to create an efficient and sustainable manufacturing system.

In addition to supporting the design of new manufacturing systems and the re-design of existing ones, the SMSDD provides a potent tool for analyzing the design of existing manufacturing systems. The SMSDD follows the steps of the Collective System Design methodology to produce efficient and sustainable manufacturing systems, and can therefore be applied to a broad range of manufacturing systems.
3

Quality Control for Manufactured Weight Plates

Austin Joseph Bridenthal (16485171) 26 April 2024
The study aims to demonstrate the need for higher-quality production of weight plates covered by US6746380B2 (expired May 10, 2021), assigned to USA Sports Inc. A literature review justifies the quality control standards and validates the posed hypotheses. A product study is then completed in which weight plate specimens are selected for physical quality testing to demonstrate that weight plates repeatedly fall outside the designated weight tolerance. Findings from the physical product testing are compared to determine differences in levels of quality control. Through extensive product testing (quantified in the study's research methodology), novel quality control practices are identified for product improvement in the study's recommendations. Next steps are suggested to improve the understanding and use of quality control in working toward a sustainable and consistently high-quality product. Findings from the study are available for use by companies in the fitness industry that produce weight plates.
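The core pass/fail check behind such physical testing can be sketched briefly; the ±2% band and the sample weights below are hypothetical, not the study's actual specification or data:

```python
def out_of_tolerance(measured, nominal, tol_pct):
    """Flag plates whose measured weight falls outside nominal +/- tol_pct %.
    Returns (flagged_weights, out_of_tolerance_rate)."""
    lo = nominal * (1 - tol_pct / 100)
    hi = nominal * (1 + tol_pct / 100)
    flagged = [w for w in measured if not (lo <= w <= hi)]
    return flagged, len(flagged) / len(measured)

# Five hypothetical 45 lb plates checked against a +/-2% band (44.1-45.9 lb).
flagged, rate = out_of_tolerance([44.2, 45.3, 46.2, 44.8, 43.9], 45.0, 2.0)
print(flagged, rate)  # -> [46.2, 43.9] 0.4
```

Repeating this over many specimens and brands yields the between-producer quality comparison the study describes.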
4

A tale of two applications: closed-loop quality control for 3D printing, and multiple imputation and the bootstrap for the analysis of big data with missingness

Wenbin Zhu (12226001) 20 April 2022
1. A Closed-Loop Machine Learning and Compensation Framework for Geometric Accuracy Control of 3D Printed Products

Additive manufacturing (AM) systems enable direct printing of three-dimensional (3D) physical products from computer-aided design (CAD) models. Despite the many advantages that AM systems have over traditional manufacturing, one significant limitation that impedes their wide adoption is geometric inaccuracy: shape deviations between the printed product and the nominal CAD model. Machine learning for shape deviations can enable geometric accuracy control of 3D printed products via the generation of compensation plans, which are modifications of CAD models informed by the machine learning algorithm that reduce deviations in expectation. However, existing machine learning and compensation frameworks cannot accommodate deviations of fully 3D shapes with different geometries, and their feasibility for geometric accuracy control is further limited by resource constraints in AM systems that prevent the printing of multiple copies of new shapes.

We present a closed-loop machine learning and compensation framework that can improve geometric accuracy control of 3D shapes in AM systems. Our framework is based on a Bayesian extreme learning machine (BELM) architecture that leverages data and deviation models from previously printed products to transfer deviation models, and more accurately capture deviation patterns, for new 3D products. The closed-loop nature of compensation under our framework, in which past compensated products that do not adequately meet dimensional specifications are fed into the BELMs to re-learn the deviation model, enables the identification of effective compensation plans and satisfies resource constraints by printing only one new shape at a time. The power and cost-effectiveness of our framework are demonstrated with two validation experiments involving different geometries on a Markforged Metal X AM machine printing 17-4 PH stainless steel products. As demonstrated in our case studies, our framework can reduce shape inaccuracies by 30% to 60% (depending on a shape's geometric complexity) in at most two iterations, with three training shapes and one or two test shapes per geometry across the iterations. We also perform an additional validation experiment using a third geometry to establish the capabilities of our framework for prospective shape deviation prediction of 3D shapes that have never been printed before. This third experiment indicates that choosing one suitable class of past products for prospective prediction and model transfer, instead of including all past printed products with different geometries, could be sufficient for obtaining deviation models with good predictive performance. Ultimately, our closed-loop machine learning and compensation framework provides an important step towards accurate and cost-efficient deviation modeling and compensation for fully 3D printed products using a minimal number of printed training and test shapes, and thereby can advance AM as a high-quality manufacturing paradigm.

2. Multiple Imputation and the Bootstrap for the Analysis of Big Data with Missingness

Inference can be a challenging task for Big Data. Two significant issues are that Big Data frequently exhibit complicated missing data patterns, and that the complex statistical models and machine learning algorithms typically used to analyze Big Data do not have convenient quantification of uncertainties for estimators. These two difficulties have previously been addressed using multiple imputation and the bootstrap, respectively. However, it is not clear how multiple imputation and bootstrap procedures can be effectively combined to perform statistical inferences on Big Data with missing values. We investigate a practical framework for the combination of multiple imputation and bootstrap methods. Our framework is based on two principles: distribution of multiple imputation and bootstrap calculations across parallel computational cores, and quantification of the sources of variability involved in bootstrap procedures that use subsampling techniques via random effects or hierarchical models. This framework effectively extends the scope of existing methods for multiple imputation and the bootstrap to a broad range of Big Data settings. We perform simulation studies for linear and logistic regression across Big Data settings with different rates of missingness to characterize the frequentist properties and computational efficiencies of the combinations of multiple imputation and the bootstrap. We further illustrate how effective combinations of multiple imputation and the bootstrap for Big Data analyses can be identified in practice by means of both the simulation studies and a case study on COVID infection status data. Ultimately, our investigation demonstrates how the flexible combination of multiple imputation and the bootstrap under our framework can enable valid statistical inferences in an effective manner for Big Data with missingness.
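As a minimal illustration of the closed-loop compensation idea in part 1 (not the BELM model itself), the sketch below feeds each measured deviation back into the next build's input, under a hypothetical linear machine-deviation model:

```python
def closed_loop_compensate(nominal, print_once, spec, max_iters=5):
    """Iteratively compensate one dimension: build, measure the deviation
    from nominal, and subtract it from the input on the next attempt.
    `print_once` stands in for the AM machine plus measurement and
    returns the realized size; `spec` is the allowed absolute deviation."""
    target = nominal
    for i in range(1, max_iters + 1):
        realized = print_once(target)
        deviation = realized - nominal
        if abs(deviation) <= spec:
            return i, target, realized
        target -= deviation  # feed the measured error back into the plan
    return None

# Toy deviation model: the machine builds 2% oversize plus a 0.05 mm offset.
machine = lambda x: 1.02 * x + 0.05
result = closed_loop_compensate(50.0, machine, spec=0.02)
print(result)
```

Because the loop re-measures after every compensated build, it converges even though the deviation model is never written down explicitly, which mirrors the one-shape-at-a-time resource constraint described above.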
5

Predictive Quality Analytics

Salim A Semssar (11823407) 03 January 2022
Quality drives customer satisfaction, improved business performance, and safer products. Reducing waste and variation is critical to the financial success of organizations. Today, Lean and Six Sigma are commonly used as the two main strategies for improving quality. As advancements in information technologies enable the use of big data, defect reduction and continuous improvement philosophies will benefit and even prosper. Predictive Quality Analytics (PQA) is a framework in which risk assessment and machine learning can help detect anomalies across the entire ecosystem, not just in the manufacturing facility. PQA serves as an early warning system that directs resources to where help and mitigation actions are most needed. In a world where limited resources are the norm, focused action on the significant few defect drivers can be the difference between success and failure.
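As a minimal stand-in for the early-warning idea, the sketch below flags observations that fall outside control limits derived from a stable baseline; the defect-rate data and the 3-sigma rule are illustrative assumptions, not the PQA framework's actual risk models:

```python
from statistics import mean, stdev

def early_warnings(history, recent, k=3.0):
    """Flag recent observations lying outside mean +/- k*stdev of a
    stable baseline period. Returns (index, value) pairs for alerts."""
    center, spread = mean(history), stdev(history)
    lo, hi = center - k * spread, center + k * spread
    return [(i, x) for i, x in enumerate(recent) if not (lo <= x <= hi)]

baseline = [2.1, 1.9, 2.0, 2.2, 1.8, 2.0, 2.1, 1.9]  # stable defect rate (%)
alerts = early_warnings(baseline, [2.0, 2.3, 4.5, 1.9])
print(alerts)  # -> [(2, 4.5)]
```

In the PQA spirit, such alerts would be scored by risk and used to direct resources to the significant few defect drivers first.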
6

DEVELOPMENT AND VALIDATION OF A VERSATILE AND INNOVATIVE TOOL TO ASSESS AND BENCHMARK SUSTAINABILITY PERFORMANCE OF ORGANIZATIONS AND SUPPLY CHAINS

Cagatay Tasdemir (6580142) 10 June 2019
Global trends and factors, such as increased globalization, climate change, resource scarcity, awareness of social and environmental responsibilities, and fiercer competition with lower profit margins in all industries, force organizations to act to retain, regain, or sustain their competitive advantages for long-term survival. These trends and factors are historically known to bring about the innovations that drive the evolution of industries. Sustainability is considered such an innovation for achieving fiscally sound, environmentally conscious, and socially progressive organizations and supply chains. Sustainable development and sustainability are among the trending topics of the 21st century, and the elevated sustainability concerns of various stakeholders have been forcing members of all industries to evolve into more environmentally and socially responsible versions of themselves. This study was initiated with a comprehensive literature review phase covering 477 articles published in five major databases from 1990 to 2018. The purpose of this review was to assess the current state of the art on the subject of lean-driven sustainability. Based on descriptive and contextual analysis, the synergies, divergences, and extent of two-way permeability of the lean and sustainability concepts from the perspective of intra- and inter-organizational operations were identified, along with future research opportunities. The review highlights the fundamental strengths and weaknesses of both concepts, existing strong synergies and untapped potential with their key contributors, and potential use cases of lean tools for deriving sustainable solutions. Next, based on the findings of the systematic literature review, an innovative, holistic, versatile, and scalable tool was developed to assess and benchmark the sustainability performance of organizations and supply chains.
The proposed framework was established upon the trivet structure of the Triple Bottom Line philosophy and fueled by Lean, Six Sigma, and Life Cycle Assessment (LCA) methodologies for accurate and effective measurement of sustainability performance. Completeness of the framework was ensured through the development of a first-generation Key Performance Indicator (KPI) pool with 33 indicators, a unique work-environment assessment mechanism covering safety and environmental protection issues across 11 risk categories, and the construction of an ownership structure for ease of framework deployment. The proposed framework is expected to support true sustainability performance improvement and benchmarking objectives at a range of business levels, from facility to sectoral operations. Upon completion of the development phase, the Sustainability Benchmarking Tool (SBT) Framework was validated at the facility level within the context of value-added wood products manufacturing. Strengths and weaknesses of the system were identified within the scope of the Bronze Frontier maturity level of the framework and tackled through a six-step analytical and quantitative reasoning methodology. The secondary objective of the validation phase was to document how value-added wood products industries can take advantage of the natural properties of wood to become frontiers of sustainability innovation. In the end, the true sustainability performance of the target facility was improved by 2.37 base points, while economic and environmental performance rose from being system weaknesses to an acceptable index score benchmark of 8.41 and a system strength level of 9.31, respectively. The social sustainability score increased by 2.02 base points as a function of a better gender-bias ratio. The financial performance of the system improved from a 33% loss to a 46.23% profit in the post-improvement state.
Reductions in CO₂ emissions (55.16%), energy consumption (50.31%), solid waste generation (72.03%), non-value-added time (89.30%), and cost (64.77%) were other significant achievements of the study. The SBT Framework was thus successfully validated at the facility level, and the target facility evolved into a leaner, cleaner, and more responsible version of itself. Furthermore, manufacturing industries of all sorts are key stakeholders that rely on universities to satisfy the demand for a competent workforce, and society expects universities to educate youth and contribute to their self-development by achieving both scientific and intellectual knowledge saturation. To expand the contribution of the study to the body of knowledge in the fields of sustainability and modern management techniques, an undergraduate-level course curriculum was developed that integrates modern management techniques and sustainability concepts with wood products industry dynamics. Students' pre- and post-education awareness of, and familiarity with, sustainability, the potential consequences of ignored sustainability issues, modern management techniques, global trends, innovation waves, and industry evolution were compared through a seventeen-question survey. Results showed that the course content succeeded in increasing sustainability awareness both overall and at the level of individual sustainability pillars. By the end, 100% of students developed a complete understanding of the various modern management techniques and stated that they felt confident applying the learned skills to real-life issues in their profession after graduation. Overall, this study empirically documented how the synergies among the Lean, Sustainability, Six Sigma, and Life Cycle Assessment concepts outweigh their divergences, demonstrated the viability of the SBT Framework, and presented a proven example of a transdisciplinary sustainability curriculum powered by modern management techniques.
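A weighted KPI index of the kind the SBT Framework reports on a 0-10 scale (e.g., the 8.41 benchmark score) can be sketched as a weighted average; the KPI values and weights below are hypothetical, not the framework's actual 33-indicator pool:

```python
def pillar_index(kpi_scores, weights=None):
    """Weighted-average index for one sustainability pillar on a 0-10
    scale; equal weights are used when none are given."""
    if weights is None:
        weights = [1.0] * len(kpi_scores)
    total = sum(weights)
    return sum(s * w for s, w in zip(kpi_scores, weights)) / total

# Hypothetical environmental-pillar KPIs, with the first weighted double.
env = pillar_index([8.0, 9.5, 7.5, 9.0], [2.0, 1.0, 1.0, 1.0])
print(round(env, 2))  # -> 8.4
```

Rolling pillar indices up across the three Triple Bottom Line pillars gives a single facility-level score that can be tracked before and after improvement, as in the validation case study.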
7

Automation of the Quality Control Process with the Use of Robotics and a Coordinate Measuring Machine

Alexander G Hoang (16677327) 02 August 2023
The purpose of this research was to explore and implement a cost-effective automation solution in a low-volume production line for loading parts onto a coordinate measuring machine (CMM) for dimensional inspection. Quality control has historically been separated from the production process, with inspection routines performed in a controlled lab. The system demonstrated an in-process automation of the quality control process that is feasible for small and mid-sized manufacturing companies to implement. In the process, an APSX horizontal injection molding machine dispensed parts onto a conveyor belt. The conveyor belt was controlled by a Phoenix Contact PLC and two line sensors that provided two stopping points for cooldown before inspection. A MyCobot 320-M5 robotic arm selected each part off the line and placed it into a fixture on a Hexagon CMM.
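The cell's sequence of operations can be sketched as a simple state machine; the state names and event handling are illustrative assumptions, not the actual PLC program:

```python
# States of the hypothetical in-process inspection cell described above:
# mold -> conveyor (two cooldown stops) -> robot pick -> CMM inspection.
STATES = ["MOLDING", "COOLDOWN_1", "COOLDOWN_2",
          "ROBOT_PICK", "CMM_INSPECT", "DONE"]

def run_cycle(sensor_events):
    """Advance the cell one state per sensor/PLC event and return the
    visited-state log; a simplified stand-in for the ladder logic."""
    state = STATES[0]
    log = [state]
    for _ in sensor_events:
        nxt = STATES.index(state) + 1
        if nxt >= len(STATES):
            break
        state = STATES[nxt]
        log.append(state)
    return log

cycle = run_cycle(["part_ejected", "line_sensor_1", "line_sensor_2",
                   "pick_complete", "inspection_done"])
print(cycle)
```

Modeling the cell this way makes it easy to check, before commissioning, that every event sequence ends with the part fixtured and inspected.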
8

Development and Evaluation of a Machine Vision System for Digital Thread Data Traceability in a Manufacturing Assembly Environment

Alexander W Meredith (15305698) 29 April 2023
A thesis study investigating the development and evaluation of a computer vision (CV) system for a manufacturing assembly task is reported. The CV inference results are compared against a Manufacturing Process Plan, and an automation method completes a buyoff in the software Solumina. Research questions were formulated and three hypotheses were tested. A literature review found little consensus on Industry 4.0 technology adoption in manufacturing industries and uncovered the need for additional research on CV, specifically regarding its cognitive capabilities in manufacturing. A CV system was developed and evaluated to test for 90% or greater confidence in part detection; a CV dataset was developed, and the system was trained and validated on it. Dataset contextualization was leveraged and evaluated, as suggested in the literature. The CV system was trained on custom datasets containing six classes of parts. The pre-contextualization and post-contextualization datasets were compared using a two-sample t-test, and statistical significance was noted for three classes. A Python script was developed to compare as-assembled locations of components with their as-defined positions per the Manufacturing Process Plan. A comparison-of-yields test between CV-based true positives (TPs) and human-based TPs was conducted with the system operating at a 2σ level. An automation method using Microsoft Power Automate was developed to complete the cognitive functionality of the CV system by completing a buyoff in Solumina whenever CV-based TPs were equal to or greater than human-based TPs.
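The pre- vs. post-contextualization comparison relies on a two-sample t-test; below is a minimal sketch of Welch's t statistic (one common form of the test, since the thesis does not publish its exact inputs), with hypothetical per-image confidence scores:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic (unequal variances assumed):
    (mean_a - mean_b) / sqrt(var_a/n_a + var_b/n_b)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    return (mean(sample_a) - mean(sample_b)) / sqrt(va / na + vb / nb)

# Hypothetical detection-confidence scores for one part class,
# before and after dataset contextualization.
pre  = [0.71, 0.68, 0.74, 0.70, 0.69, 0.72]
post = [0.88, 0.91, 0.86, 0.90, 0.89, 0.87]
t = welch_t(post, pre)
print(round(t, 2))  # -> 15.29
```

A t statistic this large against the appropriate degrees of freedom would indicate the kind of statistically significant per-class improvement the thesis reports for three of its six classes.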
