  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
161

Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

Davenport, David Alan 07 August 2013 (has links)
No description available.
162

Non-contact Methods for Detecting Hot-mix Asphalt Nonuniformity

de León Izeppi, Edgar 06 November 2006 (has links)
Segregation, or non-uniformity, in Hot Mix Asphalt (HMA) induces accelerated pavement distress that can reduce a pavement's service life by up to 50%. Quality assurance procedures should detect and quantify the presence of this problem in newly constructed pavements. Current practices are usually based on visual inspections that identify non-uniform surface texture areas. An automatic process that reduces subjectivity would improve the quality-assurance procedures for HMA pavements. Virginia has undertaken a focused research effort to improve the uniformity of HMA pavements. A method using a dynamic (laser-based) surface macrotexture instrument showed great promise, but it may miss significant segregated areas because it measures only very thin longitudinal lines. The main objective of this research is to develop a non-contact system for detecting segregated HMA areas and identifying their locations along a road for HMA quality assurance purposes. The developed system uses relatively low-cost components and innovative image processing and analysis software. It computes the gray-level co-occurrence matrix (GLCM) of images of newly constructed pavements to find various parameters that are commonly used in visual texture analysis. Using principal component analysis to integrate the multivariable data into a single classifier, Hotelling's T2 statistic, the system then creates a list of the locations of possible nonuniformities that require closer inspection. Field evaluations of the system at the Virginia Smart Road proved that it is capable of discriminating between different pavement surfaces. Verification of the system was conducted through a series of field tests to evaluate the uniformity of newly constructed pavements. A total of 18 continuous road segments of recently paved roads were tested and analyzed with the system.
Tables and plots to be used by inspection personnel in the field were developed. The results of these field tests confirmed the capability of the system to detect potential nonuniformities of recently completed pavements. The system proved its potential as a useful tool in the final inspection process. / Ph. D.
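The GLCM-plus-Hotelling's-T2 pipeline described above can be illustrated with a rough sketch. The tile size, gray-level quantization, GLCM offset, feature set, and 95th-percentile flagging threshold below are all hypothetical choices for illustration, not the dissertation's actual parameters:

```python
import numpy as np

def glcm_features(tile, levels=8, offset=(0, 1)):
    """Gray-level co-occurrence matrix features for one pavement image tile."""
    # Quantize the 8-bit tile to a small number of gray levels.
    q = (tile.astype(float) / 256 * levels).astype(int).clip(0, levels - 1)
    glcm = np.zeros((levels, levels))
    dr, dc = offset
    a = q[:q.shape[0] - dr, :q.shape[1] - dc]   # reference pixels
    b = q[dr:, dc:]                              # neighbor pixels at the offset
    np.add.at(glcm, (a.ravel(), b.ravel()), 1)   # count co-occurrences
    glcm /= glcm.sum()                           # normalize to probabilities
    i, j = np.indices(glcm.shape)
    contrast = np.sum(glcm * (i - j) ** 2)
    energy = np.sum(glcm ** 2)
    homogeneity = np.sum(glcm / (1 + np.abs(i - j)))
    return np.array([contrast, energy, homogeneity])

def hotelling_t2(features):
    """Hotelling's T^2 of each feature vector against the sample mean."""
    mu = features.mean(axis=0)
    inv = np.linalg.inv(np.cov(features, rowvar=False))
    d = features - mu
    return np.einsum('ij,jk,ik->i', d, inv, d)

rng = np.random.default_rng(0)
tiles = [rng.integers(0, 256, (64, 64)) for _ in range(30)]  # stand-in imagery
feats = np.array([glcm_features(t) for t in tiles])
t2 = hotelling_t2(feats)
# Tiles with unusually large T^2 are listed for closer inspection.
flagged = np.where(t2 > np.percentile(t2, 95))[0]
```

In the actual system, principal component analysis would first reduce the multivariable GLCM data before the T2 statistic is computed; here the statistic is applied to the raw features for brevity.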
163

Optical accuracy assessment of robotically assisted dental implant surgery

Klass, Dmitriy, D.D.S. 11 August 2022 (has links)
BACKGROUND: Static and dynamic dental implant guidance systems have established themselves as effective choices that result in predictable and relatively accurate dental implant placement. Generally, studies assess this accuracy using a postoperative CBCT, which has disadvantages such as additional radiation exposure for the patient. This pilot study proposed a scanbody-agnostic method of implant position assessment using intraoral scanning technology and applied it as an accuracy test of robotically assisted dental implant placement using the Neocis Yomi. MATERIALS AND METHODS: All of the robotically assisted dental implant surgery was performed in the Postdoctoral Periodontology clinic at Boston University Henry M. Goldman School of Dental Medicine. Completely edentulous patients were excluded. A total of eleven (11) implants were included in the study, eight (8) of which were fully guided. An optical impression of each implant position was obtained using a CEREC Omnicam (SW 5.1) intraoral scanner. Each sample used either a DESS Lab Scan Body or an Elos Accurate Scan Body as a means to indirectly index the position of the implant. A comparison of planned implant position versus executed surgical implant position was performed for each placement using Geomagic Control X software. Global positional and angular deviations were quantified using a proposed scanbody-agnostic method. Intraoral directionality of deviation was visually qualified by the author (D.K.). RESULTS: Mean global positional deviations at the midpoints of the top of each scanbody were 1.7417 mm in the partially guided samples and 1.1300 mm in the fully guided samples. Mean global positional deviations at the midpoints of the restorative platforms of each implant were 1.3142 mm in the partially guided samples and 1.27045 mm in the fully guided samples.
Mean global positional deviations at the midpoints of the apex of each implant were 1.455 mm in the partially guided samples and 1.574 mm in the fully guided samples. Mean angular deviations were 3.7492 degrees in the partially guided samples and 2.6432 degrees in the fully guided samples. CONCLUSION: Within the limitations of the sample size, robotically assisted dental implant surgery offers implant placement accuracy similar to that of published static and dynamic implant placement guidance systems. The intraoral optical assessment of dental implant position used in this study allows analysis comparable to other methods without requiring additional radiation exposure and should be considered the default method of assessing guidance accuracy.
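The deviation metrics reported above (positional deviation at platform and apex, and angular deviation) reduce to simple vector geometry once the planned and placed implants are registered in a common coordinate frame. The comparison in the study was performed in Geomagic Control X; the sketch below only illustrates the underlying math, and the point coordinates are invented:

```python
import numpy as np

def implant_deviation(planned_platform, planned_apex, actual_platform, actual_apex):
    """Positional (mm) and angular (degrees) deviation between planned and
    placed implant axes, each defined by a platform point and an apex point."""
    platform_dev = np.linalg.norm(actual_platform - planned_platform)
    apex_dev = np.linalg.norm(actual_apex - planned_apex)
    v_planned = planned_apex - planned_platform
    v_actual = actual_apex - actual_platform
    cos_a = np.dot(v_planned, v_actual) / (
        np.linalg.norm(v_planned) * np.linalg.norm(v_actual))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return platform_dev, apex_dev, angle

# Hypothetical coordinates (mm) for a 10 mm implant planned along the z-axis.
planned_p = np.array([0.0, 0.0, 0.0])
planned_a = np.array([0.0, 0.0, -10.0])
actual_p = np.array([0.5, 0.3, 0.1])
actual_a = np.array([0.9, 0.5, -9.8])
p_dev, a_dev, ang = implant_deviation(planned_p, planned_a, actual_p, actual_a)
```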
164

Increasing Program Effectiveness Through use of Principles of Andragogy in Tennessee Beef Quality Assurance Programs

McCormick, Lisa Ellis 07 July 2023 (has links)
Tennessee Beef Quality Assurance (BQA) programs teach beef producers the importance of quality within the beef industry. BQA programs assure consumers of the quality and safety of supplied beef, as well as the environmental orientation of farm production practices (Tsakiridis et al., 2021). Any active BQA certificate holder in Tennessee can apply for the Tennessee Agricultural Enhancement Program (TAEP), which significantly benefits both farmers and the economy. The TAEP is a cost-share system that has provided over $106 million in funding to more than thirty-seven thousand programs in the agricultural community statewide (Farm Bureau, Tennessee 2019 Resolutions, 2019). The cost-share program helps farmers begin projects that would not otherwise be financially feasible (Menard et al., 2019). The BQA program is an educational program taught through Cooperative Extension efforts and is aimed predominantly at adult beef cattle producers. Andragogy, also known as adult learning theory, was created by Malcolm Knowles to effectively teach adults. In this study, qualitative and quantitative methods were used to identify how andragogy is being used in Tennessee BQA programs. The results showed that Extension agents followed the seven-step andragogical design process and that BQA participants exhibited the six andragogical principles. Recommendations for future research were identified: adapt the Andragogy in Practice Inventory for instructors, conduct a study that addresses counties with smaller participation, and conduct studies with county agents in early career stages. Recommendations for the Tennessee BQA program are to offer trainings for Extension agents on the andragogical process and to reevaluate the requirement for additional programs.
/ Master of Science in Life Sciences / Since BQA was established in 1987 by the Beef Checkoff, trainings across 47 states have been implemented to provide beef producers with the tools and training necessary to assure animal health and well-being. The program is an educational program typically taught through Extension education, which was established by the Smith-Lever Act of 1914 for the educational outreach of Land-Grant institutions and the growth of rural areas across the United States. This study aimed to identify how adult learning theory, andragogy, is used in Tennessee BQA programs and to make appropriate recommendations to ensure program effectiveness. The study is important for identifying educational effectiveness in the BQA program and for ensuring program participants implement program objectives, fulfilling the goals and purposes of the BQA program.
165

Golden Opportunities for White Collar Productivity Improvements in Quality Assurance

Algee, Jane M. 01 January 1985 (has links) (PDF)
The efficient processing of defective or nonconforming hardware and paperwork is important to both defense contractors and the government. Management's concern about excessive costs in this area initiated an investigation into the actual activities, personnel, and computer systems involved in such processing. Applicable military specifications and an assortment of corporate and divisional procedures were reviewed to obtain baseline data. Additional information was sought through personal interviews and visits to the manufacturing areas. The activity flow was documented in block diagrams, and time estimates and labor requirements were applied. The detailed labor estimates were input to a Lotus 1-2-3 spreadsheet and used to determine average labor cost per disposition type, i.e., rework, scrap, return-to-vendor, or repair. The spreadsheet facilitates quick cost analysis of proposed management changes to the procedure and system. The estimates were merged with the actual distribution of dispositions in an expected-cost probability network to identify high-cost areas and potential savings. Suggested improvements are evaluated by using the expected-cost network and the electronic spreadsheet. Library research on recent publications from industry and academe provided further information in an area rich with potential savings: the white collar worker and quality assurance.
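The expected-cost calculation that the spreadsheet and probability network support can be illustrated with a minimal sketch; all dollar figures and disposition shares below are hypothetical, not the study's data:

```python
# Expected labor cost per nonconformance, weighted by the disposition mix.
labor_cost = {            # hypothetical average labor cost per disposition ($)
    "rework": 180.0,
    "scrap": 60.0,
    "return_to_vendor": 95.0,
    "repair": 240.0,
}
disposition_mix = {       # hypothetical observed share of each disposition
    "rework": 0.40,
    "scrap": 0.25,
    "return_to_vendor": 0.20,
    "repair": 0.15,
}

# Expected cost is the probability-weighted sum over disposition types.
expected_cost = sum(labor_cost[d] * disposition_mix[d] for d in labor_cost)

# Ranking dispositions by expected-cost contribution identifies the
# high-cost areas where improvements yield the largest potential savings.
contribution = sorted(
    ((labor_cost[d] * disposition_mix[d], d) for d in labor_cost), reverse=True)
```

Re-running the calculation with a proposed change to a cost or mix value gives the quick what-if analysis the abstract attributes to the spreadsheet.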
166

Severity of Non-Normality in Pavement Quality Assurance Acceptance Quality Characteristics Data and the Adverse Effects on Acceptance and Pay

Uddin, Mohammad M., Goodrum, Paul M., Mahboub, Kamyar C. 01 January 2011 (has links)
Nonnormality in the form of skewness and kurtosis was examined in lot acceptance quality characteristics data from seven state highway agencies for their highway construction quality assurance programs. Lot skewness and kurtosis varied significantly. For most lot data sets, skewness values varied in the range of 0.0 ± 1.0, whereas most kurtosis values varied in the range of 0.0 ± 2.0. The analysis also reveals that, on average, 50% of lot test data sets were nonnormal, with 15% of lot data sets having skewness greater than ±1.0 and kurtosis greater than ±2.0. This is a significant finding because most state transportation agencies' pay factor algorithms assume normally distributed lots. Further investigation showed that high skewness and kurtosis were associated with higher lot variability. This variability produced misleading results in the form of inflated Type I error and low power for the F-test. However, the t-test was found to be quite robust for distinguishing mean differences. Significant deviation was observed in lot pay factors based on percent within limits between assumed normal data and normalized data. Effects of nonnormal distribution on the lot pay factor were found to vary on the basis of the specification limits, the distribution of defective materials in the tails in the case of two-sided limits, and the orientation of the nonnormal distribution itself.
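The lot-level screening described above, checking skewness and kurtosis against the ±1.0 and ±2.0 thresholds and comparing an assumed-normal percent-within-limits against the empirical value, might be sketched as follows; the simulated lot, specification limits, and distribution parameters are illustrative only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical skewed lot test data (e.g., a density or thickness measure).
lot = rng.lognormal(mean=1.5, sigma=0.3, size=50)

skew = stats.skew(lot)
kurt = stats.kurtosis(lot)            # excess kurtosis: 0 for a normal lot
nonnormal = abs(skew) > 1.0 or abs(kurt) > 2.0

# Percent within limits under the (possibly wrong) normality assumption.
lsl, usl = 2.5, 7.5                   # hypothetical specification limits
mu, sd = lot.mean(), lot.std(ddof=1)
pwl_assumed = 100 * (stats.norm.cdf(usl, mu, sd) - stats.norm.cdf(lsl, mu, sd))

# Empirical percent within limits; a skewed lot can shift this noticeably,
# which is what distorts pay factors computed under the normal assumption.
pwl_empirical = 100 * np.mean((lot >= lsl) & (lot <= usl))
```

The gap between `pwl_assumed` and `pwl_empirical` is the kind of deviation the paper observed in percent-within-limits pay factors for nonnormal lots.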
167

Correlating nano-scale surface replication accuracy and cavity temperature in micro-injection moulding using in-line process control and high-speed thermal imaging

Baruffi, F., Gülçür, Mert, Calaon, M., Romano, J.-M., Penchev, P., Dimov, S., Whiteside, Benjamin R., Tosello, G. 22 October 2019 (has links)
Micro-injection moulding (μIM) stands out as a preferable technology for enabling the mass production of polymeric components with micro- and nano-structured surfaces. One of the major challenges of these processes relates to the quality assurance of the manufactured surfaces: the time needed to perform accurate 3D surface acquisitions is typically much longer than a single moulding cycle, making it impossible to integrate in-line measurements into the process chain. In this work, the authors propose a novel solution to this problem by defining a process monitoring strategy aimed at linking sensitive in-line monitored process variables with the replication quality. A nano-structured surface for antibacterial applications was manufactured on a metal insert by laser structuring and replicated using two different polymers, polyoxymethylene (POM) and polycarbonate (PC). The replication accuracy was determined using a laser scanning confocal microscope, and its dependence on the variation of the main μIM parameters was studied using a Design of Experiments (DoE) approach. During each process cycle, the temperature distribution of the polymer inside the cavity was measured using a high-speed infrared camera by means of a sapphire window mounted in the movable plate of the mould. The temperature measurements showed a high level of correlation with the replication performance of the μIM process, thus providing a fast and effective way to control the quality of the moulded surfaces in-line. / MICROMAN project (“Process Fingerprint for Zero-defect Net-shape MICRO MANufacturing”, http://www.microman.mek.dtu.dk/) - H2020 (Project ID: 674801), H2020 agreement No. 766871 (HIMALAIA), H2020 ITN Laser4Fun (agreement No. 675063)
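The core idea above, correlating a fast in-line thermal signal with slow off-line replication measurements so the former can stand in for the latter, can be sketched in a few lines. The temperature and replication values below are invented for illustration and are not the paper's data:

```python
import numpy as np

# Hypothetical per-cycle data: peak cavity temperature (°C) from the infrared
# camera, and measured replication depth (% of nominal feature height) from
# confocal microscopy of the corresponding moulded part.
temperature = np.array([182., 185., 189., 193., 197., 201., 205., 209.])
replication = np.array([61., 64., 70., 74., 79., 83., 86., 90.])

# Pearson correlation between the in-line signal and replication quality.
r = np.corrcoef(temperature, replication)[0, 1]

# A simple linear calibration then predicts replication from temperature
# alone, enabling an in-line quality check on every cycle without waiting
# for a 3D surface acquisition.
slope, intercept = np.polyfit(temperature, replication, 1)
predicted = slope * temperature + intercept
```

A high `r` on calibration data is what justifies using the monitored process variable as a process fingerprint for replication quality.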
168

Advanced Data Analytics for Quality Assurance of Smart Additive Manufacturing

Shen, Bo 07 July 2022 (has links)
Additive manufacturing (AM) is a powerful emerging technology for fabricating components with complex geometries using a variety of materials. Despite this promising potential, the complexity of the process dynamics makes it challenging to ensure the quality and consistency of AM parts efficiently during the process. Therefore, this dissertation aims to develop advanced machine learning methods for online process monitoring and quality assurance of smart additive manufacturing. Driven by edge computing, the Industrial Internet of Things (IIoT), sensors, and other smart technologies, data collection, communication, analytics, and control are infiltrating every aspect of manufacturing. These data provide excellent opportunities to improve and revolutionize manufacturing for both quality and productivity. Despite the massive volume of data generated in a very short time, approximately 90 percent of the data goes wasted or unused. The goal of sensing and data analytics for advanced manufacturing is to capture the full insight that data and analytics can discover to help address the most pressing problems. To achieve this goal, several data-driven approaches have been developed in this dissertation for effective data preprocessing, feature extraction, and inverse design, together with related theories that guarantee their performance. The performance has been validated using sensor data from AM processes. Specifically, four new methodologies are proposed and implemented: 1. To make unqualified thermal data meet the spatial and temporal resolution requirements of microstructure prediction, a super-resolution method for multi-source image stream data using smooth and sparse tensor completion is proposed and applied to data acquisition in additive manufacturing. The qualified thermal data enable the extraction of useful information such as boundary velocity and thermal gradient. 2. To effectively extract features from high-dimensional data with limited samples, a clustered discriminant regression is created for classification problems in healthcare and additive manufacturing. The proposed feature extraction method, together with classic classifiers, can achieve better classification performance than a convolutional neural network for image classification. 3. To extract melt pool information from processed X-ray video of the metal AM process, a smooth sparse robust tensor decomposition model is devised to decompose the data into a static background, a smooth foreground, and noise, respectively. The proposed method exhibits superior performance in extracting melt pool information from X-ray data. 4. To learn the material property for different printing settings, a multi-task Gaussian process upper confidence bound (GP-UCB) method is developed for sequential experiment design, implemented as a no-regret algorithm that learns the optimal material property for each printing setting. By fully utilizing the sensor data with innovative data analytics, the proposed methodologies support interdisciplinary research, promote technical innovation, and achieve balanced theoretical and practical advances. In addition, these methodologies are integrated into a generic framework, so they can easily be extended to other manufacturing processes, systems, or even other application areas such as healthcare systems. / Doctor of Philosophy / Additive manufacturing (AM) technology is rapidly changing the industry, and data from various sensors and simulation software can further improve AM product quality. The objective of this dissertation is to develop methodologies for process monitoring and quality assurance using advanced data analytics.
In this dissertation, four new methodologies are developed to address the problems of unqualified data, high dimensional data with limited samples, and inverse design. Related theories are also studied to identify the conditions by which the performance of the developed methodologies can be guaranteed. To validate the effectiveness and efficiency of proposed methodologies, various data sets from sensors and simulation software are used for testing and validation. The results demonstrate that the proposed methods are promising for different AM applications. The future applications of the accomplished work in this dissertation are not just limited to AM. The developed methodologies can be easily transferred for applications in other domains such as healthcare, computer vision, etc.
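As a rough illustration of the fourth methodology above, a Gaussian process upper confidence bound loop for sequential experiment design is sketched below in a simplified single-task form (the dissertation's version is multi-task). The RBF kernel, its length scale, the exploration weight, and the toy objective standing in for "material property versus printing setting" are all assumptions:

```python
import numpy as np

def rbf(a, b, length=0.2):
    """RBF kernel matrix between two 1-D arrays of inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-4):
    """GP posterior mean and standard deviation at the test inputs."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    alpha = np.linalg.solve(K, y_train)
    mu = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = np.diag(rbf(x_test, x_test)) - np.sum(Ks * v, axis=0)
    return mu, np.sqrt(np.clip(var, 0.0, None))

def objective(x):
    """Hypothetical material property as a function of a printing setting."""
    return -(x - 0.6) ** 2

candidates = np.linspace(0.0, 1.0, 101)   # candidate printing settings
x_obs = [0.1, 0.9]                        # two initial experiments
y_obs = [objective(0.1), objective(0.9)]
for _ in range(10):
    mu, sd = gp_posterior(np.array(x_obs), np.array(y_obs), candidates)
    beta = 2.0                            # exploration weight
    x_next = candidates[np.argmax(mu + beta * sd)]   # UCB acquisition
    x_obs.append(x_next)
    y_obs.append(objective(x_next))
best = x_obs[int(np.argmax(y_obs))]
```

Each iteration picks the setting with the highest upper confidence bound, balancing exploiting a high posterior mean against exploring high posterior uncertainty; the no-regret property of GP-UCB is what the dissertation's theory establishes for its multi-task variant.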
169

Compressive Sensing Approaches for Sensor based Predictive Analytics in Manufacturing and Service Systems

Bastani, Kaveh 14 March 2016 (has links)
Recent advancements in sensing technologies offer new opportunities for quality improvement and assurance in manufacturing and service systems. These sensor advances provide a vast amount of data, supporting quality improvement decisions such as fault diagnosis (root cause analysis) and real-time process monitoring. Such decisions are typically made based on predictive analysis of the sensor data, so-called sensor-based predictive analytics, which encompasses a variety of statistical, machine learning, and data mining techniques for identifying patterns between the sensor data and historical facts. Given these patterns, predictions are made about the quality state of the process, and corrective actions are taken accordingly. Although the recent advances in sensing technologies have facilitated quality improvement decisions, they typically produce high-dimensional sensor data, making sensor-based predictive analytics computationally intensive and therefore challenging. This research begins in Chapter 1 by raising a question: are all these sensor data required for making effective quality improvement decisions, and if not, is there a way to systematically reduce the number of sensors without affecting the performance of the predictive analytics? Chapter 2 addresses this question by reviewing related research in signal processing, namely compressive sensing (CS), a novel sampling paradigm in contrast to the traditional sampling strategy following the Shannon-Nyquist rate. By CS theory, a signal can be reconstructed from a reduced number of samples, which motivates developing CS-based approaches to facilitate predictive analytics using a reduced number of sensors.
The proposed research methodology in this dissertation encompasses CS approaches developed to deliver two major contributions: (1) CS sensing to reduce the number of sensors while capturing the most relevant information, and (2) CS predictive analytics to conduct predictive analysis on the reduced sensor data. The proposed methodology has a generic framework that can be utilized in numerous real-world applications; for brevity, its validity has been verified with real sensor data from multi-station assembly processes (Chapters 3 and 4), additive manufacturing (Chapter 5), and wearable sensing systems (Chapter 6). Chapter 7 summarizes the contributions of the research and outlines potential future research directions with applications to big data analytics. / Ph. D.
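The CS premise above, that a sparse signal can be reconstructed from far fewer measurements than the Shannon-Nyquist rate suggests, can be sketched with a standard recovery algorithm. Orthogonal matching pursuit is used here as a generic illustration of CS recovery, not necessarily the method developed in the dissertation, and the problem sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(7)
n, m, k = 128, 40, 4          # signal length, measurements, sparsity

# A k-sparse signal: only k of the n entries carry information.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(0, 1, size=k)

# Random Gaussian sensing matrix: m << n linear measurements of x.
A = rng.normal(0, 1 / np.sqrt(m), size=(m, n))
y = A @ x

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = A @ x."""
    residual, idx = y.copy(), []
    for _ in range(k):
        # Greedily pick the column most correlated with the residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        idx.append(j)
        # Least-squares fit on the selected columns, then update residual.
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
```

When the number of measurements m is sufficiently large relative to the sparsity k, recovery of this kind is typically exact up to numerical precision, which is the property that lets CS sensing replace many physical sensors with a few well-chosen linear measurements.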
170

Quality system implementation in Hong Kong industries.

January 1996 (has links)
by Wong, Tony Ton. / Thesis (M.B.A.)--Chinese University of Hong Kong, 1996. / Includes bibliographical references (leaves 78-80). / Chapter CHAPTER I --- INTRODUCTION --- p.1 / Chapter CHAPTER II --- PROJECT BACKGROUND --- p.3 / BRIEF HISTORY OF QUALITY MANAGEMENT IN INDUSTRIALIZED NATIONS --- p.5 / PHILOSOPHIES OF QUALITY MANAGEMENT --- p.6 / BRIEF HISTORY OF MANUFACTURING IN HONG KONG --- p.8 / SOCIO-ECONOMIC SIGNIFICANCE OF THE MANUFACTURING SECTOR --- p.9 / TRENDS AND OPPORTUNITIES --- p.10 / HISTORICAL ACCOUNT OF QUALITY MANAGEMENT IN THE MANUFACTURING SECTOR --- p.13 / Chapter CHAPTER III --- METHODOLOGY --- p.18 / SELECTION OF SURVEY TARGETS --- p.18 / QUESTIONNAIRE DESIGN --- p.20 / COMPANY PROFILES --- p.20 / IMPLEMENTATION STAGES AND COMMON TECHNIQUES --- p.21 / ACQUISITION OF QUALITATIVE DATA --- p.22 / CONDUCTING THE SURVEY --- p.22 / Chapter CHAPTER IV --- SURVEY RESULT ANALYSIS --- p.26 / COMPANY PROFILES --- p.27 / INDUSTRY OF THE SURVEY PARTICIPANTS --- p.27 / LOCATIONS OF MANUFACTURING FACILITIES --- p.30 / PRODUCTION PROCESSES --- p.31 / EMPLOYEE SIZE --- p.32 / EDUCATION LEVELS OF EMPLOYEES --- p.34 / IMPORTANCE OF WORKER SKILL LEVELS --- p.34 / IMPORTANCE OF STANDARD PROCEDURE --- p.35 / CRITICAL EXTERNAL FACTORS --- p.36 / INTRA-COMPANY COMMUNICATION --- p.37 / QUALITY MANAGEMENT SYSTEM IMPLEMENTATION AND TECHNIQUES --- p.39 / DRIVING FORCE FOR QUALITY IMPROVEMENT --- p.39 / MANAGEMENT STYLE --- p.40 / MEDIUM TERM CORPORATE OBJECTIVES --- p.42 / HISTORY OF FORMAL QUALITY MANAGEMENT PROGRAM --- p.43 / QUALITY IMPROVEMENT METHODOLOGIES --- p.44 / APPLICATION OF SQC TECHNIQUES --- p.47 / QUALITY IMPROVEMENT TEAMS --- p.48 / TRAINING FOR QUALITY IMPROVEMENT --- p.51 / MOTIVATION TECHNIQUES --- p.53 / ACCREDITATION ON ISO-9000 SERIES STANDARD --- p.54 / MAJOR OBSTACLES TO QUALITY IMPROVEMENT --- p.56 / STAGES OF QUALITY MANAGEMENT SYSTEM IMPLEMENTATION --- p.58 / MANAGEMENT STYLE AND QUALITY OBJECTIVES --- p.60 / IMPACT OF ISO-9000 ACCREDITATION ON QUALITY PRIORITIES --- p.61 / Chapter CHAPTER V --- CONCLUSIONS AND RECOMMENDATIONS --- p.64 / CONCLUSIONS --- p.64 / RECOMMENDATIONS AND FUTURE WORK --- p.67 / FUTURE WORK --- p.67 / APPENDICES AND OTHER ATTACHMENTS --- p.69 / APPENDIX 1 - COVER LETTER DESIGN --- p.70 / APPENDIX 2 - SAMPLE SURVEY QUESTIONNAIRE --- p.71 / APPENDIX 3 - MEISTER'S TEN LESSONS ON TRAINING --- p.75 / APPENDIX 4 - HYPOTHESIS TEST ON THE DIFFERENCE BETWEEN TWO POPULATION PROPORTIONS --- p.76 / BIBLIOGRAPHY AND REFERENCES --- p.78
