21

Condition Assessment of Cemented Materials Using Ultrasonic Surface Waves

Kirlangic, Ahmet Serhan 10 July 2013 (has links)
Mechanical waves provide information about the stiffness and the condition of a medium; thus, changes in medium conditions can be inferred from changes in wave velocity and attenuation. Non-destructive testing (NDT) methods based on ultrasonic waves are often more economical, practical and faster than destructive testing. Multichannel analysis of surface waves (MASW) is a well-established surface wave method used for determining the shear-wave profile of a layered medium. The MASW test configuration is also applicable to assessing the condition of concrete elements when an appropriate frequency range is used, and both attenuation and dispersion of ultrasonic waves can be evaluated by this technique. In ultrasonic testing, the characterization of a medium requires the precise measurement of its response to ultrasonic pulses to infer the presence of defects and boundary conditions. However, any ultrasonic transducer attached to a surface affects the measured response, especially at high frequencies. Moreover, the ultrasonic transducers available for engineering applications are mostly used to measure wave velocities (travel-time method) and therefore do not have a flat response in the required frequency range. Consequently, in the case of full-waveform methods, the recorded signals should be normalized with respect to the transfer functions of the transducers to obtain the true response of the tested specimen. The main objective of this research is to establish a comprehensive methodology based on surface wave characteristics (velocity, attenuation and dispersion) for condition assessment of cemented materials with irregular defects. To achieve this objective, the MASW test configuration is implemented in the ultrasonic frequency range. The measured signals are subjected to various signal processing techniques to extract accurate information. In addition, a calibration procedure is conducted to determine the frequency response functions (FRF) of the piezoelectric accelerometers outside their nominal frequency range; this calibration is performed using a high-frequency laser vibrometer. The research includes three main studies. The first study introduces the calibration approach to measure the FRFs of the accelerometers outside their flat frequency range. The calibrated accelerometers are then used to perform MASW tests on a cemented-sand medium, and both the original signals and the signals corrected for the FRFs are used to determine the material damping of the medium. Although the damping ratios obtained from different accelerometers are not the same, the values from the corrected signals are found to be closer to the characteristic damping value than those from the uncorrected signals. The second study evaluates the sensitivity of the Rayleigh wave velocity, attenuation coefficient, material damping and dispersion in phase velocity to the amount of damage in a medium. The soft cemented-sand medium is preferred as the test specimen so that defects of well-defined shape can be created in it. The MASW test configuration is implemented on the medium for different cases of defect depth. The recorded signals are processed using different signal processing techniques, including Fourier and wavelet transforms and empirical mode decomposition, to determine the surface wave characteristics accurately. A new index, the 'dispersion index', is introduced to quantify the defect based on the dispersive behaviour. All surface wave characteristics are found capable of reflecting the amount of damage in the test medium, at different sensitivity levels. In the final study, the condition assessment of six lab-scale concrete beams with different void percentages is performed. The beam specimens, containing Styrofoam pellets at different ratios, are tested with ultrasonic and mechanical equipment. The assessment procedure established in the second study with well-defined defects is pursued for the beams with irregular defects. Among the characteristics, attenuation, the P- and R-wave velocities and the dispersion index are found to be promising characteristics for quantifying the defect volume.
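As a rough illustration of the surface-wave dispersion analysis described above (this is not the author's implementation; the receiver spacing, sampling rate and test signal are assumptions), the following Python sketch estimates a phase-velocity dispersion curve from two receiver traces via the cross-power spectrum:

```python
# A minimal sketch, not the author's code: estimate a phase-velocity dispersion curve
# from two receiver traces using the cross-power spectrum. Spacing, sampling rate and
# the synthetic test signal below are illustrative assumptions.
import numpy as np

def phase_velocity_dispersion(s1, s2, dx, fs, fmin, fmax):
    """Phase velocity v(f) = 2*pi*f*dx / phase_lag(f) between two traces dx apart."""
    freqs = np.fft.rfftfreq(len(s1), d=1.0 / fs)
    cross = np.fft.rfft(s1) * np.conj(np.fft.rfft(s2))   # s2 recorded farther from source
    dphi = np.unwrap(np.angle(cross))                     # phase lag accumulated over dx
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band], 2.0 * np.pi * freqs[band] * dx / dphi[band]

# Synthetic, non-dispersive example: a 50 kHz pulse delayed by the travel time over 5 cm.
fs, dx, v_true = 1.0e6, 0.05, 2000.0
t = np.arange(2048) / fs
pulse = np.exp(-((t - 2e-4) / 2e-5) ** 2) * np.sin(2 * np.pi * 5e4 * t)
delayed = np.roll(pulse, int(round(dx / v_true * fs)))    # 25-sample delay
f, v = phase_velocity_dispersion(pulse, delayed, dx, fs, 3e4, 7e4)
print(v.mean())   # close to the assumed 2000 m/s
```

In a real MASW record the phase velocity varies with frequency for a damaged or layered medium; this synthetic, non-dispersive example simply recovers the assumed velocity.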
22

Measurements of conductive film

Samano, Anthony January 2017 (has links)
Printed electronics combines electronics with printing technologies commonly used in the publishing industry, such as screen, inkjet and roll-to-roll printing. The measurement of conductive film, particularly film made from conductive paste, is the main objective of this thesis. A conductive paste consists of conductive filler, adhesive and solvent, and each component affects the electrical and mechanical properties of the finished conductive film. The measurements of conductive film cover three fields of study. The first is the lifetime performance measurement of conductive film using environmental testing: a screen-printed carbon paste, a silver paste and a developmental paste were subjected to environmental testing and third-harmonic measurement. The second is the measurement of AC impedance and DC resistance of conductive ink during cure. While the pastes cured, the AC impedance and DC resistance were monitored; a LabVIEW program was developed to control the AC impedance analyser, the DC resistance ohmmeter and the convection oven. Samples were measured whilst curing at different curing temperatures and for a range of particle loadings, where particle loading is the percentage of conductive filler relative to the rest of the chemicals in the conductive paste. The last field was defect detection using a combination of electromechanical testing, a scanning electron microscope (SEM) and an infrared (IR) imaging technique; printed carbon and silver were mechanically aged by bending the printed structure up to 100 k times. The results from the lifetime performance measurements on the carbon, silver and developmental pastes showed the behaviour of the polymer resin in high-humidity and high-temperature environments, and the increased oxidation rate at elevated temperatures affected the conductive particles of certain pastes. The third-harmonic testing technique was able to detect failures in conductive film in the form of width reduction. The AC impedance measurement could indicate the final resistivity value, but it was affected by the test frequency used while the ink was in its liquid state; a correct test frequency setting gives less noise and a lower impedance value, which is vital in predicting the final cured resistance of the printed paste. The curing temperature affects the final cured resistance value, while the particle loading affects the rate of curing of the conductive film. The electrical measurements on mechanically aged samples showed that the resistance readings of the carbon prints stayed below their initial value, while the resistance of the silver prints increased. SEM images show no visible damage on the surface of the carbon prints after 100 k bend cycles, while physical defects were observed in the silver prints. The infrared measurements on the carbon prints showed an increase in temperature, while the development of heat patches was observed on the silver prints. The difference in the emissivity values of the materials used provides a contrast effect, which plays an important role in detecting defects with the infrared imaging technique. The application of third-harmonic measurement to printed electronics is new to this field; normally, environmental testing is used to determine the lifetime performance of the conductive film, which is effective but requires a great deal of time and effort to produce a result. AC impedance measurement is widely used, and its application can be seen on cured printed electronics.
The application of AC impedance measurement during cure, together with DC resistance measurement, indicated the initial resistivity values. The measurements further showed the effect of different curing temperatures and particle loadings on the AC impedance, and the phase measurement gave additional insight into the degree of cure. Applying the infrared imaging technique to the mechanically aged devices produced a result that DC resistance measurement and SEM imaging failed to detect: DC resistance measurement is normally used as a quality assessment tool, but the tests on mechanically aged products show that it failed to detect the increase in resistance caused by the mechanical ageing.
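As a hedged illustration of how in-cure resistance readings of the kind described above might be used to predict the final cured resistance (the settling model, parameter values and synthetic readings are assumptions, not the thesis instrumentation or data):

```python
# A minimal sketch: predict the final cured resistance of a printed trace from early
# in-cure readings by fitting a settling model R(t) = R_inf + (R0 - R_inf) * exp(-t/tau).
# The synthetic readings stand in for ohmmeter data logged by a LabVIEW-style setup.
import numpy as np
from scipy.optimize import curve_fit

def cure_model(t, r_inf, r0, tau):
    return r_inf + (r0 - r_inf) * np.exp(-t / tau)

t = np.linspace(0, 600, 61)                                  # seconds into the cure
true = cure_model(t, r_inf=2.0, r0=50.0, tau=150.0)          # assumed cure behaviour
readings = true + np.random.default_rng(0).normal(0, 0.3, t.size)

# Fit using only the first five minutes of data, then extrapolate to the cured value.
mask = t <= 300
params, _ = curve_fit(cure_model, t[mask], readings[mask], p0=(1.0, 40.0, 100.0))
print(f"predicted final resistance: {params[0]:.2f} ohm (assumed true value 2.0 ohm)")
```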
23

The microwave properties of tissue and other lossy dielectrics

Alabaster, Clive M. January 2004 (has links)
This thesis describes work on the theoretical modelling and experimental measurement of the complex permittivity of dielectrics. The main focus of the research has been the characterisation of the permittivity of planar and layered samples within the millimetre wave band. The measurement method is based on the free-space measurement of the transmission and reflection coefficients of samples. A novel analytical method of determining the transmission and reflection coefficients as functions of frequency arising from a generalised structure of planar dielectric layers is also described and validated. The analytical method is based on signal flow techniques. The measurement and analytical techniques have been applied in two main areas: firstly, the acquisition of new data on human skin in the band 57 to 100 GHz and secondly, the detection and location of defects in composite materials, for which a band of 90 to 100 GHz was used. Measurements have been made on the complex permittivity of a single sample of excised human skin fixed in formaldehyde. The experimental results have been corrected to account for the fixing process in formaldehyde and are projected to body temperature. This data is, to the best of the author’s knowledge, the first of its kind to be published. Predicted skin permittivity based on various relaxation models varies widely and only partially fits the measured data. The experimental results have been used to determine the parameters of a Cole-Cole function which gives the best fit to the measured data. The measured skin data has also been used to calculate power deposition in skin exposed to millimetre wave radiation. This work concludes that a skin surface temperature rise of only 0.2 °C results from a thirty-second exposure to signals of 100 W/m². Experimental work with fibreglass composite samples has shown that defects such as delaminations, voids, matrix cracks and improper cure result in resolvable differences in the dielectric properties of the samples at 90–100 GHz. The measurement technique is particularly sensitive to the detection of cracks and its spatial resolution is 20 mm or better. Whilst confirming the general conclusions of previously published work, the specific findings of this study are novel.
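For readers unfamiliar with the Cole-Cole function mentioned above, the following sketch evaluates complex permittivity over the 57–100 GHz band; the parameter values are placeholders, not the fitted skin parameters reported in the thesis:

```python
# A minimal sketch of the Cole-Cole relaxation model,
# eps(w) = eps_inf + (eps_s - eps_inf) / (1 + (j*w*tau)**(1 - alpha)).
# The parameter values below are illustrative assumptions, not fitted results.
import numpy as np

def cole_cole(freq_hz, eps_s, eps_inf, tau_s, alpha):
    w = 2 * np.pi * freq_hz
    return eps_inf + (eps_s - eps_inf) / (1 + (1j * w * tau_s) ** (1 - alpha))

freqs = np.linspace(57e9, 100e9, 5)     # the 57-100 GHz band used in the thesis
eps = cole_cole(freqs, eps_s=40.0, eps_inf=4.0, tau_s=7e-12, alpha=0.1)  # assumed values
for f, e in zip(freqs, eps):
    print(f"{f / 1e9:5.1f} GHz   eps' = {e.real:5.2f}   eps'' = {-e.imag:5.2f}")
```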
24

The Microwave properties of tissue and other lossy dielectrics

Alabaster, C M 03 1900 (has links)
This thesis describes work on the theoretical modelling and experimental measurement of the complex permittivity of dielectrics. The main focus of the research has been the characterisation of the permittivity of planar and layered samples within the millimetre wave band. The measurement method is based on the free-space measurement of the transmission and reflection coefficients of samples. A novel analytical method of determining the transmission and reflection coefficients as functions of frequency arising from a generalised structure of planar dielectric layers is also described and validated. The analytical method is based on signal flow techniques. The measurement and analytical techniques have been applied in two main areas: firstly, the acquisition of new data on human skin in the band 57 to 100 GHz and secondly, the detection and location of defects in composite materials, for which a band of 90 to 100 GHz was used. Measurements have been made on the complex permittivity of a single sample of excised human skin fixed in formaldehyde. The experimental results have been corrected to account for the fixing process in formaldehyde and are projected to body temperature. This data is, to the best of the author’s knowledge, the first of its kind to be published. Predicted skin permittivity based on various relaxation models varies widely and only partially fits the measured data. The experimental results have been used to determine the parameters of a Cole-Cole function which gives the best fit to the measured data. The measured skin data has also been used to calculate power deposition in skin exposed to millimetre wave radiation. This work concludes that a skin surface temperature rise of only 0.2 °C results from a thirty-second exposure to signals of 100 W/m². Experimental work with fibreglass composite samples has shown that defects such as delaminations, voids, matrix cracks and improper cure result in resolvable differences in the dielectric properties of the samples at 90–100 GHz. The measurement technique is particularly sensitive to the detection of cracks and its spatial resolution is 20 mm or better. Whilst confirming the general conclusions of previously published work, the specific findings of this study are novel.
25

Validation of Machine Learning and Visualization based Static Code Analysis Technique / Validering av Machine Learning and Visualization bygger statisk kod analysteknik

Mahmood, Waqas, Akhtar, Muhammad Faheem January 2009 (has links)
Software security has always been an afterthought in software development, which results in insecure software. Companies rely on penetration testing for detecting security vulnerabilities in their software; however, incorporating security at an early stage of development reduces cost and overhead. Static code analysis can be applied at the implementation phase of the software development life cycle. Applying machine learning and visualization to static code analysis is a novel idea: the technique can learn patterns using the normalized compression distance (NCD) and classify source code into correct or faulty usage on the basis of training instances. Visualization also helps to classify code fragments according to their associated colors. A prototype called Code Distance Visualizer (CDV) was developed to implement this technique. In order to test the efficiency of this technique, empirical validation is required. In this research we conduct a series of experiments to test its efficiency. We use real-life open source software as our test subjects, and we collected bugs from the corresponding bug-reporting repositories as well as faulty and correct versions of the source code. We train CDV by marking correct and faulty versions of code fragments; on the basis of this training, CDV classifies other code fragments as correct or faulty. We measured its fault detection ratio, false negative ratio and false positive ratio. The outcome shows that this technique is efficient in defect detection and has a low number of false alarms.
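As a minimal sketch of the normalized compression distance idea underlying CDV (the nearest-fragment classification and the tiny training set below are illustrative simplifications, not the actual prototype):

```python
# A minimal sketch of the normalized compression distance (NCD):
# NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), with C the compressed size.
# Labelling a fragment by its NCD-nearest training fragment is a simplified stand-in
# for the classification performed by the CDV prototype.
import zlib

def ncd(x: bytes, y: bytes) -> float:
    cx, cy, cxy = len(zlib.compress(x)), len(zlib.compress(y)), len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

def classify(fragment: str, training: list[tuple[str, str]]) -> str:
    """Return the label of the training fragment closest to `fragment` under NCD."""
    return min(training, key=lambda item: ncd(fragment.encode(), item[0].encode()))[1]

# Hypothetical training fragments illustrating correct vs. faulty API usage.
training = [
    ("fis = new FileInputStream(f); fis.read(buf); fis.close();", "correct"),
    ("fis = new FileInputStream(f); fis.read(buf);", "faulty"),   # stream never closed
]
print(classify("in = new FileInputStream(path); in.read(b); in.close();", training))
```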
26

Effectiveness of Exploratory Testing, An empirical scrutiny of the challenges and factors affecting the defect detection efficiency / Effektiviteten hos utforskande testning, En empirisk granskning av de utmaningar och faktorer som påverkar effektiviteten fel upptäckt

Bhatti, Khurram, Ghazi, Ahmad Nauman January 2010 (has links)
Context: Software testing is an integral part of the software development life cycle. To improve software quality, different testing approaches have been practised over the years. Traditionally, software testing is carried out following an approach that focuses on prior test design. Exploratory testing, in contrast, is an approach to testing software in which the tester is not required to follow a specific test design; rather, exploratory testing should facilitate the tester in testing the complete system comprehensively. Exploratory testing is seen by some as a way to conduct learning, test design and test execution simultaneously, while others point to exploratory testing enabling constant evolution of tests in an easy manner. Objectives: In this study we have investigated the field of exploratory testing in the literature and in industry to understand its perception and application. Further, among the claims stated by practitioners, we selected the defect detection efficiency and effectiveness claims for empirical validation through an experiment and a survey. Methods: In this study, a systematic literature review, interviews, an experiment and a survey are conducted. In the systematic review a number of article sources are used, including IEEE Xplore, the ACM Digital Library, Engineering Village, Springer Link, Google Scholar and the Books database. The systematic review also includes the grey literature published by practitioners. The selection of studies was done using a two-phase and tollgate approach, and a total of 47 references were selected as primary studies. Eight semi-structured interviews were conducted with industry practitioners. The experiment had a total of 4 iterations and 70 subjects, selected from industry and academia. The experimental design used was one factor with two interventions and one response variable. Results: Based on our findings from the literature review and interviews, the understanding of exploratory testing has improved over time but still lacks empirical investigation. The results drawn from the experimental and survey data show that exploratory testing proved effective and efficient in finding more critical bugs in limited time. Conclusions: We conclude that exploratory testing has a lot of potential and much more to offer to the testing industry, but more empirical investigation and solid facts and figures are required to motivate the testing industry to adopt it. We have reported a number of advantages, disadvantages, challenges and factors in this study. We further investigated the claims stated by ET practitioners through an experiment and a survey, and statistical tests were conducted on the collected data to draw meaningful results. We found a statistically significant difference in the number of true defects found: using the exploratory testing approach, testers found far more defects than with test-case-based testing, although there was no statistically significant difference between the two approaches for false defects.
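As an illustration of the kind of significance test such a comparison might use (the thesis abstract does not name the test, and the defect counts below are invented, so a Mann-Whitney U test is shown purely as an example):

```python
# A minimal sketch, assuming hypothetical per-subject true-defect counts, of comparing
# exploratory testing (ET) against test-case based testing (TCT) with a rank-based test.
from scipy.stats import mannwhitneyu

et_defects = [7, 9, 6, 8, 10, 7, 9, 8]     # hypothetical counts for ET subjects
tct_defects = [4, 5, 3, 6, 5, 4, 6, 5]     # hypothetical counts for TCT subjects

stat, p_value = mannwhitneyu(et_defects, tct_defects, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Statistically significant difference in true defects found.")
```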
27

Intelligent Code Inspection using Static Code Features : An approach for Java

Moriggl, Irene January 2010 (has links)
Effective defect detection is still a hot issue when it comes to software quality assurance. Static source code analysis thereby plays an important role, since it offers the possibility of automated defect detection in early stages of development. As defect detection can be seen as a classification problem, machine learning has recently been investigated for this purpose. This study presents a new model for automated defect detection by means of machine learners based on static Java code features. The model comprises the extraction of the necessary features as well as the application of suitable classifiers to them. It is realized by a prototype for the feature extraction and a study on the prototype’s output in order to identify the most suitable classifiers. Finally, the overall approach is evaluated using an open source project. The suitability study and the evaluation show that several classifiers are suitable for the model and that the Rotation Forest, Multilayer Perceptron and JRip classifiers make the approach most effective; they detect defects with an accuracy higher than 96%. Although the approach comprises only a prototype, it shows the potential to become an effective alternative to today's defect detection methods.
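A minimal sketch of the classification step described above, with the caveat that Rotation Forest and JRip are Weka classifiers: a scikit-learn random forest and multilayer perceptron are used here as stand-ins, and the feature matrix is synthetic rather than extracted from real Java code:

```python
# A minimal sketch, not the thesis prototype: training classifiers on static-code
# feature vectors labelled defective / non-defective. The features and labels are
# synthetic placeholders for metrics such as lines of code, cyclomatic complexity,
# number of parameters and nesting depth.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.integers(1, 60, size=(200, 4)).astype(float)   # one row per (hypothetical) method
y = (X[:, 1] + X[:, 3] > 50).astype(int)                # toy "defective" label

for clf in (RandomForestClassifier(n_estimators=100, random_state=0),
            MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)):
    scores = cross_val_score(clf, X, y, cv=5)
    print(type(clf).__name__, round(scores.mean(), 3))
```

The real model would replace the synthetic matrix with features extracted by the prototype and report accuracy per classifier, as in the study.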
28

Zpracování obrazu s velkými datovými toky - využití CUDA/OpenCL / High data rate image processing using CUDA/OpenCL

Sedláček, Filip January 2018 (has links)
The main objective of this research is to propose an optimization of the defect detection algorithm used in the production of nonwoven textile. The algorithm was developed by CAMEA spol. s.r.o. As a consequence of upgrading the current camera system to a more powerful one, it will be necessary to optimize the current algorithm and to choose hardware with an appropriate architecture on which the calculations will be performed. This work describes useful programming techniques of the CUDA software architecture and the OpenCL framework in detail. Using these tools, we implement a parallel equivalent of the current algorithm, describe various optimization methods, and design a GUI to test these methods.
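As a hedged sketch of the kind of per-pixel, data-parallel operation such a defect detector maps well onto a GPU (this is not the CAMEA algorithm; the kernel, threshold and image data are assumptions, and running it requires a CUDA-capable GPU with Numba's CUDA support installed):

```python
# A minimal sketch: a Numba CUDA kernel that flags pixels deviating strongly from a
# per-column reference intensity, standing in for a simple per-pixel defect test.
import numpy as np
from numba import cuda

@cuda.jit
def flag_defects(frame, reference, threshold, mask):
    row, col = cuda.grid(2)
    if row < frame.shape[0] and col < frame.shape[1]:
        mask[row, col] = 1 if abs(frame[row, col] - reference[col]) > threshold else 0

rng = np.random.default_rng(0)
frame = rng.normal(128.0, 5.0, size=(1024, 2048)).astype(np.float32)
frame[500:504, 900:940] -= 60.0                        # injected dark streak ("defect")
reference = frame.mean(axis=0).astype(np.float32)
mask = np.zeros(frame.shape, dtype=np.uint8)

threads = (16, 16)
blocks = ((frame.shape[0] + 15) // 16, (frame.shape[1] + 15) // 16)
flag_defects[blocks, threads](frame, reference, np.float32(25.0), mask)
print(int(mask.sum()), "pixels flagged")
```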
29

Automatizace vizuální kontroly jaderného paliva / Automation of nuclear fuel visual inspection

Knotek, Jaroslav January 2020 (has links)
The safety and performance of a nuclear plant rely, among other things, on the quality of the nuclear fuel. Whether the fuel in use fulfils its design criteria is inspected and reported on periodically. Visual inspection focuses on the condition of the fuel based on its visual properties: during the inspection, the fuel is recorded and analysed by the inspector. The current state of the fuel assemblies is compared with historical statistics, which helps to decide whether a particular assembly remains in service or is replaced. This thesis describes a project initiated by Centrum Výzkumu Řež that focuses on applying digital image processing methods to the visual inspection process. The result of the project is a tool that accelerates the process of report making: it transforms the inspection video into a single overview image and highlights a significant proportion (more than 95%) of the possible defects for the inspector.
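A minimal sketch of the two steps described above, stitching an overview image from video frames and flagging outlier pixels; the strip-based stitching, robust threshold and synthetic frames are simplifying assumptions, not the project's actual method:

```python
# A minimal sketch: (1) build a single overview image by taking a thin strip from each
# frame, assuming steady motion of the assembly past the camera, and (2) highlight
# candidate defects as pixels deviating strongly from the local background.
import numpy as np

def stitch_overview(frames, strip_width=4):
    """Concatenate the centre strip of every frame into one overview image."""
    mid = frames[0].shape[1] // 2
    return np.concatenate([f[:, mid:mid + strip_width] for f in frames], axis=1)

def highlight_candidates(overview, k=4.0):
    """Mark pixels further than k robust standard deviations from the column median."""
    med = np.median(overview, axis=0)
    mad = np.median(np.abs(overview - med), axis=0) + 1e-6
    return np.abs(overview - med) > k * 1.4826 * mad

# Synthetic stand-in for a greyscale inspection video: 200 frames of 240x320 pixels.
rng = np.random.default_rng(1)
frames = [rng.normal(120, 3, size=(240, 320)) for _ in range(200)]
frames[50][100:104, 158:162] += 60            # injected "scratch" on one frame
overview = stitch_overview(frames)
print(highlight_candidates(overview).sum(), "candidate defect pixels")
```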
30

Machine Learning on Acoustic Signals Applied to High-Speed Bridge Deck Defect Detection

Chou, Yao 06 December 2019 (has links)
Machine learning techniques are being applied to many data-intensive problems because they can accurately provide classification of complex data using appropriate training. Often, the performance of machine learning can exceed the performance of traditional techniques because machine learning can take advantage of higher dimensionality than traditional algorithms. In this work, acoustic data sets taken using a rapid scanning technique on concrete bridge decks provided an opportunity to both apply machine learning algorithms to improve detection performance and also to investigate the ways that training of neural networks can be aided by data augmentation approaches. Early detection and repair can enhance safety and performance as well as reduce long-term maintenance costs of concrete bridges. In order to inspect for non-visible internal cracking (called delaminations) of concrete bridges, a rapid inspection method is needed. A six-channel acoustic impact-echo sounding apparatus is used to generate large acoustic data sets on concrete bridge decks at high speeds. A machine learning data processing architecture is described to accurately detect and map delaminations based on the acoustic responses. The machine learning approach achieves accurate results at speeds between 25 and 45 km/h across a bridge deck and successfully demonstrates the use of neural networks to analyze this type of acoustic data. In order to obtain excellent performance, model training generally requires large data sets. However, in many potentially interesting cases, such as bridge deck defect detection, acquiring enough data for training can be difficult. Data augmentation can be used to increase the effective size of the training data set. Acoustic signal data augmentation is demonstrated in conjunction with a machine learning model for acoustic defect detection on bridge decks. Four different augmentation methods are applied to data using two different augmentation strategies. This work demonstrates that a "goldilocks" data augmentation approach can be used to increase machine learning performance when only a limited data set is available. The major technical contributions of this work include application of machine learning to acoustic data sets relevant to bridge deck inspection, solving an important problem in the field of nondestructive evaluation, and a more generalized approach to data augmentation of limited acoustic data sets to expand the classes of acoustic problems that machine learning can successfully address.
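As an illustration of the kind of signal-level augmentation the dissertation evaluates (the four specific methods used in the work are not named in the abstract, so additive noise, time shift, amplitude scaling and time stretch are shown here as plausible examples with assumed parameter ranges):

```python
# A minimal sketch, not the author's pipeline: simple 1-D augmentations for acoustic
# impact-echo traces. Parameter ranges and the stand-in signal are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def add_noise(x, snr_db=20.0):
    noise = rng.normal(size=x.shape)
    noise *= np.sqrt(np.mean(x**2) / (10 ** (snr_db / 10))) / (np.std(noise) + 1e-12)
    return x + noise

def time_shift(x, max_shift=200):
    return np.roll(x, rng.integers(-max_shift, max_shift + 1))

def amplitude_scale(x, low=0.8, high=1.2):
    return x * rng.uniform(low, high)

def time_stretch(x, low=0.95, high=1.05):
    factor = rng.uniform(low, high)
    idx = np.arange(len(x)) * factor
    return np.interp(np.arange(len(x)), idx, x, left=0.0, right=0.0)

def augment(signal, copies=4):
    """Return `copies` randomly augmented variants of one acoustic trace."""
    ops = (add_noise, time_shift, amplitude_scale, time_stretch)
    return [ops[rng.integers(len(ops))](signal) for _ in range(copies)]

trace = np.sin(2 * np.pi * 2000 * np.arange(4096) / 96000)   # stand-in impact response
augmented = augment(trace)
print(len(augmented), [a.shape for a in augmented])
```

The "goldilocks" point mentioned above would then be found by sweeping how many augmented copies are added per original trace and measuring validation performance.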
