  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Design and development of a time of flight fast scattering spectrometer : a quantitative surface analysis technique and a new approach towards the experimental investigation of the surface particle interactions

Giles, Roger January 1995 (has links)
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms, and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer, has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowing samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and providing the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts.
This is particularly true for helium scattering, where it appears that, in the absence of detailed computer simulation, quantitative analysis is limited to ordered surfaces. Despite this limitation the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
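In a time-of-flight instrument of this kind, a scattered particle's kinetic energy follows directly from its flight time over a known path length via the standard relation E = ½m(L/t)². The conversion below is generic physics, not code from the thesis; the flight length and time used in the example are purely illustrative values.

```python
def tof_to_energy_eV(t_s, L_m, m_amu):
    """Convert a time-of-flight measurement to kinetic energy in eV.

    t_s   -- flight time in seconds
    L_m   -- flight path length in metres
    m_amu -- particle mass in atomic mass units
    """
    AMU = 1.66053906660e-27   # atomic mass unit, kg (CODATA)
    EV = 1.602176634e-19      # electron-volt, J (exact)
    v = L_m / t_s             # velocity from flight time
    return 0.5 * m_amu * AMU * v**2 / EV
```

For example, a helium atom (4 amu) covering a 1 m flight path in 1 µs carries roughly 20.7 keV.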
22

The influence of stress on the detectability of fatigue cracks using ultrasonics

Ibrahim, Sobhi I. January 1981 (has links)
No description available.
23

A study of cluster analysis techniques and their applications

Hafidh, Saad M. A. January 1981 (has links)
This thesis describes the development of an inexpensive and efficient clustering technique for multivariate data analysis. The technique starts from a multivariate data matrix and ends with a graphical representation of the data and a pattern recognition discriminant function. It also produces a distances frequency distribution that may be useful in detecting clustering in the data, or in estimating parameters useful for discriminating between the different populations in the data, and it can also be used for feature selection. The technique is essentially a means of discovering data structure by revealing the component parts of the data. The thesis offers three distinct contributions to cluster analysis and pattern recognition techniques. The first is the introduction of a transformation function into the technique of nonlinear mapping. The second is the use of a distances frequency distribution instead of a distances time-sequence in nonlinear mapping. The third is the formulation of a new generalised and normalised error function, together with its optimal step size formula for gradient method minimisation. The thesis consists of five chapters. The first chapter is the introduction. The second chapter describes multidimensional scaling as the origin of the nonlinear mapping technique. The third chapter describes the first development step in the technique of nonlinear mapping, namely the introduction of the transformation function. The fourth chapter describes the second development step, the use of the distances frequency distribution instead of the distances time-sequence, and also includes the formulation of the new generalised and normalised error function. Finally, the fifth chapter, the conclusion, evaluates all of the developments and proposes a new program for cluster analysis and pattern recognition that integrates all of the new features.
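The nonlinear mapping referred to here belongs to the family of Sammon-style mappings: gradient descent on a normalised error (stress) between inter-point distances in the original space and in a low-dimensional projection. The sketch below uses the textbook stress and a simple normalised step, not the generalised error function or optimal step size formula developed in the thesis.

```python
import numpy as np

def sammon_map(X, n_iter=300, step=0.1, seed=0):
    """Project X to 2-D by gradient descent on the Sammon stress,
    a normalised error between original and mapped inter-point distances."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Pairwise distances in the original space (epsilon avoids division by zero).
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)) + 1e-12
    c = D[np.triu_indices(n, 1)].sum()          # normalising constant
    Y = rng.normal(scale=1e-2, size=(n, 2))     # random 2-D starting configuration

    def stress(Y):
        d = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)) + 1e-12
        return ((D - d) ** 2 / D)[np.triu_indices(n, 1)].sum() / c

    for _ in range(n_iter):
        d = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)) + 1e-12
        # Gradient of the stress: points attract/repel until d matches D.
        g = (((d - D) / (d * D))[:, :, None]
             * (Y[:, None, :] - Y[None, :, :])).sum(axis=1)
        Y = Y - step * g / (np.abs(g).max() + 1e-12)   # bounded, normalised step
    return Y, stress(Y)
```

Running it on points lying along a line in 3-D recovers a line-like 2-D configuration with much lower stress than the random start.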
24

Soil classification through dynamic soil signatures

Yeow, Hoe Chian January 1990 (has links)
The demand for a cost effective site investigation method has resulted in the introduction of various advanced in-situ testing techniques. These techniques use modern electronic instrumentation to monitor various soil parameters during site investigation. The data is then processed using high speed, low cost digital computers, allowing an accurate and rapid assessment of the conditions of the foundation soil under a proposed construction site. In this thesis, a site investigation tool that drives a coring tube into the ground under a combination of vibration and impact is considered. This machine, called a vibro-impact corer, is fully instrumented to provide penetrometer-type information and a core sample for further inspection in the laboratory. The self-adjusting mechanism inherent in this machine delivers the minimum level of energy required to overcome soil resistance, thereby allowing continuous penetration of the coring tube. This mechanism also results in minimal induced disturbance during the coring process. This thesis investigates the use of the vibro-impact corer as a soil classification tool, and involves the design of data analysis software to perform the soil classification procedure. Due to the nature of the system, the resistance monitored through the annular load cell fitted at the tip of the coring tube consists of the dynamic end resistance waveforms and the peak magnitudes of these waveforms over a sampling period. The vibro-impact soil classification system is based on the machine's distinct self-adjusting mechanism, which imparts a different level of impact and vibration as soil conditions change and so produces distinct dynamic soil resistance waveforms. In addition, the penetration rate and the magnitude of the soil resistance encountered also vary according to the material being penetrated. These two features form the basis of the soil classification system in this software.
The software also includes options for empirical correlation of the results obtained from the vibro-impact penetrometer with CPT and SPT tests to allow comparison. The vibro-impact soil classification software is designed to be user-friendly. It reads data files from a Biodata Transient Capture System for the classification process. Output devices such as a plotter and a printer are used to produce hardcopy records of the various data, and all options are menu driven. A two-degree-of-freedom simulation of the operational responses of the vibro-impact machine is also included in this thesis. The main objective of this simulation is to study the soil response during the vibro-impact mode of driving. This allows the simulated soil responses to be compared with the model test results, providing an understanding of soil behaviour under a combination of vibratory and impact loadings. This thesis presents the results of several laboratory model and full scale vibro-impact penetrometer tests. It concentrates on the main subject of soil classification throughout the discussion, although on some occasions the operational mechanism of the machine is mentioned. The results justify the approaches adopted for the soil classification system using a vibro-impact machine.
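A two-degree-of-freedom vibro-impact model of the kind mentioned above can be sketched generically as a vibrator mass coupled to the coring-tube mass through a spring-damper, with the soil idealised as a one-sided spring that resists penetration only. Everything below (the parameter values, the soil idealisation, and the integrator) is an illustrative assumption, not the model formulated in the thesis.

```python
import numpy as np

def simulate_two_dof(m1=50.0, m2=20.0, k=1.0e5, c=500.0, soil_k=2.0e5,
                     f0=2.0e3, omega=100.0, g=9.81, dt=1.0e-4, steps=20000):
    """Two-degree-of-freedom vibro-impact sketch: vibrator mass m1 drives
    coring-tube mass m2 through a spring-damper (k, c); the soil is
    idealised as a one-sided spring that resists penetration (x2 > 0) only.
    Returns the tube displacement history (positive = penetration)."""
    x1 = v1 = x2 = v2 = 0.0
    xs = np.empty(steps)
    for n in range(steps):
        t = n * dt
        link = k * (x1 - x2) + c * (v1 - v2)       # coupling force on the tube
        soil = soil_k * x2 if x2 > 0.0 else 0.0    # compression-only soil spring
        a1 = g + (f0 * np.sin(omega * t) - link) / m1
        a2 = g + (link - soil) / m2
        v1 += a1 * dt; x1 += v1 * dt               # semi-implicit Euler step
        v2 += a2 * dt; x2 += v2 * dt
        xs[n] = x2
    return xs
```

The recorded tube displacement plays the role of the simulated "dynamic soil signature" that would be compared against model test results.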
25

Impact of blow-moulded high density polyethylene containers

Lim, Bah Chin January 1986 (has links)
No description available.
26

Viscosity measurements at pressures up to 14,000 bar using an automatic falling cylinder viscometer

Irving, John Bruce January 1977 (has links)
The thesis describes a new method for measuring the viscosity of liquids in a pressure vessel capable of reaching 14 000 bar, and results are presented for six liquids at 30°C, up to viscosities of 3000 P. The technique is based on the well-tried principle of a cylindrical sinker falling in a viscometer tube. It departs from earlier systems in that the sinker is retrieved electromagnetically rather than by rotating the whole pressure vessel, and the sinker is held by a semi-permanent magnet before a fall time measurement is made. The sinkers do not have guiding pins, but rely on self-centering forces to ensure concentric fall. Another novel aspect is that a sinker with a central hole, to produce faster fall times, has been introduced for the first time. An analysis for such a sinker is presented, and when the diameter of the hole is mathematically reduced to zero, the equation of motion for the solid sinker is obtained. The solution for the solid cylinder is compared with earlier approximate analyses. The whole cycle of operation (retrieval, holding, releasing, sinker detection, and recording) is remotely controlled and entirely automated. With unguided falling weights it is essential that the viscometer tube is aligned vertically. The effects of non-vertical alignment are assessed both experimentally and theoretically. An original analysis is presented to explain the rather surprising finding that when a viscometer tube is inclined from the vertical, the sinker falls much more quickly. The agreement between experiment and theory is to within one per cent. From the analysis of sinker motion, appropriate allowances for the change in sinker and viscometer tube dimensions under pressure are calculated; these are substantially linear with pressure. The viscometer was calibrated at atmospheric pressure with a variety of liquids whose viscosities were ascertained with calibrated suspended-level viscometers.
Excellent linearity over three decades of viscosity was found for both sinkers. A careful analysis of errors shows that the absolute accuracy of measurement is to within ±1.8 per cent. The fall time of the sinker is also a function of the buoyancy of the test liquid; a knowledge of the liquid density is therefore required, both at atmospheric pressure and at elevated pressures. The linear differential transformer method for density measurement formed the basis of a new apparatus designed to fit into the high pressure vessel. Up to pressures of 5 kbar, measurements are estimated to be within ±0.14 per cent, and above this pressure the uncertainty could be as high as 0.25 per cent. The last chapter deals with empirical and semi-theoretical viscosity-pressure equations. Two significant contributions are offered. The first is a new interpretation of the free volume equation in which physically realistic values of the limiting specific volume, v₀, are derived by applying viscosity and density data to the equation isobarically, not isothermally as most have done in the past. This led to a further simplification of the free volume equation to a two-constant equation. The second contribution is a purely empirical equation which describes the variation of viscosity as a function of pressure: ln(η/η₀)_T = A(e^(BP) − e^(−KP)), where η₀ is the viscosity at atmospheric pressure and A, B and K are constants. This 'double-exponential' equation is shown to describe data to within experimental error for viscosities which vary by as much as four decades with pressure. It also describes the different curvatures which the logarithm of viscosity exhibits when plotted as a function of pressure: concave towards the pressure axis, convex, straight line, or concave and then convex. The many other equations in existence cannot describe this variety of behaviour.
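The double-exponential relation ln(η/η₀) = A(e^(BP) − e^(−KP)) quoted in the abstract is straightforward to evaluate. A minimal sketch follows; the constants used in the example are invented for illustration, not fitted values from the thesis.

```python
import math

def viscosity_ratio(p, A, B, K):
    """Evaluate the 'double-exponential' pressure-viscosity relation
    ln(eta/eta0) = A * (exp(B*p) - exp(-K*p)), returning eta/eta0.

    p       -- pressure (same units as 1/B and 1/K)
    A, B, K -- empirical constants for the liquid (illustrative here)
    """
    return math.exp(A * (math.exp(B * p) - math.exp(-K * p)))
```

At atmospheric pressure (p = 0) the bracket vanishes and the ratio is exactly 1; for positive constants the ratio grows with pressure, and the interplay of the two exponentials reproduces the concave, convex, and mixed curvatures described above.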
27

Vision based systems for hardness testing and NDT

Smith, Ian Colin January 1990 (has links)
The work presented in this thesis concerns the development of vision based systems for two hardness (destructive) tests, namely the Shore and Vickers tests, and for a quality assurance non-destructive test. In each case the vision system is based on an IBM PC compatible computer fitted with a commercially available frame store. Bespoke image analysis software was written in the C language for each system. In the Shore test, hardness is judged by the maximum rebound height attained by an indenter incident on a test sample. The purpose of the vision system is to measure the rebound height automatically. Laser light is used to illuminate the indenter and a vidicon camera is used to view its motion. Two approaches to the problem are considered: one in which image data is analysed in real time, and one in which image data is merely stored in real time and analysed a posteriori. Non-real-time analysis is shown to be superior to real-time analysis in terms of accuracy and reliability, and its software implementation is discussed in detail. The Vickers test uses the size of the permanent impression left by an indenter forced into the test material under a known load as a hardness index. In this case the purpose of the vision system is to measure the size of the indentation automatically. The original image analysis algorithms are shown to be capable of analysing good quality samples but are unreliable when applied to poor quality specimens. Further, fault-tolerant algorithms are described which provide reliable and accurate results over wide variations in sample quality. The quality assurance application involves automated visual inspection of novel ferrite components for defects. Each component is approximately 8 mm in diameter, annular in shape, and coated with aluminium. Laser light is used to illuminate individual components, which are viewed using a charge-coupled device (CCD) video camera.
Image analysis algorithms for characterising defects in component geometry and surface finish are discussed. The system is shown to be capable of measuring component edge eccentricity and hole offset, as well as providing a quantitative description of surface chips and cracks. It is further shown to be capable of separately classifying surface defects extending to the edge of a component. Calculation of shape parameters for surface defects also provides a means of distinguishing cracks from surface chips.
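For reference, once the vision system has measured the indentation diagonals, the Vickers hardness number itself follows from the standard formula HV = 1.8544·F/d², with F the load in kgf and d the mean diagonal in mm. This conversion is the standard one for the Vickers test, not code from the thesis:

```python
def vickers_hardness(load_kgf, d1_mm, d2_mm):
    """Vickers hardness number from the two measured indentation diagonals.

    load_kgf       -- applied test load in kilograms-force
    d1_mm, d2_mm   -- the two diagonal lengths of the impression, in mm
    """
    d = 0.5 * (d1_mm + d2_mm)           # mean diagonal
    return 1.8544 * load_kgf / (d * d)  # HV = 1.8544 * F / d^2
```

For example, a 30 kgf load producing a 0.5 mm mean diagonal gives HV ≈ 222.5.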
28

A recording and presentation system for manual ultrasonic inspections using a speech recognition interface

Smith, P. January 1998 (has links)
Reliability and repeatability are fundamental concepts in ultrasonic nondestructive testing. An inspection technique must be able to accurately detect, characterize, position and size any defect indication. In manual ultrasonic inspection, however, the operator can be a frequent source of error. Mistakes often arise due to the volume of information the operator must memorize and process. Existing solutions require mechanical probe manipulators that restrict the operator's movements and often require changes to trusted methods and procedures. The aim of this research programme was to investigate the potential of a computer system that assists the operator in the analysis of echodynamic patterns. The system allows the operator to record A-scan sequences, store them to disk, and recall them for review. The system's flexible user interface gives the operator freedom to retain existing inspection practices, in addition to the benefits of computer recording. A novel feature of the system was a speech recognition system to provide hands-free control, which minimizes disruption to the flow of the inspection. Trials were conducted to assess the recognizer's reliability under various conditions. The trials showed that focusing upon echodynamic pattern analysis is a valid and useful approach. Only a limited trial was conducted, however, so the research program was not able to conclusively show that the system will reduce operator errors or improve inspection reliability. In user testing sessions, operators agreed that such a system would be helpful during a manual inspection and there were few objections to the imposition of new hardware. Users quickly became used to the speech recognizer, and the speed of interaction and 'flow' of inspection were greatly enhanced. The author suggests that a computerized assistant is worthy of further development, and has the potential to be a valuable tool in manual ultrasonic inspection.
29

Multivariate outlier detection in laboratory safety data

Penny, Kay Isabella January 1995 (has links)
Clinical laboratory safety data consist of a wide range of biochemical and haematological variables which are collected to monitor the safety of a new treatment during a clinical trial. Although the data are multivariate, testing for abnormal measurements is usually done for only one variable at a time. A Monte Carlo simulation study is described, which compares 16 methods, some of which are new, for detecting multivariate outliers, with a view to finding patients with an unusual set of laboratory measurements at a follow-up assessment. Multivariate normal and bootstrap simulations are used to create data sets of various dimensions. Both symmetrical and asymmetrical contamination are considered in this study. The results indicate that, in addition to the routine univariate methods, it is desirable to run a battery of multivariate methods on laboratory safety data in an attempt to highlight possible outliers. Mahalanobis distance is a well-known criterion which is included in the study. Appropriate critical values for testing for a single multivariate outlier using the Mahalanobis distance are derived in this thesis, and the jack-knifed Mahalanobis distance is also discussed. Finally, the presence of missing data in laboratory safety data sets is the motivation behind a study which compares eight multiple imputation methods. The multiple imputation study is described, and the performance of two outlier detection methods in the presence of three different proportions of missing data is discussed. Measures are introduced for assessing the accuracy of the missing data results, depending on which method of analysis is used.
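The Mahalanobis-distance criterion mentioned above flags an observation as a potential outlier when its squared distance from the sample mean, scaled by the sample covariance, exceeds a critical value. A minimal sketch follows; the cutoff passed in is an assumed chi-square-style critical value, not one of the corrected critical values derived in the thesis.

```python
import numpy as np

def mahalanobis_outliers(X, cutoff):
    """Flag rows of X whose squared Mahalanobis distance from the sample
    mean exceeds `cutoff` (e.g. a chi-square critical value for p variables).

    Returns (d2, flags): the squared distances and a boolean outlier mask.
    """
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)          # sample covariance matrix
    inv = np.linalg.inv(cov)
    diff = X - mu
    # d2_i = (x_i - mu)^T  S^{-1}  (x_i - mu), computed row-wise.
    d2 = np.einsum('ij,jk,ik->i', diff, inv, diff)
    return d2, d2 > cutoff
```

A gross contaminant stands out clearly: appended to 50 bivariate normal observations, a point at (25, 25) has by far the largest squared distance and is flagged at any reasonable cutoff.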
30

The evaluation of a squid based non-contact magnetic NDE technique for application to the inspection of offshore steel structures

Evanson, S. January 1988 (has links)
No description available.
