101

Application of Stereo Imaging to Atomic Force Microscopy

Aumond, Bernardo D., Youcef-Toumi, Kamal 01 1900 (has links)
Metrological data from sample surfaces can be obtained using a variety of profilometry methods. Atomic Force Microscopy (AFM), which relies on contact inter-atomic forces to extract topographical images of a sample, is one such method; it can be used on a wide range of surface types, with resolution reaching the nanometer range. However, AFM images are commonly distorted by convolution, which reduces metrological accuracy. This type of distortion is more significant when the sample surface contains high aspect ratio features such as lines, steps, or sharp edges, structures commonly found in semiconductor devices and applications. To mitigate these distortions and recover metrological soundness, we introduce a novel image deconvolution scheme based on the principle of stereo imaging. Multiple images of a sample, taken at different angles, allow convolution artifacts to be separated from true topographic data. As a result, perfect sample reconstruction and probe shape estimation can be achieved in certain cases. Additionally, shadow zones, which are areas of the sample that cannot be probed by the AFM, are greatly reduced. Most importantly, this technique does not require a priori probe characterization. It also reduces the need for slender or sharper probes, which induce less convolution distortion but are more prone to wear and damage, decreasing overall system reliability. / Singapore-MIT Alliance (SMA)
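The following minimal sketch illustrates the tip-convolution model that such deconvolution schemes work against: the recorded AFM image is the grayscale dilation of the true surface by the probe shape, and the classic single-image reconstruction is the corresponding erosion. The stereo-imaging scheme of the thesis is not reproduced here; the probe radius, step geometry, and units are assumptions for illustration only.

```python
import numpy as np

def afm_image(surface, tip):
    """Image formation: recorded apex height I(x) = max_u [S(x+u) - P(u)]."""
    n, half = len(surface), len(tip) // 2
    img = np.empty(n)
    for i in range(n):
        img[i] = max(surface[i + u] - tip[u + half]
                     for u in range(-half, half + 1) if 0 <= i + u < n)
    return img

def reconstruct(image, tip):
    """Classic (non-stereo) recovery: tightest surface consistent with image and tip,
    S_r(x) = min_u [I(x-u) + P(u)]."""
    n, half = len(image), len(tip) // 2
    rec = np.empty(n)
    for i in range(n):
        rec[i] = min(image[i - u] + tip[u + half]
                     for u in range(-half, half + 1) if 0 <= i - u < n)
    return rec

# Hypothetical high-aspect-ratio line feature and a parabolic tip (R ~ 10 samples).
x = np.arange(200)
surface = np.where((x > 80) & (x < 120), 50.0, 0.0)
tip = (np.arange(-15, 16).astype(float) ** 2) / (2 * 10.0)
image = afm_image(surface, tip)        # edges appear broadened by the tip
recovered = reconstruct(image, tip)    # partial recovery; steep sidewalls stay uncertain
```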
102

Estimation of human height from surveillance camera footage - a reliability study

Ljungberg, Jenny, Sönnerstam, Johanna January 2008 (has links)
Aim: The aim was to evaluate height measurements made with the single view metrology method and to investigate the influence of standing position and of different phases of gait and running on vertical height. Method: Ten healthy men were recorded simultaneously by a 2D web camera and a 3D motion analysis system. They performed six trials, three standing and three during gait and running. The vertical height was measured with the single view metrology method and in Qualisys Track Manager, and the results were compared. The vertical height in the different postures was compared to the actual height. Results: The measurements made with the single view metrology method were significantly higher than the measurements made with Qualisys Track Manager (p<0.001). The vertical height in the two standing positions was significantly lower than the actual height (p<0.05). The vertical height in midstance was significantly lower than the actual height in the walking trials (p<0.05). No significant difference was found between maximum vertical height and actual height during running (p>0.05). Conclusion: The single view metrology method measured vertical heights with a mean error of +2.30 cm. Posture influences vertical body height. Midstance in walking is the position where vertical height corresponds best with actual height; in running, it is the non-support phase.
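A minimal sketch of the kind of paired comparison reported above, assuming hypothetical height values for the ten participants (the study's actual data are not reproduced): each subject's single-view-metrology height is compared with the corresponding Qualisys Track Manager height using a paired t-test, and the mean difference is reported.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical paired measurements in centimetres (placeholders, not study data).
svm_cm = np.array([181.2, 179.8, 183.1, 177.5, 182.0, 180.4, 178.9, 181.7, 180.1, 179.3])
qtm_cm = svm_cm - 2.3 + rng.normal(0.0, 0.8, svm_cm.size)   # assume SVM reads ~2.3 cm high

t_stat, p_value = stats.ttest_rel(svm_cm, qtm_cm)           # paired comparison across subjects
print(f"mean difference = {np.mean(svm_cm - qtm_cm):+.2f} cm, t = {t_stat:.2f}, p = {p_value:.4f}")
```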
104

High Resolution Optical Surface Metrology with the Slope Measuring Portable Optical Test System

Maldonado, Alejandro V. January 2014 (has links)
New optical designs strive for extreme performance and continually increase the complexity of prescribed optical shapes, which often require wide dynamic range and high resolution in testing. SCOTS, the Software Configurable Optical Test System, can measure a wide range of optical surfaces with high sensitivity using surface slope. This dissertation introduces a high resolution version of SCOTS called SPOTS, the Slope measuring Portable Optical Test System. SPOTS improves the metrology of surface features on spatial scales of sub-millimeter to decimeter and height scales of nanometers to micrometers. Currently there is no optical surface metrology instrument with the same utility. SCOTS uses a computer controlled display (such as an LCD monitor) and a camera to measure surface slopes over the entire surface of a mirror. SPOTS differs in that an additional lens is placed near the surface under test. A small prototype system is discussed in general, providing the support for the design of future SPOTS devices. The SCOTS instrument transfer function is then addressed, which defines the way the system filters surface heights. Lastly, the calibration and performance of a larger SPOTS device are analyzed, with example measurements of the 8.4-m diameter aspheric Large Synoptic Survey Telescope (LSST) primary mirror. In general, optical systems have a transfer function that filters the data. In the case of optical imaging systems, the instrument transfer function (ITF) follows the modulation transfer function (MTF), which causes a reduction of contrast with increasing spatial frequency due to diffraction. In SCOTS, the ITF is shown to decrease the measured height of surface features as their spatial frequency increases, and thus the SCOTS and SPOTS ITF is proportional to their camera system's MTF. Theory and simulations are supported by a SCOTS measurement of a test piece with a set of lithographically written sinusoidal surface topographies. In addition, an example of a simple inverse filtering technique is provided. The success of a small SPOTS proof-of-concept instrument paved the way for a new larger prototype system, intended to measure subaperture regions of large optical mirrors. On large optics, the prototype SPOTS is lightweight and rests on the surface being tested. One advantage of this SPOTS is that it maintains its calibration over time. Thus the optician can simply place SPOTS on the mirror, perform a simple alignment, collect measurement data, then pick the system up and repeat at a new location. The entire process takes approximately 5 to 10 minutes, of which 3 minutes is spent collecting data. SPOTS' simplicity of design, light weight, robustness, wide dynamic range, and high sensitivity make it a useful tool for optical shop use during the fabrication and testing of large and small optics.
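A minimal sketch of the simple inverse-filtering idea referred to above, assuming a hypothetical Gaussian camera MTF and a sinusoidal test topography like the lithographically written sinusoids used for verification; the dissertation's calibrated ITF is not reproduced here.

```python
import numpy as np

def inverse_filter(height_meas, dx, mtf, eps=0.05):
    """Divide the height spectrum by the assumed MTF, zeroing frequencies where the MTF is tiny."""
    spectrum = np.fft.rfft(height_meas)
    freqs = np.fft.rfftfreq(len(height_meas), d=dx)
    response = mtf(freqs)
    gain = np.where(response > eps, 1.0 / response, 0.0)   # crude regularization against noise blow-up
    return np.fft.irfft(spectrum * gain, n=len(height_meas))

dx = 1e-3                                                   # sample spacing in metres (assumed)
x = np.arange(2048) * dx
true_height = 100e-9 * np.sin(2 * np.pi * 80.0 * x)         # 100 nm amplitude sinusoid (assumed)
mtf = lambda f: np.exp(-(f / 150.0) ** 2)                   # hypothetical Gaussian MTF model

# Simulate the ITF attenuation, then partially restore the measured heights.
attenuated = np.fft.irfft(np.fft.rfft(true_height) * mtf(np.fft.rfftfreq(len(x), dx)), n=len(x))
restored = inverse_filter(attenuated, dx, mtf)
```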
105

Surface Metrology of Contact Lenses in Saline Solution

Heideman, Kyle C. January 2014 (has links)
Measurement of the quality and performance of soft contact lenses is not new and is continually evolving as manufacturing methods develop and more complicated contact lenses become available. Qualification of soft contact lenses has not been a simple task, since they are fundamentally difficult to measure: the shape of the lens is extremely sensitive to how the lens is supported, and the material properties can change quickly with time. These lenses have been measured in several different ways, the most successful being non-contact optical methods that measure the lens while it is immersed in saline solution. All of these tests measure the lens in transmission and do not directly measure the surface structure of the lens. The reason for this is that the Fresnel reflectivity of the surface of a contact lens in saline solution is about 0.07%. Surface measurements have been performed in air, but not in saline; the lens needs to be measured in solution so that it can maintain its true shape. An interferometer is proposed, constructed, verified, and demonstrated that measures the aspheric, low-reflectivity surfaces of a contact lens while it is immersed in saline solution. The problem is extremely difficult and requires a delicate balance between stray light mitigation, color correction, and polarization management. The resulting system implements reverse raytracing algorithms to correct for retrace errors so that highly aspheric, toric, and distorted contact lens surfaces can be measured. The interferometer is capable of measuring both surfaces from the same side of the contact lens as well as the lens thickness. These measurements, along with the index of refraction of the lens material, are enough to build a complete 3D model of the lens. A simulated transmission test of the 3D model has been shown to match the real transmission test of the same lens to within 32 nm RMS, or 1/20th of a wave at the test wavelength.
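A quick back-of-the-envelope check of the sub-0.1% reflectivity figure quoted above, using assumed refractive indices for a hydrogel lens and saline (the exact values depend on the lens material and are not taken from the dissertation):

```python
# Normal-incidence Fresnel reflectance R = ((n1 - n2) / (n1 + n2))^2.
n_lens, n_saline = 1.41, 1.334            # assumed indices; hydrogel lenses vary by material
R = ((n_lens - n_saline) / (n_lens + n_saline)) ** 2
print(f"R = {R:.5f}  ({100 * R:.3f} %)")  # ~0.08 %, the same order as the ~0.07 % quoted
```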
106

Aspherical Metrology for Non-Specular Surfaces with the Scanning Long-Wave Optical Test System

Su, Tianquan January 2014 (has links)
Aspherical optics are increasingly used these days. The application of aspherical surfaces to large astronomical telescope mirrors brings challenges to fabrication. Since the surface radius of curvature varies across the surface, the grinding/polishing tool needs to change its shape when working on different parts of the surface, making it easier for surface errors to become embedded in the surface. A tighter test-fabrication loop is therefore needed to guide the fabrication process. To maximize accuracy during grinding and to minimize the working time in the polishing stage, a better metrology device that can measure rough surfaces is needed to guide the grinding process. The Scanning Long-wave Optical Test System (SLOTS) is designed to meet this demand by providing accurate, fast, large-dynamic-range, high-spatial-resolution measurements on rough optical surfaces (surface rms roughness < 1.7 µm). SLOTS is a slope-measuring deflectometry system that works like a reversed wire test. It measures the reflection of infrared light off the test surface and calculates the local slope of the test surface; the surface sag/height is obtained through integration. During the test, a heated metal ribbon radiates long-wave infrared light that is reflected by the test surface, and a thermal imaging camera records the reflected light. The ribbon is scanned in two orthogonal directions. From the variation of the irradiance recorded by the camera, slope maps of the test surface can be retrieved in the two orthogonal directions. SLOTS combines traditional slope measurement with modern technology, drawing advantages from both. It measures surface slope, so there is no need for null optics. It uses an uncooled thermal imaging camera with high resolution and high sensitivity. The linear stage used to scan the hot ribbon has long travel, fine resolution, and high accuracy. Together, the camera and stage give SLOTS a large dynamic range and high sensitivity. SLOTS has successfully guided the grinding process of the primary mirror of the Daniel K. Inouye Solar Telescope. This mirror is a 4-meter diameter off-axis parabola (OAP) whose largest aspherical departure is 8 mm. SLOTS is able to measure it without any null optics. Under the guidance of SLOTS, the surface shape was controlled to within 1 µm rms of the designed shape (with astigmatism removed) at 0.7 µm rms surface roughness (12 µm loose abrasive grits).
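A minimal sketch of the slope-to-height integration step described above, using Frankot-Chellappa least-squares integration of the two orthogonal slope maps; the grid, spacing, and test shape are illustrative assumptions, and SLOTS' actual processing chain is not reproduced.

```python
import numpy as np

def integrate_slopes(sx, sy, dx=1.0):
    """Least-squares height map from x- and y-slope maps (assumes periodic boundaries)."""
    ny, nx = sx.shape
    wx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    wy = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    WX, WY = np.meshgrid(wx, wy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                                   # avoid division by zero at DC
    Z = (-1j * WX * np.fft.fft2(sx) - 1j * WY * np.fft.fft2(sy)) / denom
    Z[0, 0] = 0.0                                       # piston (mean height) is arbitrary
    return np.real(np.fft.ifft2(Z))

# Example: a shallow paraboloid z = (x^2 + y^2)/(2R) has slopes x/R and y/R.
R = 1.0e4
y, x = np.mgrid[-128:128, -128:128].astype(float)
height = integrate_slopes(x / R, y / R)   # approximates the paraboloid up to boundary effects
```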
107

On the Squeezing and Over-squeezing of Photons

Shalm, Lynden Krister 31 August 2011 (has links)
Quantum mechanics allows us to use nonclassical states of light to make measurements with a greater precision than comparable classical states. Here an experiment is presented that squeezes the polarization state of three photons. We demonstrate the deep connection that exists between squeezing and entanglement, unifying the squeezed state and multi-photon entangled state approaches to quantum metrology. For the first time we observe the phenomenon of over-squeezing where a system is squeezed to the point that further squeezing leads to a counter-intuitive increase in measurement uncertainty. Quasi-probability distributions on the surface of a Poincaré sphere are the most natural way to represent the topology of our polarization states. Using this representation it is easy to observe the squeezing and over-squeezing behaviour of our photon states. Work is also presented on two different technologies for generating nonclassical states of light. The first is based on the nonlinear process of spontaneous parametric downconversion to produce pairs of photons. With this source up to 200,000 pairs of photons/s have been collected into single-mode fibre, and over 100 double pairs/s have been detected. This downconversion source is suitable for use in a wide variety of multi-qubit quantum information applications. The second source presented is a single-photon source based on semiconductor quantum dots. The single-photon character of the source is verified using a Hanbury Brown-Twiss interferometer.
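As a rough numerical aside on why nonclassical probes improve precision: with N independent photons the phase uncertainty is bounded by the shot-noise (standard quantum) limit 1/sqrt(N), whereas an ideal N-photon entangled probe can reach the Heisenberg limit 1/N. This is textbook scaling only, not a model of the three-photon experiment described above.

```python
import numpy as np

# Shot-noise vs. Heisenberg scaling of phase uncertainty for N-photon probes.
for N in (3, 10, 100):
    print(f"N = {N:4d}   shot-noise limit ~ {1/np.sqrt(N):.3f} rad   Heisenberg limit ~ {1/N:.3f} rad")
```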
109

Distortion in conformable masks for evanescent near field optical lithography

Wright, Alan James January 2007 (has links)
In this thesis the in-plane pattern distortion resulting from the use of Evanescent Near Field Optical Lithography (ENFOL) masks was investigated. ENFOL is a high-resolution, low-cost lithography technique that is able to pattern features beyond the diffraction limit of light. Because it uses the evanescent near field, ENFOL requires conformable masks held in intimate contact. Such masks can stretch and skew as they come into contact with silicon substrates and therefore distort the high-resolution features patterned on them. It was desired to measure this distortion to ascertain the patterning performance of ENFOL masks and possibly correct for any uniform distortion found. To this end a sophisticated measuring process was successfully demonstrated. This involved the use of a Raith 150 Electron Beam Lithography (EBL) system with a precision laser interferometer stage and a metrology software module for automated measurements. Custom software was written for the Raith to enable it to take additional measurements to compensate for electron beam drift. Processing algorithms then used these measurements to compensate for beam drift and to correct for systematic shift and rotation errors. The in-plane distortion measuring process was found to have a precision of 60 nm. With the ability to measure distortion, ENFOL masks were used to pattern substrates, and the distortion was found to be large, on the order of 1 µm. This is much larger than desired for the sub-100 nm patterning expected of ENFOL. The distortions were non-uniform patterns of localised displacements. This, together with the observation of Newton's rings beneath a test mask and of the same single-particle distortion recurring in measurements of the same mask across different loadings in the EBL, pointed to particulate contamination as the cause of the distortion. To prove beyond doubt that particulate contamination was the cause of the spurious distortions, mechanical modelling using the Finite Element Method (FEM) was employed. The results matched the distortions observed experimentally, with particles of 20-40 µm reproducing the observed distortion.
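A minimal sketch of the shift-and-rotation correction step described above: the best-fit rigid transform between nominal and measured marker positions is removed (via an SVD-based Kabsch fit) so that only local, non-uniform distortion remains. The marker grid, rotation, offset, and noise level below are hypothetical, not the thesis data.

```python
import numpy as np

def residual_distortion(nominal, measured):
    """Remove the least-squares rotation and translation; return the residual displacements."""
    cn, cm = nominal.mean(axis=0), measured.mean(axis=0)
    H = (measured - cm).T @ (nominal - cn)                      # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, np.sign(np.linalg.det(U @ Vt))])          # guard against reflections
    rot = U @ D @ Vt                                            # rotation taking nominal into the measured frame
    aligned = (nominal - cn) @ rot.T + cm
    return measured - aligned                                   # what remains is local distortion

# Hypothetical 5x5 marker grid (20 µm pitch) with a 0.5 mrad rotation, 50 nm offset,
# and 60 nm random scatter standing in for measurement noise.
rng = np.random.default_rng(0)
nominal = np.stack(np.meshgrid(np.arange(5), np.arange(5)), axis=-1).reshape(-1, 2) * 20e-6
theta = 0.5e-3
rot_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
measured = nominal @ rot_true.T + 50e-9 + rng.normal(0.0, 60e-9, nominal.shape)
print(np.std(residual_distortion(nominal, measured)))           # ~60 nm: the noise floor, not real distortion
```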
110

The Epistemology of Measurement: A Model-based Account

Tal, Eran 07 January 2013 (has links)
Measurement is an indispensable part of physical science as well as of commerce, industry, and daily life. Measuring activities appear unproblematic when performed with familiar instruments such as thermometers and clocks, but a closer examination reveals a host of epistemological questions, including: 1. How is it possible to tell whether an instrument measures the quantity it is intended to? 2. What do claims to measurement accuracy amount to, and how might such claims be justified? 3. When is disagreement among instruments a sign of error, and when does it imply that instruments measure different quantities? Currently, these questions are almost completely ignored by philosophers of science, who view them as methodological concerns to be settled by scientists. This dissertation shows that these questions are not only philosophically worthy, but that their exploration has the potential to challenge fundamental assumptions in philosophy of science, including the distinction between measurement and prediction. The thesis outlines a model-based epistemology of physical measurement and uses it to address the questions above. To measure, I argue, is to estimate the value of a parameter in an idealized model of a physical process. Such estimation involves inference from the final state (‘indication’) of a process to the value range of a parameter (‘outcome’) in light of theoretical and statistical assumptions. Idealizations are necessary preconditions for the possibility of justifying such inferences. Similarly, claims to accuracy, error and quantity individuation can only be adjudicated against the background of an idealized representation of the measurement process. Chapters 1-3 develop this framework and use it to analyze the inferential structure of standardization procedures performed by contemporary standardization bureaus. Standardizing time, for example, is a matter of constructing idealized models of multiple atomic clocks in a way that allows consistent estimates of duration to be inferred from clock indications. Chapter 4 shows that calibration is a special sort of modeling activity, i.e. the activity of constructing and testing models of measurement processes. Contrary to contemporary philosophical views, the accuracy of measurement outcomes is properly evaluated by comparing model predictions to each other, rather than by comparing observations.
