1

Application of Spectral Decomposition Analysis to In Vivo Quantification of Aluminum

Daria, Cosma 09 1900 (has links)
Aluminum is a non-essential trace element that accumulates in human bone tissue (Nayak, 2002). Its toxic effects are cumulative and result in painful forms of renal osteodystrophy, most notably adynamic bone disease and osteomalacia, but also other forms of disease (Yokel, 2001; Cannata-Andia, 2002). Presently, histological tests of bone biopsies are the only approach for the diagnosis of aluminum-related pathologies (Malluche, 2002). Neutron Activation Analysis was proposed as an alternative method for quantifying aluminum. The Trace Element Group at McMaster University has developed an in vivo procedure for detecting aluminum levels in the bones of the hand, exploiting an accelerator-based approach. A minimum detectable limit (MDL) of 1.14 mg of aluminum could be distinguished for a local dose to the hand of 48 mSv (Pejovic-Milic, 2001). For the procedure to be clinically effective, the MDL should be comparable to the levels normally contained in healthy subjects (0.3-0.4 mg Al). Further refinement of the method is therefore necessary. This dissertation presents an improved algorithm for data analysis, based on Spectral Decomposition. Following phantom measurements, a new MDL of (0.7 ± 0.1) mg Al was reached for a local dose of (20 ± 1) mSv, representing an improvement by a factor of 1.60 ± 0.04. In addition, a time-dependent variant of this algorithm was proposed. The study also addresses the feasibility of a new data acquisition technique, the electronic rejection of coincident events detected by the NaI(Tl) system. It is expected that the application of this technique, together with Spectral Decomposition Analysis, would provide an MDL acceptable for the method to be valuable in a clinical setting. / Thesis / Master of Science (MS)
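As a hedged illustration of what a spectral decomposition of a counting spectrum can look like (not the algorithm developed in this dissertation), the sketch below fits a simulated NAA spectrum as a non-negative combination of an assumed aluminum peak shape and an assumed background shape; the channel count, peak position, width and count levels are all hypothetical.

```python
# Illustrative sketch only -- not the dissertation's algorithm. Assumes the
# measured spectrum can be modelled as a non-negative linear combination of
# known component spectra (an aluminum signal shape plus a background shape).
import numpy as np
from scipy.optimize import nnls

def decompose_spectrum(measured, components):
    """Fit a measured spectrum (counts per channel) as a non-negative
    combination of component spectra; returns the weights and the residual."""
    A = np.column_stack(components)        # channels x components
    weights, residual = nnls(A, measured)  # non-negative least squares
    return weights, residual

# Hypothetical 1024-channel spectrum: Gaussian Al peak plus smooth background
channels = np.arange(1024)
al_shape = np.exp(-0.5 * ((channels - 600) / 8.0) ** 2)   # assumed Al peak shape
bkg_shape = np.exp(-channels / 400.0)                     # assumed background shape
truth = 50.0 * al_shape + 2000.0 * bkg_shape
measured = np.random.poisson(truth).astype(float)         # Poisson counting noise

weights, residual = decompose_spectrum(measured, [al_shape, bkg_shape])
print("fitted Al amplitude:", weights[0], "residual norm:", residual)
```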
2

An experimental and numerical study of an automotive cooling module

Rynell, Anders January 2017 (has links)
Heavy vehicles are major emitters of noise. Especially at idle or low vehicle speeds, a large portion of the noise emanates from the fan that forces the flow through the cooling module. The aim of this work is to investigate and reveal aerodynamic and acoustic installation effects linked to the cooling package. This calls for a multidisciplinary approach involving examination of the flow field, sound generation and sound propagation. The work includes two main parts: an experimental part and a numerical part. The cooling module used throughout this work, named the reduced cooling module, primarily includes a radiator, a shroud, a fan and a hydraulic engine, to simplify the aeroacoustic analysis. The experimental part comprises measurements of the sound emanating from the cooling package. A new approach to the spectral decomposition method is developed, allowing the fan sound power spectrum to be formulated as the product of a source part and a system part that scale with the Strouhal number and the Helmholtz number. In addition, a separate determination of the transmission loss of the radiator is performed; the impact of the radiator on the transmitted noise was found to be negligible. The numerical part compares two aeroacoustic studies: a configuration in which the fan is forced to operate at a fixed operating point and for which measured flow and turbulence statistics are available, and the reduced cooling module. A hybrid turbulence modeling technique, IDDES, is adopted for the flow simulations. The sound propagation is calculated with the Ffowcs Williams-Hawkings acoustic analogy, assuming free-field propagation, and with a finite element solver in the frequency domain to capture the installation effects. The simulated sound pressure level (SPL) conforms to the measured SPL, and the blade response to the turbulent inflow and to the tip resolution produces noise whose spectral shape is modified in accordance with earlier published experimental findings. Furthermore, the influence of an upstream radiator in close contact with the fan on the flow and sound fields is investigated. Here, the simulated aeroacoustic characteristics were found to change similarly to the acoustic measurements with and without the radiator.
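As a purely illustrative aside (not the decomposition developed in the thesis), the snippet below evaluates the two non-dimensional frequencies the abstract refers to, the Strouhal number St = fD/U and the Helmholtz number He = fD/c; the fan diameter, speeds and octave-band centre frequencies are assumed values.

```python
# Hypothetical numbers, used only to show the non-dimensional frequencies
# (Strouhal and Helmholtz numbers) against which the source and system parts
# of the fan spectrum are said to scale; not data from the thesis.
import numpy as np

D = 0.65                                              # assumed fan diameter [m]
c = 343.0                                             # speed of sound in air [m/s]
f = np.array([125.0, 250.0, 500.0, 1000.0, 2000.0])   # octave-band centres [Hz]

for rpm in (1200.0, 1800.0):                          # assumed fan speeds
    U = np.pi * D * rpm / 60.0                        # blade tip speed [m/s]
    St = f * D / U                                    # Strouhal number (source part)
    He = f * D / c                                    # Helmholtz number (system part)
    print(f"{rpm:.0f} rpm  St: {np.round(St, 2)}  He: {np.round(He, 2)}")
```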
3

Spectral Decomposition Using S-transform for Hydrocarbon Detection and Filtering

Zhang, Zhao 2011 August 1900 (has links)
Spectral decomposition is a modern tool that utilizes seismic data to generate additional useful information in seismic exploration for hydrocarbon detection, lithology identification, stratigraphic interpretation, filtering and other purposes. Different spectral decomposition methods and their applications to seismic data have been reported and investigated in past years. Many of these methods do not consider the non-stationary features of seismic data and are therefore not likely to give satisfactory results. The S-transform, developed in recent years, is able to provide time-dependent frequency analysis while maintaining a direct relationship with the Fourier spectrum, a unique property that other methods of spectral decomposition may not have. In this thesis, I investigated the feasibility and efficiency of using the S-transform for hydrocarbon detection and time-varying surface-wave filtering. The S-transform was first applied to two seismic data sets, from a clastic reservoir in the North Sea and a deep carbonate reservoir in the Sichuan Basin, China. Results from both cases demonstrated that the S-transform decomposition technique can detect hydrocarbon zones effectively and helps to build the relationships between lithology changes and high-frequency variation and between hydrocarbon occurrence and low-frequency anomaly. However, its time resolution needs to be improved. In the second part of my thesis, I used the S-transform to develop a novel time-frequency-wavenumber-domain (T-F-K) filtering method to separate surface waves from reflected waves in seismic records. The S-T-F-K filtering proposed here can be used to analyze surface waves on separate f-k panels at different times. The method was tested using hydrophone records of four-component seismic data acquired in the shallow-water Persian Gulf, where the average water depth is about 10 m and Scholte waves and other surface waves are persistently strong. Results showed that this new S-T-F-K method is able to separate and attenuate surface waves and to greatly improve the quality of seismic reflection signals that are otherwise completely concealed by the aliased surface waves.
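For readers unfamiliar with the S-transform, here is a minimal, brute-force sketch of the Stockwell transform applied to a toy non-stationary signal; the sampling rate, frequency grid and test signal are hypothetical, and the thesis implementation is certainly more efficient than this direct evaluation.

```python
# Brute-force Stockwell (S-)transform: a frequency-dependent Gaussian window
# (standard deviation 1/|f|) combined with a Fourier kernel, giving a
# time-frequency decomposition that keeps a direct link to the Fourier spectrum.
import numpy as np

def s_transform(x, fs, freqs):
    """Return S(f, t) for signal x sampled at fs, at the requested frequencies."""
    n = len(x)
    t = np.arange(n) / fs
    S = np.zeros((len(freqs), n), dtype=complex)
    for i, f in enumerate(freqs):
        if f == 0.0:
            S[i, :] = np.mean(x)                      # zero-frequency row is the mean
            continue
        kernel = x * np.exp(-2j * np.pi * f * t)      # Fourier kernel, once per frequency
        for j, tau in enumerate(t):
            window = (abs(f) / np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * (tau - t) ** 2 * f ** 2)
            S[i, j] = np.sum(kernel * window) / fs
    return S

# Toy non-stationary signal: a 30 Hz burst followed by an 80 Hz burst
fs = 250.0
t = np.arange(0.0, 2.0, 1.0 / fs)
x = np.where(t < 1.0, np.sin(2 * np.pi * 30 * t), np.sin(2 * np.pi * 80 * t))
S = s_transform(x, fs, freqs=np.arange(5.0, 100.0, 5.0))
print("|S| shape (freqs x samples):", np.abs(S).shape)
```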
4

Pricing Basket Default Swap with Spectral Decomposition

Chen, Pei-kang 01 June 2007 (has links)
Cholesky decomposition is usually used to deal with the correlation among a financial product's underlying assets. However, Cholesky decomposition inherently requires that all eigenvalues of the correlation matrix be positive; it therefore cannot work well when the number of underlying assets is high, where the estimated correlation matrix often fails to be positive definite. This thesis takes a different approach, called spectral decomposition, in an attempt to solve the problem. It turns out that although spectral decomposition can cope with matrices that violate the all-positive-eigenvalue requirement, the decomposition error becomes larger as the number of underlying assets grows. Thus, although spectral decomposition does offer some help, it works better when the number of underlying assets is not very large.
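A minimal numerical sketch of this idea, assuming a deliberately non-positive-definite 3×3 correlation matrix chosen for illustration (not taken from the thesis): the spectral decomposition C = VΛVᵀ is computed, negative eigenvalues are clipped to zero, and the resulting factor B with BBᵀ ≈ C is used to generate correlated normals for Monte Carlo simulation.

```python
# Spectral (eigen-)decomposition as a fallback when Cholesky fails: clip
# negative eigenvalues to zero and accept the resulting approximation error.
import numpy as np

def spectral_factor(corr):
    """Return (B, error) with B @ B.T approximating corr after clipping
    negative eigenvalues of the spectral decomposition to zero."""
    eigval, eigvec = np.linalg.eigh(corr)
    clipped = np.clip(eigval, 0.0, None)
    B = eigvec @ np.diag(np.sqrt(clipped))
    error = np.linalg.norm(eigvec @ np.diag(clipped) @ eigvec.T - corr)
    return B, error

# Hypothetical, internally inconsistent correlation matrix (one negative eigenvalue)
corr = np.array([[1.0, 0.9, 0.2],
                 [0.9, 1.0, 0.9],
                 [0.2, 0.9, 1.0]])
B, err = spectral_factor(corr)                    # np.linalg.cholesky(corr) would fail here
z = np.random.standard_normal((3, 100000))
correlated = B @ z                                # correlated normals for Monte Carlo
print("approximation error:", round(err, 4))
print("realised correlation:\n", np.round(np.corrcoef(correlated), 2))
```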
5

Contributions to watermarking of 3D meshes

Cayre, François 09 December 2003 (has links)
We present two watermarking schemes for 3D meshes: (1) fragile watermarking based on geometrical invariants, for authentication and integrity purposes; and (2) robust watermarking in the geometrical spectral domain.
6

Frequency dependent seismic reflection analysis: a path to new direct hydrocarbon indicators for deep water reservoirs

Yoo, Seung Chul 02 June 2009 (has links)
To better study frequency-related effects such as attenuation and tuning, we developed a frequency-dependent seismic reflection analysis. Comprehensive tests on full waveform synthetics and observations from the Teal South ocean bottom seismic (OBS) data set confirmed that normal moveout (NMO) stretch can severely distort both frequency and amplitude information in shallow events and far-offset traces. In synthetic tests, our algorithm recovered amplitude and frequency information accurately. This simple but robust target-oriented NMO stretch correction scheme can be used on top of an existing seismic processing flow for further analyses. By combining the NMO stretch correction, spectral decomposition, and crossplots of amplitude versus offset (AVO) attributes, we tested the frequency-dependent workflow on the Teal South and Ursa field data sets for improved reservoir characterization. As expected from NMO stretch characteristics, low frequencies were less affected while mid and high frequency ranges were affected considerably. In seismic attribute analysis, the AVO crossplots from spectrally decomposed prestack data confirmed the improved accuracy and effectiveness of our workflow in the mid and high frequency regions. To overcome poor spectral decomposition results due to low signal-to-noise ratio (S/N) in the Teal South application, we also implemented a substack scheme that stacks adjacent traces to increase the S/N while reducing the amount of data to process and increasing the accuracy of the spectral decomposition step. Synthetic tests verified the effectiveness of this additional step. An application to the Ursa deep-water data set in the Gulf of Mexico showed significant improvement in the high-frequency data while correcting biased low-frequency information.
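A small sketch of the substack idea mentioned above, assuming a toy gather of identical traces contaminated by uncorrelated noise (the group size, trace count and noise level are hypothetical, not the thesis parameters): averaging each group of adjacent traces raises the S/N by roughly the square root of the group size before spectral decomposition is applied.

```python
# Toy "substack" example: average groups of adjacent traces to boost S/N
# before spectral decomposition; not the actual Teal South processing flow.
import numpy as np

def substack(gather, group=5):
    """Average every `group` adjacent traces of a (traces x samples) gather."""
    ntr, ns = gather.shape
    keep = (ntr // group) * group                 # drop any incomplete last group
    return gather[:keep].reshape(-1, group, ns).mean(axis=1)

# Hypothetical gather: 60 traces sharing one 30 Hz wavelet, plus random noise
samples = np.arange(500)
signal = np.sin(2 * np.pi * 30 * samples / 1000.0)
gather = signal + 0.8 * np.random.standard_normal((60, 500))
stacked = substack(gather, group=5)

snr_before = np.abs(signal).mean() / (gather - signal).std()
snr_after = np.abs(signal).mean() / (stacked - signal).std()
print("traces:", gather.shape[0], "->", stacked.shape[0],
      " S/N gain:", round(snr_after / snr_before, 2))
```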
7

Investigation of time-lapse 4D seismic tuning and spectral responses to CO₂-EOR for enhanced characterization and monitoring of a thin carbonate reservoir

Krehel, Austin January 1900 (has links)
Master of Science / Department of Geology / Abdelmoneam Raef / Advancements, applications, and success of time-lapse (4D) seismic monitoring of carbonate reservoirs are limited by these systems’ inherent heterogeneity and low compressibility relative to siliciclastic systems. To contribute to the advancement of 4D seismic monitoring in carbonates, an investigation of the amplitude envelope across frequency sub-bands was conducted on a high-resolution 4D seismic data set acquired at fine temporal intervals between a baseline and eight monitor surveys to track CO₂-EOR from 2003-2005 in the Hall-Gurney Field, Kansas. The shallow (approximately 900 m) Plattsburg ‘C Zone’ target reservoir is an oomoldic limestone within the Lansing-Kansas City (LKC) supergroup, deposited as a sequence of high-frequency, stacked cyclothems. The LKC reservoir fluctuates around thin-bed thickness within the well pattern region and is susceptible to amplitude tuning effects, in which CO₂ replacement of the initial reservoir fluid generates a complex tuning phenomenon, with reduction and brightening of amplitude at reservoir thicknesses above and below thin-bed thickness, respectively. A thorough analysis of horizon snapping criteria and parameters was conducted to understand the sensitivity of these autonomous operations and produce a robust horizon tracking workflow to extend the Baseline Survey horizon data to subsequent Monitor Surveys. This 4D seismic horizon tracking workflow expedited the horizon tracking process across monitor surveys while following a quantitative, repeatable approach in tracking the LKC and maintaining geologic integrity despite low signal-to-noise ratio (SNR) data and misties between surveys. Analysis of amplitude envelope data across frequency sub-bands (30-80 Hz) following spectral decomposition identified geometric features of multiple LKC shoal bodies at the reservoir interval. In corroboration with prior geologic interpretation, shoal boundaries, zones of overlap between stacked shoals, thickness variation, and lateral changes in lithofacies were delineated in the Baseline Survey, which enhanced the detail of these features’ extent beyond the capacity offered by well log data. Lineaments dominated by low-frequency anomalies within regions of adjacent shoals’ boundaries suggest thicker zones of potential shoal overlap. Band-to-band frequency analysis reveals relative thickness variation. Spectral decomposition of the amplitude envelope was analyzed between the Baseline and Monitor Surveys to identify spectral and tuning changes and thereby monitor CO₂ migration. Ambiguity of CO₂ effects on the tuning phenomena was observed in zones of known CO₂ fluid replacement. A series of lineaments highlighted by amplitude brightening from the Baseline to Monitor Surveys is observed, which competes with a more spatially extensive effect of subtle amplitude dimming. These lineaments are suggestive of features below tuning thickness, such as stratigraphic structures of shoals, fractures, and/or thin shoal edges, which are highlighted by an increased apparent thickness and onset of tuning from CO₂. Detailed analysis of these 4D seismic data across frequency sub-bands provides enhanced interpretation of shoal geometry, position, and overlap; identification of lateral changes in lithofacies suggestive of barriers and conduits; insight into relative thickness variation; and the ability of CO₂ tuning ambiguity to highlight zones below tuning thickness and improve reservoir characterization. These results suggest improved efficiency of CO₂-EOR reservoir surveillance in carbonates, with implications for ensuring optimal field planning and flood performance for analogous targets.
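A hedged sketch of one way to compute the amplitude envelope of a trace within a frequency sub-band (band-pass filter followed by the magnitude of the analytic signal); the filter design, sampling rate and toy trace below are illustrative assumptions, not the workflow used in this thesis.

```python
# Sub-band amplitude envelope: band-pass filter a trace, then take the
# magnitude of its analytic (Hilbert) signal. All parameters are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def subband_envelope(trace, fs, f_lo, f_hi, order=4):
    b, a = butter(order, [f_lo / (fs / 2.0), f_hi / (fs / 2.0)], btype="band")
    banded = filtfilt(b, a, trace)                 # zero-phase band-pass
    return np.abs(hilbert(banded))                 # amplitude envelope

fs = 500.0                                         # assumed sampling rate [Hz]
t = np.arange(0.0, 2.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 40 * t) * np.exp(-((t - 1.0) / 0.2) ** 2)  # toy "reflection"
env = subband_envelope(trace, fs, f_lo=30.0, f_hi=50.0)
print("peak sub-band envelope:", round(env.max(), 3))
```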
8

Perception and re-synchronization issues for the watermarking of 3D shapes

Rondao Alface, Patrice 26 October 2006 (has links)
Digital watermarking is the art of embedding secret messages in multimedia content in order to protect its intellectual property. While the watermarking of images, audio and video is reaching maturity, the watermarking of 3D virtual objects is still a technology in its infancy. In this thesis, we focus on two main issues. The first is the perception of the distortions caused by the watermarking process or by attacks on the surface of a 3D model. The second concerns the development of techniques able to retrieve a watermark without the availability of the original data and after common manipulations and attacks. Since imperceptibility is a strong requirement, assessing the visual perception of the distortions that a 3D model undergoes in the watermarking pipeline is a key issue. In this thesis, we propose an image-based metric that relies on the comparison of 2D views with a mutual information criterion. A psychovisual experiment has validated the results of this metric for the most common watermarking attacks. The other issue this thesis deals with is the blind and robust watermarking of 3D shapes. In this context, three different watermarking schemes are proposed. These schemes differ in the classes of 3D watermarking attacks they are able to resist. The first scheme is based on the extension of spectral decomposition to 3D models. This approach leads to robustness against imperceptible geometric deformations; its weakness is mainly related to resampling or cropping attacks. The second scheme extends the first to resampling by making use of the automatic multiscale detection of robust umbilical points. The third scheme then addresses the cropping attack by detecting robust prong feature points to locally embed a watermark in the spatial domain.
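To make the "spectral decomposition of 3D models" concrete, here is a toy sketch (not the scheme proposed in the thesis) that builds the combinatorial Laplacian of a tetrahedral mesh, expresses the vertex coordinates in the Laplacian eigenbasis, and perturbs one low-frequency coefficient as a stand-in for a watermark; the mesh and the perturbation strength are illustrative.

```python
# Toy spectral-domain embedding on a mesh: Laplacian eigenbasis of the
# connectivity graph, project geometry, perturb a low-frequency coefficient.
import numpy as np

def graph_laplacian(n_vertices, edges):
    L = np.zeros((n_vertices, n_vertices))
    for i, j in edges:
        L[i, j] -= 1.0
        L[j, i] -= 1.0
        L[i, i] += 1.0
        L[j, j] += 1.0
    return L

# Toy mesh: a tetrahedron (4 vertices, 6 edges)
vertices = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]

L = graph_laplacian(len(vertices), edges)
eigval, eigvec = np.linalg.eigh(L)            # spectral basis of the mesh
coeffs = eigvec.T @ vertices                  # geometry in the spectral domain
coeffs[1] += 1e-3                             # toy "watermark" on a low-frequency mode
watermarked = eigvec @ coeffs                 # back to vertex coordinates
print("max vertex displacement:", np.abs(watermarked - vertices).max())
```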
9

The Construction and Application of Hybrid Factor Model

Tao, Yun-jhen 28 July 2010 (has links)
A multifactor model is used to explain asset returns and risk, and its explanatory power depends on the common factors the model uses. Researchers strive to find reasonable factors to enhance the multifactor model's efficiency; however, there are still unknown factors to be discovered. Miller (2006) presents a general concept and structure for a hybrid factor model. This study follows the idea of Miller (2006) and aims to build a complete workflow for constructing a hybrid factor model based on a fundamental factor model and a statistical factor model. We also apply the hybrid factor model to the Taiwan stock market. We assume that a fundamental factor model is already developed, and this study therefore focuses on building the second stage, the statistical factor model. Principal component analysis is used to form the statistical factors, and spectral decomposition is used to prepare the data for principal component analysis. These methods are applied to stocks on the Taiwan Stock Exchange over the period January 1, 2000 to December 31, 2009. This study presents a complete construction flow for hybrid factor models and further confirms that a hybrid factor model is able to find missing factors in a developing market such as Taiwan's stock market. The study also finds that the missing factors might be a market factor and an extensive electronics industry factor.
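A minimal sketch of the second-stage statistical factor model described above, assuming a matrix of hypothetical daily stock returns: the spectral decomposition of the return covariance matrix yields principal components, which serve as candidate statistical factors on top of the fundamental model. The data, number of stocks, and number of retained factors are all illustrative.

```python
# Statistical factors via spectral decomposition of the return covariance
# matrix (i.e. PCA); toy data, not Taiwan Stock Exchange returns.
import numpy as np

np.random.seed(0)
returns = 0.02 * np.random.standard_normal((250, 40))     # 250 days x 40 stocks (toy)
demeaned = returns - returns.mean(axis=0)

cov = np.cov(demeaned, rowvar=False)                      # 40 x 40 covariance matrix
eigval, eigvec = np.linalg.eigh(cov)                      # spectral decomposition
order = np.argsort(eigval)[::-1]                          # eigenvalues in descending order

k = 3                                                     # retain 3 statistical factors
loadings = eigvec[:, order[:k]]                           # per-stock factor loadings
factors = demeaned @ loadings                             # factor return time series
explained = eigval[order[:k]].sum() / eigval.sum()
print("variance explained by", k, "factors:", round(float(explained), 3))
```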
