51

亞洲四小龍匯率報酬率尾部參數變化之探討 / An investigation of structural change in the tail parameters of the exchange rate returns of the four Asian Tigers

薛承志 Unknown Date (has links)
Financial data typically exhibit high kurtosis and heavy tails. Extreme Value Theory (EVT) focuses on the probability of extreme events in the tails, characterizing the probability distribution of tail extremes so as to capture the heavy-tail behaviour of financial data; the tail index α is estimated to gauge how heavy or thin the tail distribution is. Estimation of α usually assumes that it is a stable value that does not change over time. Within a given sample period, however, major events such as financial crises or institutional changes may increase or decrease the probability of tail extremes, so the α estimated over that period should not be assumed constant. This thesis tests the hypothesis of structural change in the tail parameters of the exchange rate data of the four Asian Tigers and identifies the dates at which such changes occur. The empirical results show that between 1993 and 2004 the tail parameters of the four Asian Tigers' exchange rate returns did undergo structural change. For risk managers, the implication is that the tail index α should be treated as time-varying: estimation should avoid likely change points, or a structural-change test should first be run on the intended sample period, so that α can be estimated more accurately.
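The tail index α discussed above is commonly estimated with the Hill estimator. The sketch below is a minimal illustration, not the thesis's own estimation or change-point procedure: it recovers α ≈ 3 from synthetic Pareto data, and the sample and the choice of k = 2000 upper order statistics are assumptions of the example.

```python
import numpy as np

def hill_tail_index(returns, k):
    """Hill estimator of the tail index alpha from the k largest observations:
    alpha_hat = 1 / mean(log(X_(n-i+1) / X_(n-k))), i = 1..k,
    where X_(1) <= ... <= X_(n) are the order statistics."""
    x = np.sort(np.abs(np.asarray(returns, dtype=float)))
    tail = x[-k:]                              # k largest observations
    threshold = x[-k - 1]                      # the (k+1)-th largest value
    gamma = np.mean(np.log(tail / threshold))  # estimates 1/alpha
    return 1.0 / gamma

# Heavy-tailed sample: classical Pareto draws with alpha = 3,
# so the estimate should land near 3.
rng = np.random.default_rng(0)
sample = rng.pareto(3.0, size=100_000) + 1.0
alpha_hat = hill_tail_index(sample, k=2_000)
```

A structural-change test of the kind the thesis performs would compare such estimates across candidate break dates rather than over the full sample at once.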
52

Characterization of Impulse Noise and Hazard Analysis of Impulse Noise Induced Hearing Loss using AHAAH Modeling

Wu, Qing 01 August 2014 (has links)
Millions of people across the world suffer from noise-induced hearing loss (NIHL), especially under working conditions involving either continuous Gaussian or non-Gaussian noise that can impair hearing. Impulse noise is a typical non-Gaussian exposure in military and industrial settings and causes severe hearing loss. This study focuses on characterizing impulse noise using digital signal analysis and on predicting the auditory hazard of impulse-noise-induced hearing loss with the Auditory Hazard Assessment Algorithm for Humans (AHAAH) model. A digital noise exposure system was developed to produce impulse noise with peak sound pressure levels (SPL) up to 160 dB. The impulse noise generated by the system was characterized and analyzed in both the time and frequency domains. Furthermore, the effects of the key parameters of impulse noise on the auditory risk unit (ARU) were investigated in the AHAAH model using both simulated and experimentally measured impulse noise signals. The results show that ARUs increased monotonically with increasing peak pressure (both P+ and P-). As the time duration increased, ARUs first increased and then decreased, peaking at about t = 0.2 ms (for both t+ and t-). In addition, the auditory hazard of the experimentally measured impulse noise signals demonstrated a monotonically increasing relationship between ARUs and system voltage.
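As a hedged illustration of the kind of impulse signal discussed, the sketch below generates an idealized Friedlander blast wave and checks its peak SPL. The 160 dB peak and t+ = 0.2 ms echo values in the abstract, but the waveform model, sampling rate and window length are assumptions of the example, not the study's actual exposure system.

```python
import numpy as np

P_REF = 20e-6  # reference pressure in air, Pa

def friedlander(t, p_peak, t_plus):
    """Idealized Friedlander blast wave: instantaneous rise to p_peak,
    then an exponential decay that crosses zero at t = t_plus."""
    return p_peak * (1.0 - t / t_plus) * np.exp(-t / t_plus)

def peak_spl_db(p):
    """Peak sound pressure level in dB re 20 uPa."""
    return 20.0 * np.log10(np.max(np.abs(p)) / P_REF)

fs = 500_000                        # 500 kHz sampling of a sub-ms impulse
t = np.arange(0.0, 0.005, 1.0 / fs)  # 5 ms analysis window
p_peak = P_REF * 10 ** (160 / 20)   # 160 dB peak SPL -> 2000 Pa
p = friedlander(t, p_peak, t_plus=0.2e-3)
spl = peak_spl_db(p)                # recovers 160 dB
```

Time- and frequency-domain characterization, as in the study, would then proceed on such a pressure trace (e.g. via its spectrum and A-duration).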
53

Detecção de falhas em rolamentos de máquinas rotativas utilizando técnicas de processamentos de sinais / Bearing fault detection in rotating machines using signal processing techniques

Santos, Rodolfo de Sousa [UNESP] 21 July 2017 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) / Vibration signals from rotating machines carry the machine's dynamic information, and their analysis is of great importance for condition monitoring and machine diagnostics. Several analysis methods have been employed to diagnose faults in machine components such as gears and bearings. This work presents an analysis of bearing fault detection in rotating machines, using the Case Western Reserve University (CWRU) bearing database and the FEG/UNESP database. Its main objective was to implement advanced techniques to identify and characterize the faults generated in bearings, with a view to improving condition-based maintenance. Initially, implementation and simulation were carried out on the CWRU database in MATLAB using the high-frequency resonance technique (HFRT), with satisfactory results; this methodology is limited, however, since it applies only to stationary regimes, and in some cases it failed to identify the frequencies that characterize defects in the bearing races. The Short-Time Fourier Transform (STFT) was then applied. It proved considerably more sensitive to the impacts generated in the races, and with it the characteristic defect frequencies could be identified. For comparison, the wavelet technique combined with the envelope technique was used: the signal of a defective bearing was decomposed with the Daubechies wavelet of order 4 (db4), and the decomposition level with the highest RMS value was selected, since it offers the best conditions for applying the method. The same analysis was applied to the FEG/UNESP database. The wavelet technique combined with HFRT demonstrated better capability than HFRT and STFT alone. Finally, spectral kurtosis combined with the envelope technique provided the most accurate and satisfactory results, since it determined the resonance region automatically and consequently improved the characterization of the defect frequencies observed in the bearings of the rotating machine experiments.
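The envelope (HFRT-style) analysis described can be sketched as follows: demodulate a resonance-carried impact train and read the defect frequency off the envelope spectrum. The 100 Hz defect rate, 3 kHz resonance, decay constant and noise level are invented for the illustration; this is not the thesis's MATLAB implementation.

```python
import numpy as np

def envelope(x):
    """Magnitude of the analytic signal, computed via the FFT
    (for even-length input, equivalent to scipy.signal.hilbert)."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0          # valid because n is even here
    return np.abs(np.fft.ifft(X * h))

fs = 20_000
t = np.arange(0.0, 1.0, 1.0 / fs)
bpfo = 100.0     # hypothetical outer-race defect frequency, Hz
f_res = 3_000.0  # structural resonance excited by each impact

# Impact train: decaying bursts of the resonance repeating at the defect rate.
x = np.zeros_like(t)
for t0 in np.arange(0.0, 1.0, 1.0 / bpfo):
    m = t >= t0
    x[m] += np.exp(-800.0 * (t[m] - t0)) * np.sin(2 * np.pi * f_res * (t[m] - t0))
x += 0.1 * np.random.default_rng(1).standard_normal(t.size)

# Envelope spectrum: the defect frequency appears as the dominant line.
env = envelope(x)
spec = np.abs(np.fft.rfft(env - env.mean()))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
defect_freq = freqs[np.argmax(spec)]    # near 100 Hz
```

Spectral kurtosis, which the thesis found most effective, would choose the demodulation band automatically instead of assuming the resonance is known.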
54

Implikovaná volatilita a vyšší momenty rizikově neutrálního rozdělení jako předstihové indikátory realizované volatility / Implied volatility and higher risk neutral moments: predictive ability

Hanzal, Martin January 2017 (has links)
Implied volatility obtained from market option prices is widely regarded as an efficient predictor of future realised volatility; it can be thought of as the market's expectation of future realised volatility. With respect to expectations, we distinguish two kinds of volatility-changing events: scheduled events (such as information releases) and unscheduled events. We propose a method for testing the information content of option-implied risk-neutral moments prior to volatility-changing events. Using the method introduced by Bakshi, Kapadia & Madan (2003), we extract implied volatility, skewness and kurtosis from S&P 500 option market prices and apply the proposed method in four case studies. Two concern scheduled events, the 2016 United Kingdom European Union membership referendum and the 2016 United States presidential election; two concern unscheduled events, the flash crash of August 24, 2015 and the flash crash of October 15, 2014. Implied volatility indicates a rise in future realised volatility prior to both scheduled events, and we find a significant rise in implied kurtosis during the last three days before the 2016 presidential election. Prior to the unscheduled events, we find no evidence of implied moments indicating a rise in future realised volatility.
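The Bakshi, Kapadia & Madan (2003) extraction the abstract relies on can be sketched numerically. As a sanity check, option prices generated in a Black-Scholes world (normal log-returns) should return the input volatility, zero skewness and kurtosis 3; the strike grid and the parameters below are assumptions of the example, not the thesis's data.

```python
import numpy as np
from math import erf, exp, log, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, tau):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * sqrt(tau))
    d2 = d1 - sigma * sqrt(tau)
    return S * norm_cdf(d1) - K * exp(-r * tau) * norm_cdf(d2)

def bs_put(S, K, r, sigma, tau):
    # Put-call parity.
    return bs_call(S, K, r, sigma, tau) - S + K * exp(-r * tau)

def trapezoid(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def bkm_moments(S, r, tau, K, calls, puts):
    """Model-free implied volatility, skewness and kurtosis of the
    log-return, following Bakshi, Kapadia & Madan (2003)."""
    k = np.log(K / S)
    otm = np.where(K >= S, calls, puts)     # out-of-the-money prices
    V = trapezoid(2.0 * (1.0 - k) / K**2 * otm, K)          # variance contract
    W = trapezoid((6.0 * k - 3.0 * k**2) / K**2 * otm, K)   # cubic contract
    X = trapezoid((12.0 * k**2 - 4.0 * k**3) / K**2 * otm, K)  # quartic
    e = exp(r * tau)
    mu = e - 1.0 - e * V / 2.0 - e * W / 6.0 - e * X / 24.0
    var = e * V - mu**2
    skew = (e * W - 3.0 * mu * e * V + 2.0 * mu**3) / var**1.5
    kurt = (e * X - 4.0 * mu * e * W + 6.0 * mu**2 * e * V - 3.0 * mu**4) / var**2
    return sqrt(var / tau), skew, kurt

S0, r, sigma, tau = 100.0, 0.01, 0.2, 0.25
K = np.arange(30.0, 301.0, 0.5)
calls = np.array([bs_call(S0, Ki, r, sigma, tau) for Ki in K])
puts = np.array([bs_put(S0, Ki, r, sigma, tau) for Ki in K])
vol, skew, kurt = bkm_moments(S0, r, tau, K, calls, puts)   # ~0.2, ~0, ~3
```

On real S&P 500 quotes the strike grid is sparse and truncated, so interpolation and extrapolation of implied volatilities usually precede the integration.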
55

Développement d'une nouvelle technique de pointé automatique pour les données de sismique réfraction / Development of a new adaptive algorithm for automatic picking of seismic refraction data

Khalaf, Amin 15 February 2016 (has links)
Accurate picking of first seismic arrival times plays an important role in many seismic imaging studies, particularly in seismic tomography and in the monitoring of reservoirs or aquifers. A new adaptive algorithm has been developed that combines three approaches: multi-nested windows, higher-order statistics and the Akaike Information Criterion. It has the advantage of integrating several properties (energy, Gaussianity and stationarity) that reveal the presence of first arrivals. Since picking uncertainties can, in some cases, be as important as the picks themselves, the algorithm also automatically provides an estimate of the associated errors. Its accuracy and reliability are assessed by comparing its results with manual picks and with other automatic pickers. The algorithm is nearly parameter-free, straightforward to implement and demands few computational resources. However, high noise levels in the data can degrade its performance. To improve the signal-to-noise ratio of first arrivals, and thereby increase their detectability, double stacking in the time domain is proposed. This process rests on a key principle: the local similarity of the stacked traces. The results demonstrate the value of applying this stacking before automatic picking.
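Of the three combined approaches, the Akaike Information Criterion picker is the simplest to sketch: the AIC of splitting a trace into a noise segment and a signal segment is minimized at the onset of the first arrival. This toy version, with a synthetic trace and an invented onset, is illustrative only; it is not the thesis's multi-property algorithm.

```python
import numpy as np

def aic_pick(x):
    """AIC picker: AIC(i) = i*log(var(x[:i])) + (n-i-1)*log(var(x[i:]))
    is minimal where the trace switches from noise to signal."""
    x = np.asarray(x, dtype=float)
    n = x.size
    idx = np.arange(2, n - 2)               # avoid degenerate variances
    aic = np.array([i * np.log(np.var(x[:i]) + 1e-12)
                    + (n - i - 1) * np.log(np.var(x[i:]) + 1e-12)
                    for i in idx])
    return int(idx[np.argmin(aic)])

# Synthetic trace: 500 samples of weak noise, then an oscillatory arrival.
rng = np.random.default_rng(2)
trace = 0.05 * rng.standard_normal(1000)
onset = 500
trace[onset:] += np.sin(2 * np.pi * 0.05 * np.arange(500))
pick = aic_pick(trace)   # close to sample 500
```

In practice this criterion is applied inside nested windows around a coarse first guess, which is what keeps the full algorithm nearly parameter-free.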
56

Výpočet pokročilých difusních parametrů šedé hmoty mozku z DKI MRI obrazů / Calculation of advanced diffusion parameters in brain grey matter from DKI MRI images

Pánková, Olga January 2019 (has links)
This thesis, Calculation of advanced diffusion parameters in brain grey matter from DKI MRI images, deals with the processing of diffusion-weighted images from diffusion kurtosis imaging (DKI). It reviews the literature on the principles of diffusion, the influence of diffusion on MRI, the calculation of DTI and DKI parameters, and the clinical application of diffusion-weighted maps, with a focus on grey matter, and it surveys software tools for pre-processing and processing DTI and DKI data. The practical part consists of two sections. First, two different software packages were used to calculate maps of diffusion parameters, and the diffusion parameters of the substantia nigra were compared between a group of healthy controls and patients with Parkinson's disease; this comparison showed no statistically significant difference. Second, a script for creating diffusion maps in the Diffusional Kurtosis Estimator software was written.
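The DKI parameters mentioned follow the standard signal representation ln S(b) = ln S0 − bD + (1/6)b²D²K, so along a single gradient direction their estimation reduces to a quadratic fit in b. The sketch below, with invented b-values and tissue parameters, is a minimal illustration rather than the software pipeline used in the thesis.

```python
import numpy as np

def fit_dki(bvals, signals):
    """Fit ln S(b) = ln S0 - b*D + (1/6) * b^2 * D^2 * K for one
    gradient direction; returns apparent diffusivity D and kurtosis K."""
    c2, c1, _c0 = np.polyfit(bvals, np.log(signals), 2)
    D = -c1                 # mm^2/s
    K = 6.0 * c2 / D**2     # dimensionless
    return D, K

# Synthetic decay with known D = 1.0e-3 mm^2/s and K = 0.8.
b = np.array([0.0, 500.0, 1000.0, 1500.0, 2000.0, 2500.0])  # s/mm^2
D_true, K_true = 1.0e-3, 0.8
S = np.exp(-b * D_true + (1.0 / 6.0) * b**2 * D_true**2 * K_true)
D_hat, K_hat = fit_dki(b, S)   # recovers D_true and K_true
```

Full DKI fitting estimates a diffusion tensor and a kurtosis tensor over many directions; tools such as the Diffusional Kurtosis Estimator solve that larger constrained problem.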
57

A Study of non-central Skew t Distributions and their Applications in Data Analysis and Change Point Detection.

Hasan, Abeer 26 July 2013 (has links)
No description available.
58

Credit scoring using Logistic regression

Hara Khanam, Iftho January 2023 (has links)
In this thesis, we present the use of logistic regression to develop a credit scoring model from the raw data of 4447 customers of a bank. The customer data comprise 14 independent explanatory variables and 1 default indicator. The objective of this thesis is to identify optimal model coefficients. To clean the data, the raw data set was examined with techniques such as kurtosis and skewness checks and winsorized to eliminate outliers. On the winsorized dataset, logit analysis was applied in two rounds together with multiple statistical tests, which aim to assess the significance of each independent variable and the fitness of the model. The optimal coefficients can then be used to obtain credit scores for new customers from new data and to rank them according to their credit risk.
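The winsorize-then-logit pipeline can be sketched as follows. The synthetic "income" variable, the 5%/95% quantile cutoffs and the Newton-Raphson fitter are assumptions of the example, not the thesis's data or software.

```python
import numpy as np

def winsorize(x, lower=0.05, upper=0.95):
    """Clamp values outside the given quantiles to the quantile values."""
    lo, hi = np.quantile(x, [lower, upper])
    return np.clip(x, lo, hi)

def fit_logit(X, y, iters=50):
    """Logistic regression fitted by Newton-Raphson; returns the
    coefficient vector with the intercept first."""
    Xb = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ beta))
        H = Xb.T @ (Xb * (p * (1.0 - p))[:, None])   # Hessian of the log-lik
        beta += np.linalg.solve(H, Xb.T @ (y - p))   # Newton step
    return beta

# Synthetic "customers": a skewed income variable with outliers and a
# default indicator generated from a known logistic model.
rng = np.random.default_rng(3)
n = 5000
x = winsorize(np.log(rng.lognormal(10.0, 1.0, n)))   # tame the tails
xc = x - x.mean()
p_true = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * xc)))
y = (rng.random(n) < p_true).astype(float)
beta = fit_logit(xc[:, None], y)    # recovers roughly [-1.0, 0.8]
```

Scoring a new customer is then a matter of applying the fitted coefficients and the logistic function to the customer's (winsorized) features.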
59

Long-Term Ambient Noise Statistics in the Gulf of Mexico

Snyder, Mark Alan 15 December 2007 (has links)
Long-term omni-directional ambient noise was collected at several sites in the Gulf of Mexico during 2004 and 2005. The Naval Oceanographic Office deployed bottom-moored Environmental Acoustic Recording System (EARS) buoys approximately 159 nautical miles south of Panama City, Florida, in water depths of 3200 meters, with the hydrophone of each buoy 265 meters above the bottom. The data duration ranged from 10 to 14 months. The buoys were located near a major shipping lane, with an estimated 1.5 to 4.5 ships per day passing nearby. The data were sampled at 2500 Hz and have a bandwidth of 10-1000 Hz. Data are processed in eight 1/3-octave frequency bands, centered from 25 to 950 Hz, and monthly values of the following statistical quantities are computed from the resulting eight time series of noise spectral level: mean, median, standard deviation, skewness, kurtosis and coherence time. Four hurricanes were recorded during the summer of 2004, and they have a major impact on all of the noise statistics. Noise levels at higher frequencies (400-950 Hz) peak during extremely windy months (summer hurricanes and winter storms). Standard deviation is least in the region 100-200 Hz but increases at higher frequencies, especially during periods of high wind variability (summer hurricanes). Skewness is positive from 25-400 Hz and negative from 630-950 Hz. Skewness and kurtosis are greatest near 100 Hz. Coherence time is low in shipping bands and high in weather bands, and it peaks during hurricanes. The noise coherence is also analyzed: the 14-month time series in each 1/3-octave band is highly correlated with the time series of bands ranging from 2 octaves below to 2 octaves above its center frequency. Spatial coherence between hydrophones is also analyzed for hydrophone separations of 2.29, 2.56 and 4.84 km over a 10-month period. The noise field is highly coherent out to the maximum distance studied, 4.84 km.
Additionally, fluctuations of each time series are analyzed to determine time scales of greatest variability. The 14-month data show clearly that variability occurs primarily over three time scales: 7-22 hours (shipping-related), 56-282 hours (2-12 days, weather-related) and over an 8-12 month period.
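A minimal sketch of the band statistics involved: 1/3-octave band edges follow from the center frequency, and sample skewness and kurtosis are the standardized third and fourth moments. The numbers below come from synthetic Gaussian levels, not the EARS data.

```python
import numpy as np

def sample_skewness(x):
    z = (x - x.mean()) / x.std()
    return float(np.mean(z**3))          # 0 for a symmetric distribution

def sample_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return float(np.mean(z**4))          # Gaussian reference value is 3

def third_octave_edges(f_center):
    """Lower and upper edge of the 1/3-octave band centred at f_center."""
    return f_center / 2 ** (1 / 6), f_center * 2 ** (1 / 6)

lo, hi = third_octave_edges(100.0)       # roughly (89.1, 112.2) Hz
rng = np.random.default_rng(4)
levels = 75.0 + 5.0 * rng.standard_normal(10_000)   # synthetic band levels, dB
g_skew = sample_skewness(levels)         # near 0
g_kurt = sample_kurtosis(levels)         # near 3
```

Departures from 0 and 3 in the measured bands are what flag shipping- and weather-driven episodes in the long-term record.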
60

Análise do impacto de perturbações sobre medidas de qualidade de ajuste para modelos de equações estruturais / Analysis of the impact of disturbances over the measures of goodness of fit for structural equation models

Renata Trevisan Brunelli 11 May 2012 (has links)
Structural Equation Modeling (SEM) is a multivariate methodology that allows cause-and-effect relationships and correlations among a set of variables (observed or latent) to be studied simultaneously. The technique has spread to different fields of knowledge in recent years; one of its main applications is the confirmation of theoretical models proposed by the researcher (confirmatory factor analysis). The literature suggests several measures for assessing how well a SEM model fits. However, few studies relate the values of the different measures to possible problems in the sample or in the model specification, that is, information on which measures are (and are not) affected by problems of this nature, and in what way. Such information is important because it helps explain why a model may be judged badly fitted. The objective of this work is to investigate how different disturbances in the sampling, specification and estimation of a SEM model can affect the goodness-of-fit measures, and whether the sample size influences this effect. We also assess how such disturbances affect the parameter estimates, since for some disturbances the parameters remain well estimated even though some measures indicate a bad fit, while on other occasions the measures indicate a good fit even though the parameter estimates are distorted. These investigations are carried out by simulating samples of different sizes for each type of disturbance; SEM models with different specifications are then fitted to these samples, and their parameters are estimated by two methods, Generalized Least Squares and Maximum Likelihood. Knowing these results, a researcher applying SEM can take precautions and, among the available goodness-of-fit measures, choose those best suited to the characteristics of the study.
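Two of the goodness-of-fit measures such a study would examine have simple closed forms once the model's chi-square statistic is known; the sketch below computes the RMSEA (Steiger-Lind) and the CFI. The chi-square and degree-of-freedom values are invented for illustration.

```python
from math import sqrt

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation: misfit per degree of
    freedom, scaled by sample size; values below ~0.05 are
    conventionally read as close fit."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m, df_m, chi2_b, df_b):
    """Comparative Fit Index: the model's improvement over the baseline
    (independence) model; values above ~0.95 suggest good fit."""
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

# A model with chi2 = 85 on df = 40 from n = 300 observations,
# against a baseline with chi2 = 1200 on df = 55:
r = rmsea(85.0, 40, 300)        # ~0.061: borderline fit
c = cfi(85.0, 40, 1200.0, 55)   # ~0.961: acceptable by the CFI
```

Because both indices are functions of the chi-square statistic, disturbances that inflate chi-square without biasing the parameter estimates will move them even when the estimates themselves remain sound, which is exactly the dissociation the study investigates.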
