  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world.
71

Noise Characteristics And Edge-Enhancing Denoisers For The Magnitude Mri Imagery

Alwehebi, Aisha A 01 May 2010 (has links)
Most PDE-based restoration models and their numerical realizations share a common drawback: loss of fine structures. In particular, they often introduce unnecessary numerical dissipation in regions where the image content changes rapidly, such as on edges and textures. This thesis studies the magnitude data/imagery of magnetic resonance imaging (MRI), which follows a Rician distribution. It shows statistically that the noise in the magnitude MRI data is approximately Gaussian, with mean zero and the same variance as in the frequency-domain measurements. Based on this analysis, we introduce a novel partial differential equation (PDE)-based denoising model which can restore fine structures satisfactorily and simultaneously sharpen edges as needed. For an efficient simulation we adopt an incomplete Crank-Nicolson (CN) time-stepping procedure along with the alternating direction implicit (ADI) method. The algorithm is analyzed for stability. It has been numerically verified that the new model reduces the noise satisfactorily, outperforming conventional PDE-based restoration models in 3-4 alternating direction iterations, with the residual (the difference between the original image and the restored image) being nearly edge-free. It has also been verified that the model can perform edge enhancement effectively during the denoising of the magnitude MRI imagery. Numerical examples are provided to support these claims.
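The Rician-noise claim above can be illustrated numerically: at high signal-to-noise ratio, the magnitude of a complex signal with Gaussian noise in both channels behaves like the true intensity plus near-zero-mean Gaussian noise of the same variance. A minimal sketch, with intensity and noise levels that are illustrative rather than taken from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# True signal intensity A and complex-domain noise level sigma
# (hypothetical values chosen for a high-SNR illustration).
A, sigma, n = 100.0, 5.0, 200_000

# Magnitude MRI: |(A + n_re) + i*n_im| follows a Rician distribution
# when n_re, n_im are i.i.d. zero-mean Gaussian.
real = A + rng.normal(0.0, sigma, n)
imag = rng.normal(0.0, sigma, n)
magnitude = np.hypot(real, imag)

# At high SNR the magnitude noise is approximately Gaussian with
# mean near zero and standard deviation near sigma.
noise = magnitude - A
print(noise.mean())  # close to 0
print(noise.std())   # close to sigma
```

At low SNR the small positive bias of the Rician magnitude becomes visible, which is exactly why the thesis' statistical analysis of the approximation matters.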
72

Information Approach to Change Point Analysis and its Application to Fiscally Standardized Cities

Hadamuscin, Larry A. 12 August 2022 (has links)
No description available.
73

Statistical Models for Count Data from Multiple Sclerosis Clinical Trials and their Applications

Rettiganti, Mallikarjuna Rao 17 December 2010 (has links)
No description available.
74

Bayesian Analysis of Temporal and Spatio-temporal Multivariate Environmental Data

El Khouly, Mohamed Ibrahim 09 May 2019 (has links)
High dimensional space-time datasets are available nowadays in various aspects of life such as economy, agriculture, health, and the environment. Meanwhile, it is challenging to reveal possible connections between climate change and extreme weather events such as hurricanes or tornadoes. In particular, the relationship between tornado occurrence and climate change has remained elusive. Moreover, modeling multivariate spatio-temporal data is computationally expensive. There is a great need for computationally feasible models that account for temporal, spatial, and inter-variable dependence. Our research focuses on those areas in two ways. First, we investigate connections between changes in tornado risk and the increase in atmospheric instability over Oklahoma. Second, we propose two multiscale spatio-temporal models, one for multivariate Gaussian data and the other for matrix-variate Gaussian data. These frameworks are novel additions to the existing literature on Bayesian multiscale models. In addition, we propose parallelizable MCMC algorithms to sample from the posterior distributions of the model parameters with enhanced computations. / Doctor of Philosophy / Over 1000 tornadoes are reported every year in the United States, causing massive losses of lives and property according to the National Oceanic and Atmospheric Administration. It is therefore worthwhile to investigate possible connections between climate change and tornado occurrence. However, environmental datasets are massive and three- or four-dimensional (2- or 3-dimensional space, plus time), and the relationship between tornado occurrence and climate change has remained elusive. Moreover, it is computationally expensive to analyze these high dimensional space-time datasets. In part of our research, we have found a significant relationship between the occurrence of strong tornadoes over Oklahoma and meteorological variables. 
Some of those meteorological variables have been affected by ozone depletion and emissions of greenhouse gases. Additionally, we propose two Bayesian frameworks to analyze multivariate space-time datasets with fast and feasible computations. Finally, our analyses indicate different patterns of temperatures at atmospheric altitudes with distinctive rates over the United States.
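The abstract's MCMC samplers are not spelled out here; as a generic stand-in, a toy Gibbs sampler for a correlated bivariate Gaussian shows the basic mechanics of sampling from full conditionals. The correlation value and iteration counts are illustrative, not from the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Gibbs sampler for a standard bivariate Gaussian with
# correlation rho (a stand-in for the far richer multiscale
# spatio-temporal models of the thesis).
rho, n_iter, burn = 0.8, 20_000, 2_000
x = y = 0.0
draws = np.empty((n_iter, 2))
for t in range(n_iter):
    # Full conditionals: x|y ~ N(rho*y, 1-rho^2) and symmetrically.
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    draws[t] = x, y

post = draws[burn:]
print(np.corrcoef(post.T)[0, 1])  # close to rho
```

Because the two conditional updates within an iteration are independent given the other coordinate, blocks of such updates are what makes samplers of this family parallelizable.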
75

Ensemble Kalman filtering for hydraulic conductivity characterization: Parallelization and non-Gaussianity

Xu, Teng 03 November 2014 (has links)
Tesis por compendio / The ensemble Kalman filter (EnKF) is nowadays recognized as an excellent inverse method for hydraulic conductivity characterization using transient piezometric head data, and it has been shown to be computationally efficient and capable of handling large fields compared to other inverse methods. However, a large ensemble size is needed (Chen and Zhang, 2006) to obtain a high-quality estimation, which means a great deal of computation time. Parallel computing is an efficient alternative for reducing the computation time. Besides, although the EnKF accounts well for the nonlinearities of the state equation, it fails when dealing with non-Gaussian distributed fields. Recently, many methods have been developed to adapt the EnKF to non-Gaussian distributions (detailed in the History and Present State chapter). Zhou et al. (2011, 2012) proposed a normal-score ensemble Kalman filter (NS-EnKF) to characterize non-Gaussian distributed conductivity fields, and showed that transient piezometric head data were sufficient for hydraulic conductivity characterization if a training image for the hydraulic conductivity was available. In this work, we show that, even without such a training image, sufficiently informative transient piezometric head data allow the updated ensemble of realizations to characterize the non-Gaussian reference field. Finally, we introduce a new method for parameterizing geostatistical models, coupled with the NS-EnKF, for the characterization of a heterogeneous non-Gaussian hydraulic conductivity field. This doctoral thesis therefore comprises three parts: 1. Parallelized Ensemble Kalman Filter for Hydraulic Conductivity Characterization. 2. The Power of Transient Piezometric Head Data in Inverse Modeling: An Application of the Localized Normal-score EnKF with Covariance Inflation in a Heterogeneous Bimodal Hydraulic Conductivity Field. 
3. Parameterizing Geostatistical Models Coupled with the NS-EnKF for Heterogeneous Bimodal Hydraulic Conductivity Characterization. / Xu, T. (2014). Ensemble Kalman filtering for hydraulic conductivity characterization: Parallelization and non-Gaussianity [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/43769 / Compendio
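The EnKF analysis step that the thesis builds on can be sketched for a scalar state with perturbed observations. The thesis itself works with full conductivity fields and adds a normal-score transform, localization, and covariance inflation, all omitted here; all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal EnKF analysis step for a direct (H = 1) observation of a
# scalar state, using the perturbed-observations variant.
n_ens = 500
truth = 2.0       # hypothetical true state
obs_err = 0.3     # observation error standard deviation

ensemble = rng.normal(0.0, 1.0, n_ens)          # prior ensemble
obs = truth + rng.normal(0.0, obs_err)          # one noisy observation
perturbed = obs + rng.normal(0.0, obs_err, n_ens)

# Kalman gain from ensemble statistics: K = P (P + R)^-1 with H = 1.
P = ensemble.var(ddof=1)
K = P / (P + obs_err**2)
analysis = ensemble + K * (perturbed - ensemble)

print(analysis.mean())  # pulled from the prior mean toward the observation
```

With transient piezometric head data, many such updates are applied sequentially in time, which is what lets the ensemble recover conductivity structure even without a training image.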
76

IG-GARJI模型下之住宅抵押貸款保險評價 / Valuation of Mortgage Insurance Contracts in IG-GARJI model

林思岑, Lin, Szu Tsen Unknown Date (has links)
住宅抵押貸款保險(Mortgage Insurance)為管理違約風險的重要工具，在2008年次級房貸風暴後更加受到金融機構的關注。為了能更準確且更有效率的預測房價及合理評價住宅抵押貸款保險，本文延續Christoffersen, Heston and Jacobs (2006)對股票報酬率的研究，提出新的GARCH模型，利用Inverse Gaussian分配取代常態分配來捕捉房價序列中存在的自我相關以及典型現象(stylized facts)，並且同時考慮房價市場中所隱含的價格跳躍現象。本文將新模型命名為IG-GARJI模型，以便和傳統GARCH模型作區分。由於傳統的GARCH模型在計算保險價格時，通常不存在封閉解，必須藉由模擬的方法來計算價格，會增加預測的誤差，本文提供IG-GARJI模型半封閉解以增進預測效率與準確度，並利用Bühlmann et al. (1996)提出的Esscher transform方法找出其風險中立機率測度，而後運用Heston and Nandi (2000)提出之遞迴方法，找出適合的住宅抵押貸款保險評價模型。實證結果顯示，在新建房屋市場中，使用Inverse Gaussian分配會比常態分配的表現要好；對於非新建房屋，不同模型間沒有顯著的差異。另外，本文亦引用Bardhan, Karapandža, and Urošević (2006)的觀點，利用不同評價模型來比較若房屋所有權無法及時轉換時，對住宅抵押貸款保險價格帶來的影響，為住宅抵押貸款保險提供更準確的評價方法。 / Mortgage insurance products represent an attractive alternative for managing default risk. After the subprime crisis in 2008, financial institutions have paid increasingly close attention to credit and default risk in the mortgage market. To forecast house prices more accurately and efficiently and to value mortgage insurance contracts properly, we follow the approach of Christoffersen, Heston and Jacobs (2006) and propose a new GARCH model with inverse Gaussian innovations in place of the normal distribution, capable of capturing both the autocorrelation and the stylized facts revealed in house price series. In addition, we consider jump risk within the model, which is widely discussed in the housing market. To distinguish the new model from the traditional GARCH model, we name it the IG-GARJI model. Traditional GARCH models generally admit no analytical solution, and the simulation procedures then required to evaluate mortgage insurance can increase prediction error; we therefore propose a semi-analytical solution for our model to enhance efficiency and accuracy. Furthermore, our approach implements the Esscher transform introduced by Bühlmann et al. (1996) to identify a martingale measure, 
and then uses the recursive procedure proposed by Heston and Nandi (2000) to evaluate the mortgage insurance contract. The empirical results indicate that the model with the inverse Gaussian distribution performs better than the model with the normal distribution in the newly-built house market, while no significant difference between the models is found in the previously occupied house market. Moreover, we follow the approach of Bardhan, Karapandža, and Urošević (2006) to investigate the impact of legal efficiency, that is, of house ownership that cannot be transferred in time, on the mortgage insurance premium. Our model thus provides another alternative for valuing mortgage insurance contracts.
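The idea of a GARCH recursion driven by inverse Gaussian innovations can be sketched with a plain GARCH(1,1) volatility equation and standardized inverse Gaussian draws. This is only in the spirit of the IG-GARJI model, not its actual specification, and it omits the jump component; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# GARCH(1,1)-type variance recursion with standardized inverse
# Gaussian (Wald) innovations; omega/alpha/beta are illustrative.
omega, alpha, beta = 1e-6, 0.1, 0.85
n = 5_000
h = omega / (1 - alpha - beta)   # start at the unconditional variance
returns = np.empty(n)
mu, lam = 1.0, 3.0               # Wald mean and scale; Var = mu^3/lam
for t in range(n):
    # Standardize the skewed inverse Gaussian draw to mean 0, var 1.
    z = (rng.wald(mu, lam) - mu) / np.sqrt(mu**3 / lam)
    returns[t] = np.sqrt(h) * z
    h = omega + alpha * returns[t] ** 2 + beta * h

print(returns.std())  # near sqrt(omega / (1 - alpha - beta))
```

The skewness of the inverse Gaussian innovation is what lets such a model capture asymmetries that a normal-innovation GARCH misses.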
77

A Study of Gamma Distributions and Some Related Works

Chou, Chao-Wei 11 May 2004 (has links)
Characterization of distributions has been an important topic in statistical theory for decades. Although many well-known results have already been developed, it is still of great interest to find new characterizations of distributions commonly used in applications, such as the normal or gamma distribution. In practice, we sometimes guess the distribution to be fitted to the observed data, and sometimes use the characteristic properties of those distributions to do so. In this paper we restrict our attention to characterizations of the gamma distribution, as well as some related studies on the corresponding parameter estimation based on the characterization properties. Some simulation studies are also given.
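As a simple illustration of estimating gamma parameters from simulated data, the method of moments inverts the first two moments. This is a generic stand-in for the simulation studies mentioned above, not the thesis' characterization-based estimators, and the true parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Method-of-moments estimation of gamma shape k and scale theta
# from simulated data (true values chosen for illustration).
shape, scale, n = 2.5, 1.5, 100_000
x = rng.gamma(shape, scale, n)

# For Gamma(k, theta): E[X] = k*theta and Var[X] = k*theta^2, so
# k = E[X]^2 / Var[X] and theta = Var[X] / E[X].
m, v = x.mean(), x.var(ddof=1)
shape_hat = m**2 / v
scale_hat = v / m
print(shape_hat, scale_hat)  # near the true (shape, scale)
```

Characterization-based estimators exploit structural properties (for example, the independence of X + Y and X / (X + Y) for independent gammas with a common scale) rather than raw moments, but the simulation workflow is the same.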
78

Srovnání znalostí z teorie elektromagnetického pole u laiků a odborníků v rámci civilní nouzové připravenosti / The comparison of knowledge of electromagnetic field theory for laymen and experts within the civil emergency preparedness

VESELÁ, Barbora January 2016 (has links)
The thesis "Comparison of knowledge of electromagnetic field theory of the laity and experts in the context of civil emergency preparedness" sets three goals: 1. constructing a knowledge structure of the electromagnetic field for experts; 2. comparing the knowledge of experts and laymen; 3. statistical processing of the results. The author set the following hypotheses: H1: the theoretical distribution of knowledge in a sample of the general public will follow a normal distribution. H2: the theoretical distribution of knowledge in a sample of the professional community will not follow a normal distribution. H3: the comparison of knowledge between experts and laymen will lead to the alternative hypothesis. The thesis was based on the theory of the curricular process; on the basis of this theory, both the structure of the electromagnetic field and the questionnaire were constructed. An important step in this thesis was creating a model structure of the electromagnetic field. The structure was based on an analysis of the scientific system, namely the system of educational programs in the field of civil protection; the same structure was applied to the general public. Another important step was comparing the population-protection knowledge of experts and laymen. This issue had not previously been investigated in detail, and the knowledge of laymen and experts in the physics studied had not been compared. The motivation comes from extraordinary events in which respondents may encounter electromagnetic fields and will need the relevant theoretical knowledge. The aim was a statistical evaluation of the administered questionnaires; nonparametric and parametric tests were applied as verification methods. The theoretical distribution of the experts' knowledge is assumed to follow a Poisson distribution, whereas the theoretical distribution for the general public should follow a normal distribution. 
The difference between the knowledge of laymen and professionals was also compared. Using these statistical methods, the hypotheses were confirmed and the thesis goals fulfilled.
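The normality checks behind hypotheses H1 and H2 can be illustrated with a hand-rolled chi-square goodness-of-fit test on simulated questionnaire scores. The data, score scale, and bin count are illustrative, not the thesis' questionnaire results:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(5)

# Simulated "test scores" actually drawn from a normal distribution,
# then tested for normality with a chi-square goodness-of-fit test.
scores = rng.normal(70, 10, 1_000)

def normal_cdf(x, mu, sd):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf((x - mu) / (sd * sqrt(2))))

mu, sd = scores.mean(), scores.std(ddof=1)
edges = np.quantile(scores, np.linspace(0, 1, 11))  # 10 equal-count bins
observed, _ = np.histogram(scores, bins=edges)
expected = np.array([
    (normal_cdf(b, mu, sd) - normal_cdf(a, mu, sd)) * len(scores)
    for a, b in zip(edges[:-1], edges[1:])
])
chi2 = ((observed - expected) ** 2 / expected).sum()
print(chi2)  # small values are consistent with normality
```

For the experts' scores, the same machinery would compare the observed counts against Poisson expectations instead of normal ones.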
79

Foreign Exchange Option Valuation under Stochastic Volatility

Rafiou, AS January 2009 (has links)
>Magister Scientiae - MSc / Pricing options under constant volatility has been common practice for decades. Yet market data show that volatility is a stochastic phenomenon; this is evident in longer-duration instruments, in which the volatility of the underlying asset is dynamic and unpredictable. The extensively published methods for valuing options under stochastic volatility focus mainly on stock markets and on options written on a single reference asset. This work probes the effect of valuing a European call option written on a basket of currencies under constant-volatility and stochastic-volatility models. We apply a family of stochastic models to investigate the relative performance of option prices. For the valuation of the option under constant volatility, we derive a closed-form analytic solution that relaxes some of the assumptions of the Black-Scholes model. The problem of two-dimensional random diffusion of exchange rates and volatilities is treated with a present-value scheme and with mean-reverting and non-mean-reverting stochastic volatility models. A multi-factor Gaussian distribution function is applied to lognormal asset dynamics sampled from a normal distribution, which we generate by the Box-Muller method and make interdependent by Cholesky factor matrix decomposition. Furthermore, a Monte Carlo simulation method is adopted to approximate a general form of the numerical solution. The historical data considered date from 31 December 1997 to 30 June 2008. The basket contains ZAR as the base currency; USD, GBP, EUR and JPY are the foreign currencies.
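The sampling recipe in the abstract, Box-Muller normals made interdependent through a Cholesky factor and then driving lognormal asset dynamics, can be sketched as follows. The correlation, drift, and volatility values are illustrative, not the thesis' calibrated currency parameters:

```python
import numpy as np

rng = np.random.default_rng(6)

def box_muller(n, rng):
    # Box-Muller transform: two independent standard normals from
    # two independent uniforms.
    u1, u2 = rng.random(n), rng.random(n)
    r = np.sqrt(-2.0 * np.log1p(-u1))  # log1p(-u1) avoids log(0)
    return r * np.cos(2 * np.pi * u2), r * np.sin(2 * np.pi * u2)

n = 100_000
z1, z2 = box_muller(n, rng)
Z = np.vstack([z1, z2])

# Make the normals interdependent via a Cholesky factor of the
# target correlation matrix (0.6 is illustrative).
corr = np.array([[1.0, 0.6], [0.6, 1.0]])
L = np.linalg.cholesky(corr)
W = L @ Z                                  # correlated standard normals

# Lognormal (geometric Brownian motion) terminal exchange rates.
S0, mu, sigma, T = 1.0, 0.02, 0.15, 1.0
S_T = S0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * W)
print(np.corrcoef(np.log(S_T))[0, 1])  # close to 0.6
```

A basket option payoff would then be averaged over such correlated terminal rates and discounted, which is the Monte Carlo step the abstract describes.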
80

Unsupervised Change Detection Using Multi-Temporal SAR Data : A Case Study of Arctic Sea Ice / Oövervakad förändringsdetektion med multitemporell SAR data : En fallstudie över arktisk havsis

Fröjse, Linda January 2014 (has links)
The extent of Arctic sea ice has decreased over the years and the importance of sea ice monitoring is expected to increase. Remote sensing change detection compares images acquired over the same geographic area at different times in order to identify changes that might have occurred in the area of interest. Change detection methods have been developed for cryospheric applications. The Kittler-Illingworth thresholding algorithm has proven to be an effective change detection tool, but has not been used for sea ice. Here it is applied to Arctic sea ice data. The objective is to investigate the unsupervised detection of changes in Arctic sea ice using multi-temporal SAR images. The well-known Kittler-Illingworth algorithm is tested using two density function models, i.e., the generalized Gaussian and the log-normal model. The difference image is obtained using the modified ratio operator. The histogram of the change image, which approximates its probability distribution, is considered to be a combination of two classes, i.e., the changed and unchanged classes. Histogram fitting techniques are used to estimate the unknown density functions and the prior probabilities. The optimum threshold is selected using a criterion function directly related to classification error. In this thesis three datasets were used, covering parts of the Beaufort Sea from the years 1992, 2002, 2007 and 2009. The SAR and ASAR C-band data came from the ERS and ENVISAT satellites respectively. All three were interpreted visually. For all three datasets, the generalized Gaussian detected a lot of change, whereas the log-normal detected less. Only one small subset of a dataset was validated against reference data. The log-normal distribution then obtained a 0% false alarm rate in all trials. The generalized Gaussian obtained false alarm rates around 4% for most of the trials. The generalized Gaussian achieved detection accuracies around 95%, whereas the log-normal achieved detection accuracies around 70%. 
The overall accuracies for the generalized Gaussian were about 95% in most trials. The log-normal achieved overall accuracies at around 85%. The KHAT for the generalized Gaussian was in the range of 0.66-0.93. The KHAT for log-normal was in the range of 0.68-0.77. Using one additional speckle filter iteration increased the accuracy for the log-normal distribution. Generally, the detection of positive change has been accomplished with higher level of accuracy compared with negative change detection. A visual inspection shows that the generalized Gaussian distribution probably over-estimates the change. The log-normal distribution consistently detects less change than the generalized Gaussian. Lack of validation data made validation of the results difficult. The performed validation might not be reliable since the available validation data was only SAR imagery and differentiating change and no-change is difficult in the area. Further due to the lack of reference data it could not be decided, with certainty, which distribution performed the best. / Ytan av arktisk havsis har minskat genom åren och vikten av havsisövervakning förväntas öka. Förändrigsdetection jämför bilder från samma geografiska område från olika tidpunkter föra att identifiera förändringar som kan ha skett i intresseområdet. Förändringsdekteringsmetoder har utvecklats för kryosfäriska ämnen. Tröskelvärdesbestämning med Kittler-Illingworth algoritmen har visats sig vara ett effektivt verktyg för förändringsdetektion, men har inte änvänts på havsis. Här appliceras algoritmen på arktisk havsis. Målet är att undersökra oövervakad förändringsdetektion i arktisk havsis med multitemporella SAR bilder. Den välkända Kittler-Illingworth algoritmen testas med två täthetsfunktioner, nämligen generaliserad normaldistribution och log-normal distributionen. Differensbilden erhålls genom den modifierad ratio-operator. 
Histogrammet från förändringsbilden skattar dess täthetsfunktion, vilken anses vara en kombination av två klasser, förändring- och ickeförändringsklasser. Histogrampassningstekniker används för att uppskatta de okända täthetsfunktionerna och a priori sannolikheterna. Det optimala tröskelvärdet väljs genom en kriterionfunktion som är direkt relaterad till klassifikationsfel. I detta examensarbete användes tre dataset som täcker delar av Beaufort-havet från åren 1992, 2002, 2007 och 2009. SAR C-band data kom från satelliten ERS och ASAR C-band data kom från satelliten ENVISAT. Alla tre tolkades visuellt och för alla tre detekterade generaliserad normaldistribution mycket mer förändring än lognormal distributionen. Bara en mindre del av ett dataset validerades mot referensdata. Lognormal distributionen erhöll då 0% falska alarm i alla försök. Generalised normaldistributionen erhöll runt 4% falska alarm i de flesta försöken. Generaliserad normaldistributionen nådde detekteringsnoggrannhet runt 95% medan lognormal distributionen nådde runt 70%. Generell noggrannheten för generaliserad normaldistributionen var runt 95% i flesta försöken. För lognormal distributionen nåddes en generell noggrannhet runt 85%. KHAT koefficienten för generaliserad normaldistributionen var i intervallet 0.66-0.93. För lognormal distributionen var den i intervallet 0.68-0.77. Med en extra speckle-filtrering ökades nogranneheten för lognormal distributionen. Generellt sett, detekterades positiv förändring med högre nivå av noggrannhet än negativ förändring. Visuell inspektion visar att generaliserad normaldistribution troligen överskattar förändringen. Lognormal distributionen detekterar konsistent mindre förändring än generaliserad normaldistributionen. Bristen på referensdata gjorde valideringen av resultaten svårt. Den utförda valideringen är kanske inte så trovärdig, eftersom den tillgänliga referensdatan var bara SAR bilder och att särskilja förändring och ickeförändring är svårt i området. 
Vidare, på grund av bristen på referensdata, kunde det inte bestämmas med säkerhet vilken distribution som var bäst.
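Kittler-Illingworth minimum-error thresholding, the core tool of the thesis above, can be sketched with Gaussian class models on a synthetic bimodal difference image; the thesis instead fits generalized Gaussian and log-normal class densities, and the class parameters here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "difference image": an unchanged class and a changed
# class, both Gaussian (illustrative parameters).
unchanged = rng.normal(50, 8, 40_000)
changed = rng.normal(120, 15, 10_000)
pixels = np.clip(np.concatenate([unchanged, changed]), 0, 255)

hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
p = hist / hist.sum()
levels = np.arange(256)

best_t, best_J = 0, np.inf
for t in range(1, 255):
    w0, w1 = p[:t].sum(), p[t:].sum()          # class priors
    if w0 < 1e-6 or w1 < 1e-6:
        continue
    m0 = (levels[:t] * p[:t]).sum() / w0       # class means
    m1 = (levels[t:] * p[t:]).sum() / w1
    v0 = ((levels[:t] - m0) ** 2 * p[:t]).sum() / w0   # class variances
    v1 = ((levels[t:] - m1) ** 2 * p[t:]).sum() / w1
    if v0 < 1e-6 or v1 < 1e-6:
        continue
    # Kittler-Illingworth criterion, a proxy for classification error:
    # J(t) = 1 + 2*[w0*ln(s0/w0) + w1*ln(s1/w1)].
    J = 1 + 2 * (w0 * np.log(np.sqrt(v0) / w0) + w1 * np.log(np.sqrt(v1) / w1))
    if J < best_J:
        best_t, best_J = t, J

print(best_t)  # lands between the two modes
```

Swapping the Gaussian class densities for generalized Gaussian or log-normal models changes only the per-class parameter fits inside the loop, which is the comparison the thesis carries out.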
