1

The use of temporally aggregated data on detecting a structural change of a time series process

Lee, Bu Hyoung. January 2016
A time series process can be influenced by an interruptive event that starts at a certain time point, so a structural break in either the mean or the variance may occur between the periods before and after the event time. However, the traditional two-sample statistical tests, such as the t-test for a mean difference and the F-test for a variance difference, cannot be applied directly to detect such structural breaks, because the observations of a time series are serially dependent and so cannot be treated as two independent random samples. As alternatives, the likelihood ratio (LR) test for a mean change and the cumulative sum (CUSUM) of squares test for a variance change have been widely employed in the literature. Another point of interest is temporal aggregation. Most published time series data are temporally aggregated from original observations at a small time unit into cumulative records at a larger time unit. Temporal aggregation is known to have substantial effects on process properties because it transforms a high-frequency nonaggregate process into a low-frequency aggregate process. In this research, we investigate the effects of temporal aggregation on the LR test and the CUSUM test through the ARIMA model transformation. First, we derive the proper transformation of ARIMA model orders and parameters when a time series is temporally aggregated. The LR test statistic for a mean change depends on the model parameters and errors, and both must be adjusted when an AR(p) process transforms, under mth-order temporal aggregation, into an ARMA(P,Q) process. Using this property, we propose a modified LR test for aggregated time series. Through Monte Carlo simulations and empirical examples, we show that aggregation shifts the null distribution of the modified LR test statistic to the left; hence, the test power increases as the order of aggregation increases.
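As background, mth-order temporal aggregation (non-overlapping sums of m consecutive observations) and a basic Gaussian LR scan for a single mean change can be sketched as follows. This is a minimal illustration, not the dissertation's modified test; the function names and the brute-force change point scan are our own simplifications.

```python
import numpy as np

def aggregate(x, m):
    """mth-order temporal aggregation: non-overlapping sums of m consecutive
    observations. Trailing observations that do not fill a block are dropped."""
    n = (len(x) // m) * m
    return x[:n].reshape(-1, m).sum(axis=1)

def lr_mean_change_stat(x):
    """Gaussian LR statistic for a single mean change at an unknown point:
    n * log(sigma0^2 / sigma1^2), where sigma1^2 pools the variances of the
    two segments around their separate means. Returns (statistic, k_hat)."""
    n = len(x)
    s2_full = np.var(x)                     # variance under the no-change null
    best, k_hat = -np.inf, None
    for k in range(2, n - 1):               # scan candidate change points
        s2_alt = (k * np.var(x[:k]) + (n - k) * np.var(x[k:])) / n
        stat = n * np.log(s2_full / s2_alt)
        if stat > best:
            best, k_hat = stat, k
    return best, k_hat
```

For an i.i.d. series this scan recovers a clear mean shift; the dissertation's point is that after aggregation the errors are no longer those of the nonaggregate AR(p) model, so the statistic's ingredients must be transformed accordingly.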
For the CUSUM test for a variance change, we show that two aggregation terms appear in the test statistic and affect test results adversely when an ARIMA(p,d,q) process transforms, under mth-order temporal aggregation, into an ARIMA(P,d,Q) process. We then propose a modified CUSUM test that controls these terms, which are interpreted as the aggregation effects. Through Monte Carlo simulations and empirical examples, the modified CUSUM test shows better performance and higher power for detecting a variance change in an aggregated time series than the original CUSUM test. / Statistics
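The original (unmodified) CUSUM-of-squares statistic in the style of Inclán and Tiao, which the modified test builds on, can be sketched as below. This is a minimal illustration under our own naming, not the dissertation's code.

```python
import numpy as np

def cusum_squares(x):
    """CUSUM-of-squares statistic for a variance change at an unknown point.
    D_k = C_k / C_n - k / n, where C_k is the cumulative sum of squares;
    the statistic is sqrt(n/2) * max_k |D_k|, and the change point estimate
    is the k attaining the maximum."""
    n = len(x)
    c = np.cumsum(x ** 2)
    k = np.arange(1, n + 1)
    d = c / c[-1] - k / n
    k_hat = int(np.argmax(np.abs(d))) + 1
    stat = np.sqrt(n / 2.0) * np.max(np.abs(d))
    return stat, k_hat
```

Under the no-change null the statistic has a known asymptotic distribution (5% critical value about 1.358 for i.i.d. Gaussian errors); the dissertation's contribution is to correct the extra terms this statistic picks up when the series has been aggregated.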
2

Variance Change Point Detection under A Smoothly-changing Mean Trend with Application to Liver Procurement

Gao, Zhenguo. 23 February 2018
Literature on change point analysis mostly requires a sudden change in the data distribution, either in a few parameters or in the distribution as a whole. We are interested in the scenario where the variance of the data makes a significant jump while the mean changes in a smooth fashion. This setting is motivated by a liver procurement experiment with organ surface temperature monitoring. Blindly applying existing change point analysis methods to the example can yield erratic change point estimates, since the smoothly-changing mean violates the sudden-change assumption. In this dissertation, we propose a penalized weighted least squares approach with an iterative estimation procedure that naturally integrates variance change point detection and smooth mean function estimation. Given the variance components, the mean function is estimated by smoothing splines as the minimizer of the penalized weighted least squares criterion. Given the mean function, we propose a likelihood ratio test statistic for identifying the variance change point. The null distribution of the test statistic is derived, together with the rates of convergence of all the parameter estimates. Simulations show excellent performance of the proposed method, and the application analysis offers numerical support for non-invasive organ viability assessment by surface temperature monitoring. The method above can only yield the variance change point of the temperature at a single point on the organ surface at a time. In practice, an organ is often transplanted as a whole or in part, so it is generally of more interest to study the variance change point for a chunk of the organ. With this motivation, we extend our method to study the variance change point for a chunk of the organ surface. The variances now become functions on a 2D space of locations (longitude and latitude), and the mean becomes a function on a 3D space of location and time.
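The alternating scheme described above can be sketched as follows. This is a simplified illustration under our own assumptions: a Gaussian-kernel weighted smoother stands in for the penalized smoothing spline, a brute-force Gaussian LR scan stands in for the dissertation's test statistic, and all function names are hypothetical.

```python
import numpy as np

def smooth_mean(t, y, w, bw=0.05):
    """Weighted local (Nadaraya-Watson) smoother -- a simple stand-in for the
    penalized weighted least squares smoothing spline."""
    yhat = np.empty_like(y)
    for i, ti in enumerate(t):
        k = np.exp(-0.5 * ((t - ti) / bw) ** 2) * w
        yhat[i] = np.sum(k * y) / np.sum(k)
    return yhat

def variance_change_point(r):
    """Gaussian LR scan over candidate change points of the residual variance:
    n*log(s0^2) - k*log(s1^2) - (n-k)*log(s2^2), maximized over k."""
    n = len(r)
    s0 = np.var(r)
    best, k_hat = -np.inf, None
    for k in range(10, n - 10):
        s1, s2 = np.var(r[:k]), np.var(r[k:])
        stat = n * np.log(s0) - k * np.log(s1) - (n - k) * np.log(s2)
        if stat > best:
            best, k_hat = stat, k
    return k_hat

def iterate(t, y, n_iter=3):
    """Alternate between (1) weighted mean smoothing given the variances and
    (2) variance change point detection given the mean."""
    w = np.ones_like(y)
    k = None
    for _ in range(n_iter):
        mu = smooth_mean(t, y, w)           # step 1: smooth mean, given weights
        r = y - mu                          # residuals carry the variance jump
        k = variance_change_point(r)        # step 2: locate the variance change
        s1, s2 = np.var(r[:k]), np.var(r[k:])
        w = np.where(np.arange(len(y)) < k, 1.0 / s1, 1.0 / s2)
    return k, mu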
We model the variance functions by thin-plate splines and the mean function by the tensor product of thin-plate splines and cubic splines. However, the additional dimensions in these functions incur serious computational problems: the sample size, as the product of the number of locations and the number of sampling time points, becomes too large to run standard multi-dimensional spline models. To overcome this computational hurdle, we introduce a multi-stage subsampling strategy into our modified iterative algorithm. The strategy involves several down-sampling or subsampling steps guided by preliminary statistical measures. We carry out extensive simulations to show that the new method efficiently cuts down the computational cost, making a practically unsolvable problem solvable in reasonable time with satisfactory parameter estimates. Application of the new method to the liver surface temperature monitoring data shows its effectiveness in providing accurate status change information for a portion of or the whole organ. / Ph. D. / Viability evaluation is the key issue in an organ transplant operation: the donated organ must be viable at the time it is transplanted into the recipient. Nowadays, viability can be assessed by analyzing temperature data monitored on the organ surface. In my dissertation, I have developed two new statistical methods to evaluate the viability status of a prepared organ by studying the organ surface temperature. The first method detects the change of viability status at a single spot on the organ surface. The second detects the change of viability condition for selected organ chunks. In practice, combining these two methods can provide accurate viability status change information for a portion of or the whole organ.
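One way to picture the multi-stage subsampling idea is the two-stage sketch below: regular spatial thinning followed by a preliminary statistical screen (here a CUSUM-of-squares score on each detrended series) that keeps only the most informative locations. This is our own simplified illustration with hypothetical function names, not the dissertation's algorithm.

```python
import numpy as np

def cusum_sq_score(x):
    """Preliminary variance-change evidence for one series: the
    CUSUM-of-squares statistic sqrt(n/2) * max_k |C_k/C_n - k/n|."""
    n = len(x)
    d = np.cumsum(x ** 2) / np.sum(x ** 2) - np.arange(1, n + 1) / n
    return np.sqrt(n / 2.0) * np.max(np.abs(d))

def multistage_subsample(temps, coarse_step=4, keep_frac=0.25):
    """Two-stage subsampling sketch. temps has shape (n_locations, n_times).
    Stage 1: thin the spatial grid regularly by coarse_step.
    Stage 2: keep the fraction of thinned locations whose preliminary
    variance-change scores are largest."""
    coarse = np.arange(0, temps.shape[0], coarse_step)
    scores = np.array([cusum_sq_score(temps[i] - temps[i].mean())
                       for i in coarse])
    n_keep = max(1, int(len(coarse) * keep_frac))
    keep = coarse[np.argsort(scores)[::-1][:n_keep]]
    return np.sort(keep)
```

The actual strategy in the dissertation involves several such stages in both space and time; the point of the sketch is only that cheap preliminary measures decide where the expensive spline fit is run.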
