  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Utilizavimo proceso laiko eilučių modelis / Time series model for waste utilization

Michailova, Olga 30 June 2014
In this work, an analysis of the animal waste utilization process was performed. The main task was to find a way to predict the end of the desiccation process, since the ability to predict this end point may reduce energy consumption. A time series forecasting model was used, and a method for change-point detection was proposed; linear regression was also applied to this task. The ability to predict the change point would significantly reduce the cost of the utilization process.
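The abstract above does not give the thesis's actual algorithm, but the idea of spotting the end of a drying process as a change point can be sketched with a simple slope-comparison rule: fit a least-squares line to a short window on each side of every candidate time and flag the point where the slopes differ most. The window length and the synthetic drying curve below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def detect_change_point(series, window=5):
    """Return the index where the local slope changes most sharply.

    For each interior point, fit a least-squares line to the `window`
    observations before and after it and compare the two slopes.
    """
    series = np.asarray(series, dtype=float)
    n = len(series)
    best_idx, best_gap = None, -np.inf
    x = np.arange(window)
    for t in range(window, n - window):
        left = np.polyfit(x, series[t - window:t], 1)[0]   # slope before t
        right = np.polyfit(x, series[t:t + window], 1)[0]  # slope after t
        gap = abs(left - right)
        if gap > best_gap:
            best_idx, best_gap = t, gap
    return best_idx

# Synthetic drying curve: steady decline, then a plateau from t = 30.
t = np.arange(60)
signal = np.where(t < 30, 100 - 2.0 * t, 40.0)
print(detect_change_point(signal))  # an index near the true change at t = 30
```

In a real utilization process the curve would be noisy, so a longer window or a formal test statistic would be needed before stopping the heating early.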
32

Multiple Change-Point Detection: A Selective Overview

Niu, Yue S., Hao, Ning, Zhang, Heping
Very long and noisy sequence data arise in fields ranging from the biological sciences to the social sciences, including high-throughput data in genomics and stock prices in econometrics. Often such data are collected in order to identify and understand shifts in trends, for example, from a bull market to a bear market in finance or from a normal number of chromosome copies to an excessive number of chromosome copies in genetics. Thus, identifying multiple change points in a long, possibly very long, sequence is an important problem. In this article, we review both classical and new multiple change-point detection strategies. Considering the long history and the extensive literature on change-point detection, we provide an in-depth discussion of the normal mean change-point model from the aspects of regression analysis, hypothesis testing, consistency and inference. In particular, we present a strategy to gather and aggregate local information for change-point detection that has become the cornerstone of several emerging methods because of its attractive computational and theoretical properties.
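A classical strategy for the normal mean change-point model mentioned above is binary segmentation: compute a standardized CUSUM statistic over the whole series, split at its maximizer if it exceeds a threshold, and recurse on the two halves. This is a minimal sketch (unit-variance noise assumed, threshold chosen by hand), not the specific aggregation strategy the article reviews.

```python
import numpy as np

def cusum_stat(x):
    """Max absolute standardized CUSUM statistic and its split index."""
    n = len(x)
    s = np.cumsum(x - x.mean())
    k = np.arange(1, n)
    # Standardize so the statistic is comparable across split positions.
    scale = np.sqrt(k * (n - k) / n)
    stats = np.abs(s[:-1]) / scale
    j = int(np.argmax(stats))
    return stats[j], j + 1

def binary_segmentation(x, threshold, lo=0, out=None):
    """Recursively split wherever the CUSUM statistic exceeds
    `threshold`; returns sorted change-point indices."""
    if out is None:
        out = []
    x = np.asarray(x, dtype=float)
    if len(x) < 4:
        return sorted(out)
    stat, j = cusum_stat(x)
    if stat > threshold:
        out.append(lo + j)
        binary_segmentation(x[:j], threshold, lo, out)
        binary_segmentation(x[j:], threshold, lo + j, out)
    return sorted(out)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100),
                    rng.normal(3, 1, 100),
                    rng.normal(0, 1, 100)])
print(binary_segmentation(x, threshold=4.0))  # change points near 100 and 200
```

In practice the noise variance is estimated and the threshold is calibrated to control the false-detection rate; the article's "local information" strategies refine exactly this step.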
33

Statistická analýza historických časových řad / Statistical analysis of historical temperature series

Gergelits, Václav January 2013
Title: Statistical analysis of historical temperature series Author: Václav Gergelits Department: Department of Probability and Mathematical Statistics Supervisor: prof. RNDr. Jaromír Antoch, CSc. Supervisor's e-mail address: antoch@karlin.mff.cuni.cz Abstract: In the present work we deal with the statistical analysis of mean-temperature time series from seven European cities, obtained from the European Union project "IMPROVE". Properties of the time series are analyzed by means of descriptive statistics, assessing their homoscedasticity, autocorrelation and normality. We report the ways in which the data have been adjusted, including consideration of the impact of the urban heat island, and we discuss the availability of additional data. The theoretical part presents the theory of change-point detection for a one-change model as well as for models with more than one change, taking autocorrelation into account. In the practical part we analyze the data using change-point detection methods. No significant increase was detected for the Cadiz and Uppsala time series, while a significant increase was detected for the remaining series. The increase in temperature could be related to the adjustment for the urban heat island. Keywords: change point detection, temperature time series
34

ESTIMATION IN PARTIALLY LINEAR MODELS WITH CORRELATED OBSERVATIONS AND CHANGE-POINT MODELS

Fan, Liangdong 01 January 2018
Methods of estimating parametric and nonparametric components, as well as properties of the corresponding estimators, have been examined in partially linear models by Wahba [1987], Green et al. [1985], Engle et al. [1986], Speckman [1988], Hu et al. [2004], Charnigo et al. [2015] among others. These models are appealing due to their flexibility and wide range of practical applications, including the electricity usage study by Engle et al. [1986] and the gum disease study by Speckman [1988], where a parametric component explains linear trends and a nonparametric part captures nonlinear relationships. The compound estimator (Charnigo et al. [2015]) has been used to estimate the nonparametric component of such a model with multiple covariates, in conjunction with linear mixed modeling for the parametric component. These authors showed, under a strict orthogonality condition, that parametric and nonparametric component estimators could achieve what appear to be (nearly) optimal rates, even in the presence of subject-specific random effects. We continue with research on partially linear models with subject-specific random intercepts. Inspired by Speckman [1988], we propose estimators of both parametric and nonparametric components of a partially linear model, where consistency is achievable under an orthogonality condition. We also examine a scenario without orthogonality to find that bias could still exist asymptotically. The random intercepts accommodate analysis of individuals on whom repeated measures are taken. We illustrate our estimators in a biomedical case study and assess their finite-sample performance in simulation studies. Jump points have often been found within the domain of nonparametric models (Muller [1992], Loader [1996] and Gijbels et al. [1999]), which may lead to a poor fit when falsely assuming the underlying mean response is continuous.
We study a specific type of change-point where the underlying mean response is continuous on both left and right sides of the change-point. We identify the convergence rate of the estimator proposed in Liu [2017] and illustrate the result in simulation studies.
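The change-point type studied here, where the mean is continuous on both sides of the change, is the classic "broken-stick" situation: the level does not jump, only the slope does. A minimal illustration (not the estimator from Liu [2017]) is a grid-search fit of a continuous piecewise-linear mean using a hinge term; the simulated kink location and slopes are invented for the example.

```python
import numpy as np

def fit_broken_stick(x, y):
    """Grid-search fit of a continuous piecewise-linear ("broken-stick")
    mean: y ~ b0 + b1*x + b2*max(x - tau, 0), with the kink at tau.
    The fitted mean is continuous at tau; only its slope changes there."""
    best = (np.inf, None)
    for tau in x[2:-2]:  # candidate kink locations
        hinge = np.maximum(x - tau, 0.0)
        X = np.column_stack([np.ones_like(x), x, hinge])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = ((y - X @ beta) ** 2).sum()
        if rss < best[0]:
            best = (rss, tau)
    return best[1]

rng = np.random.default_rng(4)
x = np.linspace(0, 10, 200)
# True mean: slope 0.5 before x = 6, slope 2.5 after, continuous at 6.
y = 1.0 + 0.5 * x + 2.0 * np.maximum(x - 6.0, 0) + rng.normal(0, 0.3, 200)
print(fit_broken_stick(x, y))  # estimated kink near x = 6
```

Because the mean is continuous, the kink is harder to locate than a jump of the same size, which is why convergence rates for this estimator are a question worth settling.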
35

Monitoring portfolio weights by means of the Shewhart method

Mohammadian, Jeela January 2010
The distribution of asset returns may lead to structural breaks. These breaks may result in changes of the optimal portfolio weights. For a portfolio investor, the ability of timely detection of any systematic changes in the optimal portfolio weights is of great interest. In this master thesis work, the use of the Shewhart method, as a method for detecting a sudden parameter change, the implied change in the multivariate portfolio weights, and its performance are reviewed.
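The Shewhart method itself is simple to sketch: signal whenever an observation leaves the control limits mu0 ± L*sigma0 set from in-control behavior. The portfolio-weight numbers below are made up for illustration and are not from the thesis; monitoring actual multivariate weights would use a multivariate chart, but the one-dimensional rule shows the principle.

```python
import numpy as np

def shewhart_alarms(series, mu0, sigma0, L=3.0):
    """Indices where observations fall outside mu0 +/- L*sigma0
    (the classic 3-sigma Shewhart control limits)."""
    z = (np.asarray(series, dtype=float) - mu0) / sigma0
    return np.flatnonzero(np.abs(z) > L)

rng = np.random.default_rng(1)
# A single portfolio weight: in control around 0.5, then a structural
# break shifts it to 0.62 (values chosen only for the example).
weights = np.concatenate([rng.normal(0.50, 0.02, 50),
                          rng.normal(0.62, 0.02, 10)])
alarms = shewhart_alarms(weights, mu0=0.50, sigma0=0.02)
print(alarms)  # alarms concentrate in the last 10 observations
```

The Shewhart chart reacts quickly to large sudden shifts like this one; for small sustained shifts, CUSUM or EWMA charts (discussed in other entries on this page) detect faster.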
36

Estimation and bias correction of the magnitude of an abrupt level shift

Liu, Wenjie January 2012
Consider a time series model which is stationary apart from a single shift in mean. If the time of a level shift is known, the least squares estimator of the magnitude of this level shift is a minimum variance unbiased estimator. If the time is unknown, however, this estimator is biased. Here, we first carry out extensive simulation studies to determine the relationship between the bias and three parameters of our time series model: the true magnitude of the level shift, the true time point and the autocorrelation of adjacent observations. Thereafter, we use two generalized additive models to generalize the simulation results. Finally, we examine to what extent the bias can be reduced by multiplying the least squares estimator with a shrinkage factor. Our results showed that the bias of the estimated magnitude of the level shift can be reduced when the level shift does not occur close to the beginning or end of the time series. However, it was not possible to simultaneously reduce the bias for all possible time points and magnitudes of the level shift.
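The estimator described above can be sketched directly: for each candidate change time t, the least-squares magnitude estimate is the difference of segment means, and the time minimizing the residual sum of squares is selected. When the time is unknown, this selected-magnitude estimate is the biased quantity the thesis studies. The simulated shift size and noise level below are assumptions for illustration.

```python
import numpy as np

def estimate_level_shift(x):
    """Least-squares estimate of a single level shift at an unknown time.

    For each candidate time t, the LS magnitude estimate is
    mean(x[t:]) - mean(x[:t]); the t minimizing the residual sum of
    squares is selected. Selecting t from the data makes the magnitude
    estimate biased, especially for shifts near the series ends.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    best = (np.inf, None, None)  # (rss, t_hat, delta_hat)
    for t in range(2, n - 1):
        m1, m2 = x[:t].mean(), x[t:].mean()
        rss = ((x[:t] - m1) ** 2).sum() + ((x[t:] - m2) ** 2).sum()
        if rss < best[0]:
            best = (rss, t, m2 - m1)
    _, t_hat, delta_hat = best
    return t_hat, delta_hat

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(1.5, 1.0, 60)])
t_hat, delta_hat = estimate_level_shift(x)
print(t_hat, round(delta_hat, 2))  # time near 60, magnitude near 1.5
```

The shrinkage correction the thesis examines would multiply `delta_hat` by a factor below one, tuned via simulation to offset the selection-induced bias.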
37

Statistical model building and inference about the normalized site attenuation (NSA) measurements for electromagnetic interference (EMI)

Chiu, Shih-ting 09 August 2004
Open site measurement of electromagnetic interference is the most direct and universally accepted standard approach for measuring radiated emissions from equipment or the radiation susceptibility of a component or equipment. Whether a site is qualified for EMI testing is decided by antenna measurements. In this work, we use data from setups with different factors to find relations between the measurements and the condition of the antenna. A one-change-point model is used to fit the observed measurements and to compare the differences between two kinds of antenna (broadband antenna and dipole antenna). However, a single change-point model may not give a suitable fit for all data sets in this work; therefore, we have tried other models and applied them to the data. Furthermore, we try to set up another standard, stricter than ±4 dB, based on statistical inference results for deciding whether a site measures EMI values with greater precision. Finally, a Matlab program with a complete analysis based on the procedure performed here is provided, so that it may be used as a standard tool for evaluating whether a site has good measurement quality in practice.
38

Efficient change detection methods for bio and healthcare surveillance

Han, Sung Won 14 June 2010
For the last several decades, sequential change point problems have been studied in both the theoretical area (sequential analysis) and the application area (industrial SPC). In the conventional application, the baseline process is assumed to be stationary, and the shift pattern is a step function that is sustained after the shift. However, in biosurveillance, the underlying assumptions of the problems are more complicated. This thesis investigates several issues in biosurveillance such as non-homogeneous populations, spatiotemporal surveillance methods, and correlated structures in regional data. The first part of the thesis discusses popular surveillance methods in sequential change point problems and off-line problems based on count data. For sequential change point problems, the CUSUM and the EWMA have been used in healthcare and public health surveillance to detect increases in the rates of diseases or symptoms. On the other hand, for off-line problems, scan statistics are widely used. In this chapter, we link the methods for off-line problems to those for sequential change point problems. We investigate three methods--the CUSUM, the EWMA, and scan statistics--and compare them by conditional expected delay (CED). The second part of the thesis pertains to the on-line monitoring problem of detecting a change in the mean of Poisson count data with a non-homogeneous population size. The most common detection schemes are based on generalized likelihood ratio statistics, known as an optimal method under Lorden's criterion. We propose alternative detection schemes based on weighted likelihood ratios and the adaptive threshold method, which perform better than generalized likelihood ratio statistics in an increasing population. The properties of these three detection schemes are investigated by both a theoretical approach and numerical simulation. The third part of the thesis investigates spatiotemporal surveillance based on likelihood ratios.
This chapter proposes a general framework for spatiotemporal surveillance based on likelihood ratio statistics over time windows. We show that the CUSUM and other popular likelihood ratio statistics are special cases under such a general framework. We compare the efficiency of these surveillance methods in spatiotemporal cases for detecting clusters of incidence using both Monte Carlo simulations and a real example. The fourth part proposes multivariate surveillance methods based on likelihood ratio tests in the presence of spatial correlations. By taking advantage of spatial correlations, the proposed methods can perform better than existing surveillance methods by providing faster and more accurate detection. We illustrate the application of these methods with a breast cancer case in New Hampshire where observations are spatially correlated.
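The CUSUM for Poisson counts that this abstract builds on accumulates the log-likelihood ratio of each observation between the baseline rate and a target elevated rate, resetting at zero, and alarms when the sum crosses a threshold. This sketch uses a homogeneous population for simplicity; the thesis's contribution is precisely the harder non-homogeneous case, and the rates and threshold below are illustrative assumptions.

```python
import numpy as np

def poisson_cusum(counts, lam0, lam1, h):
    """One-sided CUSUM for a Poisson rate increase from lam0 to lam1.

    Accumulates the log-likelihood ratio of each count,
    log(f1/f0) = y*log(lam1/lam0) - (lam1 - lam0),
    resets at zero, and signals when the statistic exceeds h.
    """
    k = np.log(lam1 / lam0)   # per-count LLR weight
    c = lam1 - lam0           # per-period LLR offset
    s = 0.0
    for t, y in enumerate(counts):
        s = max(0.0, s + y * k - c)
        if s > h:
            return t          # alarm time
    return None               # no alarm

rng = np.random.default_rng(3)
counts = np.concatenate([rng.poisson(2.0, 50),   # baseline rate
                         rng.poisson(4.0, 30)])  # rate doubles at t = 50
print(poisson_cusum(counts, lam0=2.0, lam1=4.0, h=8.0))  # alarm soon after t = 50
```

With a non-homogeneous population size, lam0 and lam1 vary over time, which breaks the fixed-drift behavior of this statistic and motivates the weighted-likelihood and adaptive-threshold schemes the thesis proposes.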
39

Contributions to variable selection for mean modeling and variance modeling in computer experiments

Adiga, Nagesh 17 January 2012
This thesis consists of two parts. The first part reviews a Variable Search, a variable selection procedure for mean modeling. The second part deals with variance modeling for robust parameter design in computer experiments. In the first chapter of my thesis, Variable Search (VS) technique developed by Shainin (1988) is reviewed. VS has received quite a bit of attention from experimenters in industry. It uses the experimenters' knowledge about the process, in terms of good and bad settings and their importance. In this technique, a few experiments are conducted first at the best and worst settings of the variables to ascertain that they are indeed different from each other. Experiments are then conducted sequentially in two stages, namely swapping and capping, to determine the significance of variables, one at a time. Finally after all the significant variables have been identified, the model is fit and the best settings are determined. The VS technique has not been analyzed thoroughly. In this report, we analyze each stage of the method mathematically. Each stage is formulated as a hypothesis test, and its performance expressed in terms of the model parameters. The performance of the VS technique is expressed as a function of the performances in each stage. Based on this, it is possible to compare its performance with the traditional techniques. The second and third chapters of my thesis deal with variance modeling for robust parameter design in computer experiments. Computer experiments based on engineering models might be used to explore process behavior if physical experiments (e.g. fabrication of nanoparticles) are costly or time consuming. Robust parameter design (RPD) is a key technique to improve process repeatability. Absence of replicates in computer experiments (e.g. Space Filling Design (SFD)) is a challenge in locating RPD solution. Recently, there have been studies (e.g. Bates et al. (2005), Chen et al. (2006), Dellino et al. 
(2010 and 2011), Giovagnoli and Romano (2008)) of RPD issues on computer experiments. The transmitted variance model (TVM) proposed by Shoemaker and Tsui (1993) for physical experiments can be applied in computer simulations. The approaches stated above rely heavily on the estimated mean model because they obtain expressions for variance directly from mean models or by using them for generating replicates. Variance modeling based on some kind of replicates relies on the estimated mean model to a lesser extent. To the best of our knowledge, there is no rigorous research on variance modeling needed for RPD in computer experiments. We develop procedures for identifying variance models. First, we explore procedures to decide groups of pseudo replicates for variance modeling. A formal variance change-point procedure is developed to rigorously determine the replicate groups. Next, a variance model is identified and estimated through a three-step variable selection procedure. Properties of the proposed method are investigated under various conditions through analytical and empirical studies. In particular, the impact of correlated response on the performance is discussed.
