  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

A Learning Curve in Aortic Dissection Surgery with the Use of Cumulative Sum Analysis

SONG, MIN-HO 02 1900 (has links)
No description available.
2

Economically optimum design of cusum charts when there is a multiplicity of assignable causes

Hsu, Margaretha Mei-Ing 02 March 2010 (has links)
This study is concerned with the design of cumulative sum charts based on a minimum cost criterion when there are multiple assignable causes occurring randomly, but with known effect. A cost model is developed that relates the design parameters (i.e., sampling interval, decision limit, reference value, and sample size) of a cusum chart and the cost and risk factors of the process to the long-run average loss-cost per hour for the process. Optimum designs for various sets of cost and risk factors are found by minimizing the long-run average loss-cost per hour of the process with respect to the design parameters of a cusum chart. Optimization is accomplished by use of Brown's method. A modified Brownian motion approximation is used for calculating ARLs in the cost model. The nature of the loss-cost function is investigated numerically. The effects of changes in the design parameters and in the cost and risk factors are also studied. An investigation of the limiting behavior of the loss-cost function as the decision limit approaches infinity reveals that in some cases there exist points that yield a lower loss-cost than that of the local minimum obtained by Brown's method. It is conjectured that if the model is extended to include more realistic assumptions about the occurrence of assignable causes, then only the local minimum solutions will remain. This paper also shows that the multiple assignable cause model can be well approximated by a matched single-cause model. In practice it may then be sufficient to find the optimum design for the matched single-cause model. / Ph. D.
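For readers unfamiliar with the chart being designed, the tabular cusum scheme behind the abstract's design parameters can be sketched as follows. The reference value `k` and decision limit `h` are the parameters named above; the function itself and all numeric choices in the usage are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch of a two-sided tabular CUSUM chart. Each incoming sample
# mean is compared against the process target; the chart signals when either
# one-sided cumulative sum exceeds the decision limit h.
def cusum_chart(xbar, target, k, h):
    """Return the upper/lower CUSUM paths and the index of the first signal."""
    c_plus, c_minus = 0.0, 0.0
    upper, lower, signal = [], [], None
    for i, x in enumerate(xbar):
        c_plus = max(0.0, c_plus + (x - target) - k)    # detects upward shifts
        c_minus = max(0.0, c_minus + (target - x) - k)  # detects downward shifts
        upper.append(c_plus)
        lower.append(c_minus)
        if signal is None and (c_plus > h or c_minus > h):
            signal = i
    return upper, lower, signal
```

With an illustrative shift of two units, `k = 0.5`, and `h = 4`, the chart stays at zero while the process is on target and accumulates 1.5 per sample once the shift begins, signalling after three shifted samples.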
3

Surveillance of Poisson and Multinomial Processes

Ryan, Anne Garrett 18 April 2011 (has links)
As time passes, change occurs, and with this change comes the need for surveillance. One may be a technician on an assembly line who needs a surveillance technique to monitor the number of defective components produced, or an administrator of a hospital who needs surveillance measures to monitor the number of patient falls or to monitor surgical outcomes for changes in surgical failure rates. A natural choice for ongoing surveillance is the control chart; however, the chart must be constructed in a way that accommodates the situation at hand. Two scenarios involving attribute control charting are investigated here. The first scenario involves Poisson count data where the area of opportunity changes. A modified exponentially weighted moving average (EWMA) chart is proposed to accommodate the varying sample sizes. The performance of this method is compared with that of several competing control chart techniques, and recommendations are made regarding the best performing control chart method. This research is a result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech). The second scenario involves monitoring a process where items are classified into more than two categories and the results of these classifications are readily available. A multinomial cumulative sum (CUSUM) chart is proposed to monitor these types of situations. The multinomial CUSUM chart is evaluated through comparisons of performance with competing control chart methods. This research is a result of joint work with Mr. Lee J. Wells (Grado Department of Industrial and Systems Engineering, Virginia Tech) and Dr. William H. Woodall (Department of Statistics, Virginia Tech). / Ph. D.
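The generic form underlying multinomial CUSUM charts like the one proposed above is a log-likelihood-ratio accumulation over observed categories. This is a hedged sketch of that generic form, not the thesis's specific chart; the category labels and probabilities in the usage are illustrative assumptions.

```python
import math

# Generic log-likelihood-ratio CUSUM for multinomially classified items:
# each observation contributes log(p1[cat] / p0[cat]), where p0 and p1 are
# the in-control and out-of-control category probabilities.
def multinomial_cusum(categories, p0, p1, h):
    """Accumulate log(p1/p0) per observed category; return first signal index."""
    c = 0.0
    for i, cat in enumerate(categories):
        c = max(0.0, c + math.log(p1[cat] / p0[cat]))
        if c > h:
            return i
    return None
```

For example, with assumed probabilities `p0 = {'good': 0.9, 'bad': 0.1}` and `p1 = {'good': 0.7, 'bad': 0.3}`, each "bad" item adds log 3 ≈ 1.10 to the statistic, while "good" items drive it back toward zero.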
4

Automated Performance Analysis for Robotic Systems: Leveraging Statistical Analysis and Visualization Techniques

Pettersson, Elon January 2024 (has links)
Performance regression testing is a difficult task with several intricacies and complexities. In the absence of analysis tools, the analysis must be conducted manually, which is infeasible at scale. This thesis therefore proposes an automated performance analysis framework aimed at mitigating these issues. To make this possible, the adequacy of the data first had to be established, and a fault detection algorithm had to be developed. A survey of the current state of the art in performance anomaly detection shows that statistical models have been used far more often than classical machine learning or deep learning. Based on this knowledge and on the types of anomalies present in the data, a cumulative sum based statistical method is proposed. The findings demonstrate that the data is adequate for detecting faults and verifying their validity, as the faults are consistently observable across several test configurations. However, tests are not performed frequently enough, which makes it challenging to identify the exact locations of faults. The algorithm was evaluated on artificial data with injected faults and detected over 90% of anomalies when they were sufficiently prominent. Longer fault-free sequences before a deviation occurred improved the ability to detect the fault, further motivating the need to collect data more frequently. Finally, the automated performance analysis framework successfully improved the efficiency of fault detection, and the visualization features provided greater awareness of contextual data. Manual analysis can, however, detect faults with greater accuracy, so these results should be interpreted with caution.
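A cumulative sum based fault detector of the general kind described above can be sketched as follows. The thesis's exact statistic, baseline estimation, and thresholds are not reproduced here, so treat the function, its parameter names, and the numbers in the usage as assumptions.

```python
# Hedged sketch of a CUSUM-style detector for upward level shifts in a
# performance metric (e.g. run time). The baseline is estimated from the
# first few samples, assumed fault-free; the onset of the fault is
# estimated as the point where the cumulative sum last left zero.
def detect_shift(samples, drift, threshold):
    """Flag an upward level shift; return (signal_index, estimated_onset)."""
    c, last_zero = 0.0, -1
    mean0 = sum(samples[:5]) / 5  # assumed-stable baseline window
    for i, x in enumerate(samples):
        c = max(0.0, c + (x - mean0) - drift)
        if c == 0.0:
            last_zero = i
        if c > threshold:
            return i, last_zero + 1  # onset estimated where the sum left zero
    return None, None
```

This also illustrates the abstract's point about fault location: the signal index lags the true onset by several samples, and sparser sampling widens that gap.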
5

Use of Partial Cumulative Sum to Detect Trends and Change Periods in Time Series Analysis with Fuzzy Statistics

陳力揚 Unknown Date (has links)
Change points and trends are important research topics in time series analysis, economics, and finance. As the structural complexity of the objects under study grows, and because human knowledge and language are coloured by subjectivity, time, changing environments, and the angle from which events are judged, collected time series data may carry a degree of fuzziness; Zadeh [1965] introduced fuzzy set theory to address problems of this kind. Because the structural change of a time series from one pattern to another may not switch at once but rather experience a period of adjustment, conventional change point detection may be inappropriate in this setting. Furthermore, changes in time series often occur gradually, so there is a certain amount of fuzziness in the change point. For this reason, much research has focused on detecting change periods rather than points, to obtain a better-fitting model. However, a change period found in a small observation window may look like negligible noise in a larger one. In this paper, we propose an approach that detects trends and change periods with fuzzy statistics through the use of partial cumulative sums. Unlike other change period detection methods, the proposed approach lets the user control parameters so as to filter out noise and find change periods that match the user's needs, and from these change periods the start and end of a trend can be located; this avoids treating noise as a change period or a change period as noise. The use of fuzzy statistics also addresses the fuzziness of the data. Finally, simulated data and empirical examples are studied to test the approach, and the results show that its performance is satisfactory.
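The partial cumulative sums on which the proposed detection rests can be sketched in their plainest form: sums of deviations from the series mean peak near a change in level, so the neighbourhood of the extremum brackets a candidate change period. The fuzzy statistics and the tuning parameters of the thesis are not reproduced here; this is only the underlying sum.

```python
# Partial cumulative sum of deviations from the mean. For a series with a
# single level shift, |S_k| is largest near the shift, so its extremum
# locates a candidate change point (or, with a tolerance band, a change
# period).
def partial_cusum(series):
    """Return S_k = sum_{i<=k} (x_i - mean) for each k."""
    mean = sum(series) / len(series)
    s, path = 0.0, []
    for x in series:
        s += x - mean
        path.append(s)
    return path
```

By construction the final partial sum returns to zero, and for a step change the extremum of the path sits at the last pre-shift observation.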
6

Algoritmos não-paramétricos para detecção de pontos de mudança em séries temporais de alta frequência / Non-parametric change-point detection algorithms for high-frequency time series

Cardoso, Vitor Mendes 05 July 2018 (has links)
The area of econometric studies that seeks to predict the behavior of financial markets increasingly proves itself a dynamic and comprehensive research field. Within this universe, the models developed can broadly be separated into parametric and non-parametric. The present work investigates non-parametric techniques derived from CUSUM, a graphical tool based on the cumulative sum concept originally developed for production and quality control. The techniques are applied to the modeling of a high-frequency exchange rate series (USD/EUR) with many trading points within a single day.
7

CUSUM tests based on grouped observations

Eger, Karl-Heinz, Tsoy, Evgeni Borisovich 08 November 2009 (has links) (PDF)
This paper deals with CUSUM tests based on grouped or classified observations. The computation of the average run length is reduced to solving a system of simultaneous linear equations. Moreover, a corresponding approximation based on Wald's approximations for characteristics of sequential likelihood ratio tests is presented. The effect of grouping is investigated for a CUSUM test for the mean of a normal distribution based on F-optimal grouping schemes. The example considered demonstrates that highly efficient CUSUM tests can already be obtained with F-optimal grouping schemes with a small number of groups.
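The reduction of the ARL computation to linear equations can be illustrated with a small sketch. With grouped observations the CUSUM increment takes finitely many values, so when those values are integers the chart is a finite Markov chain on states 0..h-1 (states at or above h are absorbing), and the ARLs from each state solve (I - Q)L = 1. The increment values and probabilities below are illustrative assumptions, not the paper's F-optimal schemes.

```python
import numpy as np

# ARL of a CUSUM S_t = max(0, S_{t-1} + X_t) with integer-valued increments,
# signalling when S_t >= h. Q is the transition matrix over the transient
# states 0..h-1; the ARL vector L satisfies L = 1 + Q L.
def cusum_arl(increments, probs, h):
    """Average run length from state 0, solved exactly via (I - Q) L = 1."""
    Q = np.zeros((h, h))
    for s in range(h):
        for inc, p in zip(increments, probs):
            nxt = max(0, s + inc)
            if nxt < h:              # transitions to >= h are absorbing (signal)
                Q[s, nxt] += p
    L = np.linalg.solve(np.eye(h) - Q, np.ones(h))
    return L[0]
```

As a sanity check, a deterministic increment of +1 with h = 3 signals at exactly the third step, and a downward-biased increment distribution gives a long in-control ARL.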
8

Sequential and non-sequential hypertemporal classification and change detection of Modis time-series

Grobler, Trienko Lups 10 June 2013 (has links)
Satellites provide humanity with data to infer properties of the earth that were impossible a century ago. Humanity can now easily monitor the amount of ice found on the polar caps, the size of forests and deserts, the earth’s atmosphere, the seasonal variation on land and in the oceans, and the surface temperature of the earth. In this thesis, new hypertemporal techniques are proposed for the settlement detection problem in South Africa. The hypertemporal techniques are applied to study areas in the Gauteng and Limpopo provinces of South Africa. To be more specific, new sequential (windowless) and non-sequential hypertemporal techniques are implemented. The time-series employed by the new hypertemporal techniques are obtained from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor, which is on board the earth observation satellites Aqua and Terra. One MODIS dataset is constructed for each province. A Support Vector Machine (SVM) [1] that uses a novel noise-harmonic feature set is implemented to detect existing human settlements. The noise-harmonic feature set is a non-sequential hypertemporal feature set and is constructed by using the Coloured Simple Harmonic Oscillator (CSHO) [2]. The CSHO consists of a Simple Harmonic Oscillator (SHO) [3], which is superimposed on the Ornstein-Uhlenbeck process [4]. The noise-harmonic feature set is an extension of the classic harmonic feature set [5]. The classic harmonic feature set consists of a mean and a seasonal component. For the case studies in this thesis, it is observed that the noise-harmonic feature set not only extends the harmonic feature set, but also improves on its classification capability. The Cumulative Sum (CUSUM) algorithm was developed by Page in 1954 [6]. In its original form it is a sequential (windowless) hypertemporal change detection technique. Windowed versions of the algorithm have been applied in a remote sensing context.
In this thesis CUSUM is used in its original form to detect settlement expansion in South Africa and is benchmarked against the classic band differencing change detection approach of Lunetta et al., which was developed in 2006 [7]. In the case of the Gauteng study area, the CUSUM algorithm outperformed the band differencing technique. The exact opposite behaviour was seen in the case of the Limpopo dataset. Sequential hypertemporal techniques are data-intensive and an inductive MODIS simulator was therefore also developed (to augment datasets). The proposed simulator is also based on the CSHO. Two case studies showed that the proposed inductive simulator accurately replicates the temporal dynamics and spectral dependencies found in MODIS data. / Thesis (PhD(Eng))--University of Pretoria, 2012. / Electrical, Electronic and Computer Engineering / unrestricted
