91

GLR Control Charts for Monitoring a Proportion

Huang, Wandi 19 December 2011 (has links)
Generalized likelihood ratio (GLR) control charts are studied for monitoring a process proportion of defective or nonconforming items. The type of process change considered is an abrupt sustained increase in the process proportion, which implies deterioration of process quality. The objective is to effectively detect a wide range of shift sizes. In the first part of this research, we assume samples are collected using rational subgrouping with sample size n>1, and the binomial GLR statistic is constructed from a moving window of past sample statistics that follow a binomial distribution. Steady-state performance is evaluated for the binomial GLR chart and other widely used binomial charts. We find that in terms of overall performance, the binomial GLR chart is at least as good as the other charts. In addition, since it has only two charting parameters, both of which can be easily obtained using the approach we propose, less effort is required to design the binomial GLR chart for practical applications. The second part of this research develops a Bernoulli GLR chart to monitor processes under continuous inspection, in which case samples of size n=1 are observed. A constant upper bound is imposed on the estimate of the process shift, preventing the corresponding Bernoulli GLR statistic from being undefined. Performance comparisons between the Bernoulli GLR chart and the other charts show that the Bernoulli GLR chart has better overall performance than its competitors, especially for detecting small shifts. / Ph. D.
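A minimal sketch of the moving-window binomial GLR statistic described above. The function and parameter names (window, p_max) are illustrative assumptions, not taken from the thesis, and the upper bound on the shift estimate mirrors the device the abstract describes for the Bernoulli chart. In practice the two charting parameters (window size and control limit h) would be chosen to meet a target in-control ARL.

```python
import numpy as np

def binomial_glr_stat(counts, n, p0, window=50, p_max=0.999):
    """Moving-window GLR statistic for an increase in a binomial proportion.

    counts : recent defective counts, each from a rational subgroup of size n
    p0     : in-control proportion; p_max bounds the shift estimate so the
             log-likelihood ratio stays finite (hypothetical safeguard).
    """
    x = np.asarray(counts[-window:], dtype=float)
    best = 0.0
    for k in range(len(x)):                    # candidate change point
        tail = x[k:]
        p1 = tail.sum() / (n * len(tail))      # MLE of the shifted proportion
        p1 = min(max(p1, p0), p_max)           # one-sided, bounded estimate
        llr = np.sum(tail * np.log(p1 / p0)
                     + (n - tail) * np.log((1.0 - p1) / (1.0 - p0)))
        best = max(best, llr)
    return best  # the chart signals when this exceeds the control limit h
```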
92

Stopping Times Related to Trading Strategies

Abramov, Vilen 25 April 2008 (has links)
No description available.
93

A strategy for the synthesis of real-time statistical process control within the framework of a knowledge based controller

Crowe, Edward R. January 1995 (has links)
No description available.
94

Prospective Spatio-Temporal Surveillance Methods for the Detection of Disease Clusters

Marshall, J. Brooke 11 December 2009 (has links)
In epidemiology it is often useful to monitor disease occurrences prospectively to determine the location and time at which clusters of disease are forming. This aids in the prevention of illness and injury in the public and is the reason spatio-temporal disease surveillance methods are implemented. Care must be taken in the design and implementation of these types of surveillance methods so that they provide accurate information on the development of clusters. Here two spatio-temporal methods for prospective disease surveillance are considered: the local Knox monitoring method and a new wavelet-based prospective monitoring method. The local Knox surveillance method uses a cumulative sum (CUSUM) control chart for monitoring the local Knox statistic, which tests for space-time clustering each time there is an incoming observation. The detection of clusters of events occurring close together both temporally and spatially is important in finding outbreaks of disease within a specified geographic region. The local Knox surveillance method is based on the Knox statistic, which is often used in epidemiology to test for space-time clustering retrospectively. In this method, a local Knox statistic is developed for use with the CUSUM chart for prospective monitoring so that epidemics can be detected more quickly. The design of the CUSUM chart used in this method is considered by determining the in-control average run length (ARL) performance for different space and time closeness thresholds as well as for different control limit values. The effect of nonuniform population density and region shape on the in-control ARL is explained, and some issues that should be considered when implementing this method are also discussed. In the wavelet-based prospective monitoring method, a surface of incidence counts is modeled over time in the geographical region of interest. This surface is modeled using Poisson regression, where the regressors are wavelet functions from the Haar wavelet basis. The surface is estimated each time new incidence data are obtained, using both past and current observations and weighting current observations more heavily. The flexibility of this method allows for the detection of changes in the incidence surface, increases in the overall mean incidence count, and clusters of disease occurrences within individual areas of the region, through the use of control charts. This method is also able to incorporate information on population size and other covariates as they change in the geographical region over time. The control charts developed for use in this method are evaluated based on their in-control and out-of-control ARL performance, and recommendations on the most appropriate control chart to use for different monitoring scenarios are provided. / Ph. D.
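As a rough illustration of the wavelet-based half of this approach, the sketch below builds a Haar design matrix and fits a Poisson regression to a flattened grid of incidence counts. The names haar_matrix and fit_incidence_surface are hypothetical, the coarse-scale truncation is a crude stand-in for the data-driven estimation in the dissertation, and the temporal weighting of past observations is omitted; population size enters through a log offset, as is standard for Poisson rates.

```python
import numpy as np
import statsmodels.api as sm

def haar_matrix(n):
    """Non-normalized Haar transform matrix for n = 2^J cells (rows = basis)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    return np.vstack([np.kron(h, [1.0, 1.0]),                 # coarse scales
                      np.kron(np.eye(n // 2), [1.0, -1.0])])  # finest details

def fit_incidence_surface(counts, log_pop=None):
    """Poisson regression of flattened grid counts on Haar wavelet regressors."""
    n = len(counts)
    X = haar_matrix(n).T[:, : n // 2]   # keep coarse scales only (crude smoothing)
    model = sm.GLM(counts, X, family=sm.families.Poisson(), offset=log_pop)
    return model.fit().fittedvalues     # estimated incidence surface
```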
95

Surveillance of Negative Binomial and Bernoulli Processes

Szarka, John Louis III 03 May 2011 (has links)
Discrete processes are evaluated in both industrial and healthcare settings. Count data may be used to measure the number of defective items in industrial applications or the incidence of a certain disease at a health facility. Another classification of a discrete random variable is for binary data, where information on an item can be classified as conforming or nonconforming in a manufacturing context, or a patient's status of having a disease in health-related applications. The first phase of this research uses discrete count data modeled from the Poisson and negative binomial distributions in a healthcare setting. Syndromic counts are currently monitored by the BioSense program within the Centers for Disease Control and Prevention (CDC) to provide real-time biosurveillance. The Early Aberration Reporting System (EARS) uses recent baseline information comparatively with a current day's syndromic count to determine if outbreaks may be present. An adaptive threshold method is proposed based on fitting baseline data to a parametric distribution, then calculating an upper-tailed p-value. These statistics are then converted to approximately standard normal random variables. Monitoring is examined for independent and identically distributed data as well as data following several seasonal patterns. An exponentially weighted moving average (EWMA) chart is also used for these methods. The effectiveness of these methods in detecting simulated outbreaks is evaluated in several sensitivity analyses. The second phase of research explored in this dissertation considers information that can be classified as a binary event. In industry, it is desirable to have the probability of a nonconforming item, p, be extremely small. Traditional Shewhart charts, such as the p-chart, are not reliable for monitoring this type of process. A comprehensive literature review of control chart procedures for this type of process is given. The equivalence between two cumulative sum (CUSUM) charts, based on geometric and Bernoulli random variables, is explored. An evaluation of the unit and group-runs (UGR) chart is performed, where it is shown that the in-control behavior of this chart is quite misleading and that the chart should not be recommended to practitioners. / Ph. D.
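A sketch of the adaptive threshold idea from the first phase, assuming method-of-moments fitting and hypothetical names (the dissertation may use other estimators): the baseline window is fit to a negative binomial, falling back to Poisson when there is no overdispersion, the current count's upper-tailed p-value is computed, and the result is mapped to an approximately standard normal score that can be monitored directly or fed into an EWMA chart.

```python
import numpy as np
from scipy import stats

def adaptive_threshold_z(baseline, today):
    """Convert today's syndromic count into an approximate z-score.

    baseline : recent in-control counts; today : the current day's count.
    """
    m, v = np.mean(baseline), np.var(baseline, ddof=1)
    if v > m:                                     # overdispersed: negative binomial
        r = m * m / (v - m)                       # method-of-moments estimates
        p = r / (r + m)
        pval = stats.nbinom.sf(today - 1, r, p)   # P(X >= today)
    else:                                         # otherwise fall back to Poisson
        pval = stats.poisson.sf(today - 1, m)
    return stats.norm.isf(pval)                   # ~N(0,1) when in control
```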
96

Quickest spectrum sensing with multiple antennas: performance analysis in various fading channels.

Hanafi, Effariza binti January 2014 (has links)
Traditional wireless networks are regulated by a fixed spectrum assignment policy. This results in situations where most of the allocated radio spectrum is not utilized. In order to address this spectrum underutilization, cognitive radio (CR) has emerged as a promising solution. Spectrum sensing is an essential component in CR networks to discover spectrum opportunities. The most common spectrum sensing techniques are energy detection, matched filtering, and cyclostationary feature detection, which aim to maximize the probability of detection subject to a certain false alarm rate. Besides probability of detection, detection delay is also a crucial criterion in spectrum sensing. In an interweave CR network, quick detection of the absence of the primary user (PU), the owner of the licensed spectrum, allows good utilization of unused spectrum, while quick detection of PU transmission is important to avoid any harmful interference. This thesis considers quickest spectrum sensing, where the aim is to detect the PU with minimal detection delay subject to a certain false alarm rate. In the earlier chapters of this thesis, a single-antenna cognitive user (CU) is considered and we study quickest spectrum sensing performance in a Gaussian channel and in classical fading channel models, including Rayleigh, Rician, Nakagami-m and a long-tailed channel. We prove that the power of the complex received signal is a sufficient statistic and derive the probability density function (pdf) of the received signal amplitude for all of the fading cases. The novel derivation of the pdfs of the amplitude of the received signal for the Rayleigh, Rician and Nakagami-m channels uses an approach which avoids numerical integration. We also consider the event of a mis-matched channel, where the cumulative sum (CUSUM) detector is designed for a specific channel but a different channel is experienced. This scenario could occur in a CR network because the channel may not be known. Simulation results illustrate that the average detection delay depends greatly on the channel but very little on the nature of the detector. Hence, the simplest time-invariant detector can be employed with minimal performance loss. Theoretical expressions for the distribution of detection delay for the time-invariant CUSUM detector with a single-antenna CU are developed. These are useful for a more detailed analysis of the quickest spectrum sensing performance. We present several techniques to approximate the distribution of detection delay, including deriving a novel closed-form expression for the detection delay distribution when the received signal experiences a Gaussian channel. We also derive novel approximations for the distribution of detection delay for the general case due to the absence of a general framework. Most of the techniques are general and can be applied to any independent and identically distributed (i.i.d.) channel. Results show that different signal-to-noise ratio (SNR) and detection delay conditions require different methods in order to achieve good approximations of the detection delay distributions. The remarkably simple Brownian motion approach gives the best approximation for longer detection delays. In addition, results show that the type of fading channel has very little impact on long detection delays. In later chapters of this thesis, we employ multiple receive antennas at the CU.
In particular, we study the performance of multi-antenna quickest spectrum sensing when the received signal experiences Gaussian, independent and correlated Rayleigh and Rician channels. The pdfs of the received signals required to form the CUSUM detector are derived for each of the scenarios. The extension into multiple antennas allows us to gain some insight into the reduction in detection delay that multiple antennas can provide. Results show that the sensing performance increases with an increasing Rician K-factor. In addition, channel correlation has little impact on the sensing performance at high SNR, whereas at low SNR, increasing correlation between channels improves the quickest spectrum sensing performance. We also consider mis-matched channel conditions and show that the quickest spectrum sensing performance at a particular correlation coefficient or Rician K-factor depends heavily on the true channel irrespective of the number of antennas at the CU and is relatively insensitive to the channel used to design the CUSUM detector. Hence, a simple multi-antenna time-invariant detector can be employed. Based on the results obtained in the earlier chapters, we derive theoretical expressions for the detection delay distribution when multiple receive antennas are employed at the CU. In particular, the approximation of the detection delay distribution is based on the Brownian motion approach.
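For the Gaussian-channel case, where the abstract notes that the power of the complex received signal is a sufficient statistic, the CUSUM detector admits a closed-form log-likelihood ratio: the power sample is exponentially distributed with mean equal to the noise power under H0 and to noise-plus-signal power under H1. A sketch with illustrative names; the threshold h would be set by the allowed false alarm rate.

```python
import numpy as np

def cusum_quickest_sensing(power, s0, s1, h):
    """Page's CUSUM on received power samples |x_t|^2 (Gaussian channel).

    s0 : mean power under H0 (noise only); s1 : mean power under H1 (PU present).
    Returns the detection time, or None if no alarm is raised.
    """
    w = 0.0
    for t, y in enumerate(power, start=1):
        llr = np.log(s0 / s1) + y * (1.0 / s0 - 1.0 / s1)  # exponential LLR
        w = max(0.0, w + llr)                              # CUSUM recursion
        if w > h:
            return t
    return None
```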
97

Different statistical procedures for detecting non-stationarity in precipitation series

Charette, Kevin 04 1900 (has links)
The main goal of this master's thesis is to determine whether the summer convective precipitations simulated by the Canadian Regional Climate Model (CRCM) are stationary over time. To answer this question, we propose both a frequentist and a Bayesian statistical methodology. For the frequentist approach, we used standard quality control as well as the CUSUM to determine whether the mean has increased over the years. For the Bayesian approach, we compared the posterior distributions of the precipitations over time: we modeled the posterior density of a given period and compared it with the posterior density of another period further removed in time. To make the comparison, we used statistics based on the Hellinger distance, the J-divergence, and the L2 norm. Throughout this thesis, we used the ARL (average run length) to calibrate and to compare each of our tools, so a large part of the thesis is dedicated to the study of the ARL. Once the tools were calibrated, we used simulations to compare them. Finally, we analyzed the CRCM data to determine whether or not they are stationary.
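The three comparison statistics named in the abstract are straightforward to compute once the two posterior densities have been evaluated on a common grid with spacing dx. A minimal sketch with illustrative names; in the thesis the alarm thresholds for these statistics are calibrated through the ARL.

```python
import numpy as np

def hellinger(p, q, dx):
    """Hellinger distance between two densities on a common grid."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2) * dx)

def j_divergence(p, q, dx, eps=1e-12):
    """Symmetrized Kullback-Leibler (J) divergence."""
    p, q = p + eps, q + eps                  # guard against log(0)
    return np.sum((p - q) * np.log(p / q)) * dx

def l2_distance(p, q, dx):
    """L2 norm of the difference between the two densities."""
    return np.sqrt(np.sum((p - q) ** 2) * dx)
```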
98

A unified approach to structural change tests based on F statistics, OLS residuals, and ML scores

Zeileis, Achim January 2005 (has links) (PDF)
Three classes of structural change tests (or tests for parameter instability), which have received much attention in both the statistics and econometrics communities but have been developed in rather loosely connected lines of research, are unified by embedding them into the framework of generalized M-fluctuation tests (Zeileis and Hornik, 2003). These classes are tests based on F statistics (supF, aveF, expF tests), on OLS residuals (OLS-based CUSUM and MOSUM tests) and on maximum likelihood scores (including the Nyblom-Hansen test). We show that (representatives from) these classes are special cases of the generalized M-fluctuation tests, based on the same functional central limit theorem but employing different functionals for capturing excessive fluctuations. After embedding these tests into the same framework, and thus understanding the relationship between these procedures for testing in historical samples, it is shown how the tests can also be extended to a monitoring situation. This is achieved by establishing a general M-fluctuation monitoring procedure and then applying the different functionals corresponding to monitoring with F statistics, OLS residuals and ML scores. In particular, an extension of the supF test to a monitoring scenario is suggested and illustrated on a real-world data set. / Series: Research Report Series / Department of Statistics and Mathematics
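As a concrete instance of one of these classes, the OLS-based CUSUM test scales the cumulative sums of OLS residuals so that, under the null of parameter stability, the empirical fluctuation process behaves asymptotically like a Brownian bridge; its supremum is then compared with a boundary (about 1.358 at the 5% level for the sup of a Brownian bridge). A sketch with illustrative names; Zeileis's own R package strucchange implements the full framework.

```python
import numpy as np

def ols_cusum(y, X):
    """OLS-based CUSUM: empirical fluctuation process and its sup statistic."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta                           # OLS residuals
    n, k = X.shape
    sigma = np.sqrt(u @ u / (n - k))           # residual standard error
    W = np.cumsum(u) / (sigma * np.sqrt(n))    # ~ Brownian bridge under H0
    return W, np.max(np.abs(W))                # reject for large sup |W|
```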
99

Analysis of Taiwan Stock Exchange high frequency transaction data

Hsu, Chia-Hao 06 July 2012 (has links)
The Taiwan securities market is a typical order-driven market. Its electronic trading system, launched in 1998, significantly reduced the trade matching time (the current matching time is around 20 seconds) and promptly provides updated online trading information to traders. In this study, we establish an online transaction simulation system which can be applied to predict trade prices and study market efficiency. Models are established for the times and volumes of the newly added bid/ask orders on the match list. The exponentially weighted moving average (EWMA) method is adopted to update the model parameters. Match prices are predicted dynamically based on the EWMA-updated models. Further, high-frequency bid/ask order data are used to find the supply and demand curves as well as the equilibrium prices. Differences between the transaction prices and the equilibrium prices are used to investigate the efficiency of the Taiwan securities market. Finally, EWMA and CUSUM control charts are used to monitor market efficiency. In the empirical study, we analyze intra-daily high-frequency match data (April 2005) for Uni-President Enterprises Corporation and Formosa Plastics Corporation.
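A sketch of the EWMA control chart used here for monitoring, with the usual asymptotic control limits; the names and defaults (lam, L) are illustrative, not taken from the thesis. The same lam-weighted recursion underlies the EWMA updating of the bid/ask order model parameters.

```python
import numpy as np

def ewma_chart(x, mu0, sigma, lam=0.1, L=3.0):
    """EWMA chart: return the index of the first out-of-control signal, if any."""
    z = mu0
    half_width = L * sigma * np.sqrt(lam / (2.0 - lam))  # asymptotic limits
    for t, xt in enumerate(x, start=1):
        z = lam * xt + (1.0 - lam) * z                   # EWMA recursion
        if abs(z - mu0) > half_width:
            return t
    return None
```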