61

New control charts for monitoring univariate autocorrelated processes and high-dimensional profiles

Lee, Joongsup 18 August 2011 (has links)
In this thesis, we first investigate the use of automated variance estimators in distribution-free statistical process control (SPC) charts for univariate autocorrelated processes. We introduce two variance estimators---the standardized time series overlapping area estimator and the so-called quick-and-dirty autoregressive estimator---that can be obtained from a training data set and used effectively with distribution-free SPC charts when those charts are applied to processes exhibiting nonnormal responses or correlation between successive responses. In particular, we incorporate the two estimators into DFTC-VE, a new distribution-free tabular CUSUM chart developed for autocorrelated processes; and we compare its performance with other state-of-the-art distribution-free SPC charts. Using either of the two variance estimators, DFTC-VE outperforms its competitors in terms of both in-control and out-of-control average run lengths when all the competing procedures are tested on the same set of independently sampled realizations of selected autocorrelated processes with normal or nonnormal noise components.

Next, we develop WDFTC, a wavelet-based distribution-free CUSUM chart for detecting shifts in the mean of a high-dimensional profile whose noisy components may exhibit nonnormality, variance heterogeneity, or correlation between profile components. A profile describes the relationship between a selected quality characteristic and an input (design) variable over the experimental region. Exploiting a discrete wavelet transform (DWT) of the mean in-control profile, WDFTC selects a reduced-dimension vector of the associated DWT components from which the mean in-control profile can be approximated with minimal weighted relative reconstruction error. Based on randomly sampled Phase I (in-control) profiles, the covariance matrix of the corresponding reduced-dimension DWT vectors is estimated using a matrix-regularization method; then the DWT vectors are aggregated (batched) so that the nonoverlapping batch means of the reduced-dimension DWT vectors have manageable covariances. To monitor shifts in the mean profile during Phase II operation, WDFTC computes a Hotelling's T-squared-type statistic from successive nonoverlapping batch means and applies a CUSUM procedure to those statistics, where the associated control limits are evaluated analytically from the Phase I data. We compare WDFTC with other state-of-the-art profile-monitoring charts using both normal and nonnormal noise components having homogeneous or heterogeneous variances as well as independent or correlated components; and we show that WDFTC performs well, especially for local shifts of small to medium size, in terms of both in-control and out-of-control average run lengths.
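To make the CUSUM machinery concrete, the sketch below shows a generic two-sided tabular CUSUM driven by a variance parameter estimated from Phase I (training) data. It is only an illustration of the chart type discussed above, not the author's DFTC-VE or its specific variance estimators; the batch-means estimator and all parameter values are illustrative assumptions.

```python
import numpy as np

def batch_means_variance(train, batch_size=50):
    """Variance-parameter estimate from Phase I data via nonoverlapping batch means
    (a generic stand-in for the estimators discussed in the abstract)."""
    n_batches = len(train) // batch_size
    means = train[:n_batches * batch_size].reshape(n_batches, batch_size).mean(axis=1)
    # Scale the variance of the batch means back to the per-observation level.
    return batch_size * means.var(ddof=1)

def tabular_cusum(x, mu0, sigma2, k=0.5, h=5.0):
    """Two-sided tabular CUSUM; k (reference value) and h (decision interval)
    are expressed in units of the estimated standard deviation."""
    sigma = np.sqrt(sigma2)
    hi = lo = 0.0
    alarms = []
    for t, xt in enumerate(x):
        z = (xt - mu0) / sigma
        hi = max(0.0, hi + z - k)
        lo = max(0.0, lo - z - k)
        if hi > h or lo > h:
            alarms.append(t)
            hi = lo = 0.0  # restart the chart after it signals
    return alarms

rng = np.random.default_rng(1)
train = rng.normal(0.0, 1.0, 2000)                  # Phase I (in-control) data
test = np.concatenate([rng.normal(0.0, 1.0, 300),
                       rng.normal(1.0, 1.0, 300)])  # mean shift at t = 300
print(tabular_cusum(test, mu0=train.mean(), sigma2=batch_means_variance(train)))
```

For an autocorrelated process, the quantity being estimated is the long-run variance (the sum of the autocovariances), which is what the thesis's estimators target more carefully than this simple batch-means stand-in.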
62

Statistical Detection of Model Changes in Dynamic Systems - Application to the Supervision of Biological Depollution Processes

Verdier, Ghislain 30 November 2007 (has links) (PDF)
This thesis considers the problem of detecting model changes in complex dynamic systems. The objective is to develop statistical methods capable of detecting a change in a parameter of the model describing the system as quickly as possible, while keeping a low false-alarm rate. Methods of this type apply to anomaly or fault detection in many systems (navigation systems, quality control, etc.).

The methods developed here take into account the characteristics of biological depollution processes, which constitute the main application of this work. A CUSUM-type procedure built from estimates of the conditional likelihoods handles, on the one hand, the case where part of the model is unknown, using a nonparametric approach to estimate that part, and, on the other hand, the case frequently encountered in practice where the system is observed only indirectly. For this second case, particle-filtering approaches are used.

Optimality results are established for the proposed approaches, which are then applied to a real problem: a bioreactor for wastewater treatment.
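As a rough sketch of the underlying idea (a CUSUM built from conditional likelihoods), the Python fragment below monitors an AR(1) process for a change in its autoregressive coefficient. The thesis additionally handles unknown model parts nonparametrically and indirectly observed states via particle filtering, neither of which is shown here; the AR(1) model, coefficients, and threshold are illustrative assumptions.

```python
import numpy as np

def loglik_ratio_cusum(x, cond_loglik_0, cond_loglik_1, threshold):
    """One-sided CUSUM on conditional log-likelihood ratios.

    cond_loglik_0 / cond_loglik_1 return log p(x_t | x_{t-1}) under the pre- and
    post-change parameters; here both are assumed known, whereas the thesis
    estimates the unknown parts nonparametrically."""
    g = 0.0
    for t in range(1, len(x)):
        g = max(0.0, g + cond_loglik_1(x[t], x[t-1]) - cond_loglik_0(x[t], x[t-1]))
        if g > threshold:
            return t  # alarm time
    return None

# AR(1) example: the autoregressive coefficient changes from 0.3 to 0.8 at t = 400.
def make_ar1_loglik(phi, sigma=1.0):
    # Constant terms cancel in the ratio, so only the quadratic part is kept.
    return lambda xt, xp: -0.5 * ((xt - phi * xp) / sigma) ** 2

rng = np.random.default_rng(0)
x = np.empty(600)
x[0] = 0.0
for t in range(1, 600):
    phi = 0.3 if t < 400 else 0.8
    x[t] = phi * x[t-1] + rng.normal()
print(loglik_ratio_cusum(x, make_ar1_loglik(0.3), make_ar1_loglik(0.8), threshold=10.0))
```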
63

Sensor Validation Using Linear Parametric Models, Artificial Neural Networks and CUSUM / Sensorvalidering medelst linjära konfektionsmodeller, artificiella neurala nätverk och CUSUM

Norman, Gustaf January 2015 (has links)
Siemens gas turbines are monitored and controlled by a large number of sensors and actuators. Process information is stored in a database and used for offline calculations and analyses. Before the sensor readings are stored, a compression algorithm checks the signal and skips values that represent no significant change. Compression of 90% is not unusual. Since data from the database is used for analyses, and decisions are made upon the results of these analyses, it is important to have a system for validating the data in the database. Decisions made on false information can result in large economic losses. When this project was initiated, no sensor validation system was available. In this thesis the uncertainties in measurement chains are revealed. Methods for fault detection are investigated, and finally the most promising methods are put to the test. Linear relationships between redundant sensors are derived, and the residuals form an influence structure allowing the faulty sensor to be isolated. Where redundant sensors are not available, a gas turbine model is utilized to state the input-output relationships so that estimates of the sensor outputs can be formed. Linear parametric models and an ANN (Artificial Neural Network) are developed to produce the estimates. Two techniques for the linear parametric models are evaluated: prediction and simulation. The residuals are also evaluated in two ways: direct evaluation against a threshold and evaluation with the CUSUM (CUmulative SUM) algorithm. The results show that sensor validation using compressed data is feasible. Faults as small as 1% of the measuring range can be detected in many cases.
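A minimal sketch of the redundant-sensor idea and the two residual-evaluation strategies mentioned above (direct thresholding versus CUSUM). The linear relation, noise levels, fault size, and chart parameters are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def fit_redundant_relation(a, b):
    """Least-squares linear relation b ~ k*a + m between two redundant sensors."""
    k, m = np.polyfit(a, b, 1)
    return k, m

def evaluate_residual(a, b, k, m, direct_limit, cusum_k, cusum_h):
    """Compare direct thresholding of the residual with a two-sided CUSUM."""
    r = b - (k * a + m)
    direct_alarms = np.flatnonzero(np.abs(r) > direct_limit)
    hi = lo = 0.0
    cusum_alarms = []
    for t, rt in enumerate(r):
        hi = max(0.0, hi + rt - cusum_k)
        lo = max(0.0, lo - rt - cusum_k)
        if hi > cusum_h or lo > cusum_h:
            cusum_alarms.append(t)
            hi = lo = 0.0
    return direct_alarms, cusum_alarms

rng = np.random.default_rng(2)
a = rng.normal(500.0, 20.0, 1000)                 # healthy reference sensor
b = 1.02 * a + 3.0 + rng.normal(0.0, 0.5, 1000)   # redundant sensor
b[600:] += 2.0                                    # small offset fault from t = 600
k, m = fit_redundant_relation(a[:500], b[:500])
direct, cusum = evaluate_residual(a, b, k, m, direct_limit=2.5, cusum_k=0.25, cusum_h=5.0)
print(direct[:5], cusum[:5])
```

The point of the comparison is that the CUSUM accumulates evidence over time and therefore flags small persistent offsets that rarely exceed a fixed residual threshold.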
64

Diagnosis of a Truck Engine using Nonlinear Filtering Techniques

Nilsson, Fredrik January 2007 (has links)
Scania CV AB is a large manufacturer of heavy-duty trucks that, with increasingly strict emission legislation, has a rising demand for an effective On-Board Diagnosis (OBD) system. One idea for improving the OBD system is to employ a model for the construction of an observer-based diagnosis system. Because the model is nonlinear, this report proposes using a nonlinear filtering method to improve the needed state estimates. Two nonlinear filters are tested, the Particle Filter (PF) and the Extended Kalman Filter (EKF). The primary objective is to evaluate the use of the PF for Fault Detection and Isolation (FDI), and to compare the results against the use of the EKF. With the information provided by the PF and the EKF, two residual-based diagnosis systems and two likelihood-based diagnosis systems are created. The results with the PF and the EKF are evaluated for both types of systems using real measurement data. It is shown that the four systems give approximately equal results for FDI, with the exception that the PF is more computationally demanding than the EKF. There are, however, some indications that the PF, due to the nonlinearities, could offer more if enough CPU time is available.
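For illustration, here is a bare-bones bootstrap particle filter of the kind compared against the EKF in such work. The toy state-space model, noise levels, and particle count are assumptions made for the sketch and are unrelated to Scania's engine model.

```python
import numpy as np

def bootstrap_particle_filter(y, f, h, q_std, r_std, n_particles=500, seed=0):
    """Bootstrap particle filter for x_{t+1} = f(x_t) + q,  y_t = h(x_t) + r.

    Returns filtered state estimates; a simple sketch of the PF used for
    residual generation, not the diagnosis system described in the thesis."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = np.empty(len(y))
    for t, yt in enumerate(y):
        # Importance weights from the Gaussian measurement likelihood.
        w = np.exp(-0.5 * ((yt - h(particles)) / r_std) ** 2)
        w /= w.sum()
        estimates[t] = np.dot(w, particles)
        # Multinomial resampling, then propagation through the process model.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = f(particles[idx]) + rng.normal(0.0, q_std, n_particles)
    return estimates

# Toy nonlinear system; the residual y - h(x_hat) could feed a diagnosis test.
f = lambda x: 0.9 * x + 0.5 * np.sin(x)
h = lambda x: x ** 2 / 20.0
rng = np.random.default_rng(1)
x = np.zeros(200)
y = np.zeros(200)
for t in range(1, 200):
    x[t] = f(x[t-1]) + rng.normal(0.0, 0.3)
    y[t] = h(x[t]) + rng.normal(0.0, 0.1)
x_hat = bootstrap_particle_filter(y, f, h, q_std=0.3, r_std=0.1)
```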
65

Causes and effects of U.S. military expenditures (time-series models and applications)

Chung, Sam-man, January 1996 (has links)
Thesis (Ph. D.)--University of Missouri-Columbia, 1996. / Typescript. Vita. Includes bibliographical references (leaves 438-450). Also available on the Internet.
67

Statistical Monitoring and Control of Locally Proactive Routing Protocols in MANETs

January 2012 (has links)
Mobile ad hoc networks (MANETs) have attracted attention for mission critical applications. This dissertation investigates techniques of statistical monitoring and control for overhead reduction in a proactive MANET routing protocol. Proactive protocols transmit overhead periodically. Instead, we propose that the local conditions of a node should determine this transmission decision. While the goal is to minimize overhead, a balance in the amount of overhead transmitted and the performance achieved is required.

Statistical monitoring consists of techniques to determine if a characteristic has shifted away from an in-control state. A basic tool for monitoring is a control chart, a time-oriented representation of the characteristic. When a sample deviates outside control limits, a significant change has occurred and corrective actions are required to return to the in-control state.

We investigate the use of statistical monitoring of local conditions in the Optimized Link State Routing (OLSR) protocol. Three versions are developed. In A-OLSR, each node uses a Shewhart chart to monitor the betweenness of its two-hop neighbourhood. Betweenness is a social network metric that measures a node's influence; betweenness is larger when a node has more influence. Changes in topology are associated with changes in betweenness. In A+-OLSR, we incorporate additional local node conditions including speed, density, packet arrival rate, and the number of flows a node forwards. Response Surface Methodology (RSM) is used to optimize timer values. As well, the Shewhart chart is replaced by an Exponentially Weighted Moving Average (EWMA) chart, which is more sensitive to small changes in the characteristic (see the sketch after this abstract). It is known that control charts do not work as well in the presence of correlation. Hence, in A*-OLSR the autocorrelation in the time series is removed and an Auto-Regressive Integrated Moving Average (ARIMA) model is found; this removes the dependence on node speed. A*-OLSR also extends monitoring to two characteristics concurrently using multivariate cumulative sum (MCUSUM) charts.

The protocols are evaluated in simulation and compared to OLSR and its variants. The techniques for statistical monitoring and control are general and have great potential to be applied to the adaptive control of many network protocols. / Dissertation/Thesis / Ph.D. Computer Science 2012
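A generic sketch of the EWMA chart mentioned for A+-OLSR, monitoring a characteristic such as betweenness for small sustained shifts. The smoothing constant, control-limit width, and simulated data are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

def ewma_chart(x, mu0, sigma, lam=0.2, L=3.0):
    """EWMA chart: z_t = lam*x_t + (1-lam)*z_{t-1}, with time-varying limits
    mu0 +/- L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)^(2t))).

    A generic sketch of the chart type used in A+-OLSR, not the protocol itself."""
    z = mu0
    alarms = []
    for t, xt in enumerate(x, start=1):
        z = lam * xt + (1.0 - lam) * z
        width = L * sigma * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * t)))
        if abs(z - mu0) > width:
            alarms.append(t)
    return alarms

rng = np.random.default_rng(3)
betweenness = np.concatenate([rng.normal(10.0, 1.0, 200),
                              rng.normal(10.6, 1.0, 200)])  # small sustained shift
print(ewma_chart(betweenness, mu0=10.0, sigma=1.0))
```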
68

Detection of abnormal situations and energy efficiency control in Heating Ventilation and Air Conditioning (HVAC) systems

Sklavounos, Dimitris C. January 2015 (has links)
This research is related to the control of energy consumption and efficiency in building Heating Ventilation and Air Conditioning (HVAC) systems and is primarily concerned with controlling the heating function. The main goal of this thesis is to develop a control system that can achieve the following two main control functions: a) detection of unexpected indoor conditions that may result in unnecessary power consumption, and b) energy efficiency control, meaning the optimal balancing of two parameters: the energy consumption required for heating versus the thermal comfort of the occupants. Methods of both orientations were developed in a multi-zone space composed of nine zones, where each zone is equipped with a wireless node consisting of temperature and occupancy sensors, and the scattered nodes together form a wireless sensor network (WSN). The main methods of both control functions utilize the potential of a deterministic subspace identification (SID) predictive model, which provides the predicted temperature of the zones.

In the main method for detecting unexpected situations that can directly affect the thermal condition of the indoor space and cause energy consumption (abnormal situations), the predicted temperature from the SID model is compared with the real temperature, so that temperature deviations indicating unexpected situations are detected. The method successfully detects two situations: high infiltration gain due to unexpected cold-air intake from the external surroundings through unforeseen openings (windows, exterior doors, open ceilings, etc.), and high heat gain due to the onset of fire. With the support of the Cumulative Sum (CUSUM) statistical algorithm for abrupt-change detection, temperature deviations are detected accurately within a very short time. In an initial approach, the CUSUM algorithm is first evaluated for detecting power diversions due to the above situations caused by the aforementioned exogenous factors.

The predicted temperature of a zone from the SID model is also utilized by the main method of the second control function, energy efficiency control. The time needed for the temperature of a zone to reach the thermal comfort threshold from a low initial value is measured from the predicted temperature evolution, and this measurement forms the basis of a control criterion for deciding whether to apply proactive heating to unoccupied zones. Additional key inputs to the control criterion are the occupation times of the zones and the remaining time of the occupants in the occupied zones. Two scenarios are examined: the first with two adjacent zones, one occupied and one not, and the second with a multi-zone space where the occupants move through the zones in a cascade mode. Gamma and Pareto probability distributions model the occupation times in the two-zone scenario, while an exponential distribution models the cascade scenario as the least favorable case. The mobility of the occupants is modeled with a semi-Markov process, and the method provides satisfactory and reasonable results. In an initial approach, the proactive heating of the zones is evaluated with specific algorithms that appropriately handle the occupation time in the zones.
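A hypothetical sketch of the proactive-heating criterion described above: the predicted warm-up trajectory (which in the thesis comes from the SID model) is used to estimate the time needed to reach the comfort threshold, and heating of an unoccupied zone starts only when that time is at least the time remaining before occupancy. The function names, warm-up curve, and thresholds below are assumptions made for illustration.

```python
import numpy as np

def time_to_comfort(predicted_temps, comfort_threshold, step_minutes=5):
    """Minutes until the predicted zone temperature first reaches the comfort
    threshold; predicted_temps would come from the SID model in the thesis."""
    above = np.flatnonzero(np.asarray(predicted_temps) >= comfort_threshold)
    return None if above.size == 0 else above[0] * step_minutes

def should_preheat(predicted_temps, comfort_threshold, minutes_until_occupied):
    """Start heating an unoccupied zone only if warming up takes at least as
    long as the time remaining before occupants are expected to arrive."""
    t_heat = time_to_comfort(predicted_temps, comfort_threshold)
    return t_heat is not None and t_heat >= minutes_until_occupied

# Illustrative predicted warm-up curve (deg C) at 5-minute steps.
predicted = 15.0 + 6.0 * (1.0 - np.exp(-0.05 * np.arange(48)))
print(should_preheat(predicted, comfort_threshold=20.0, minutes_until_occupied=60))
```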
69

strucchange: An R Package for Testing for Structural Change in Linear Regression Models

Kleiber, Christian, Hornik, Kurt, Leisch, Friedrich, Zeileis, Achim 01 1900 (has links) (PDF)
This paper reviews tests for structural change in linear regression models from the generalized fluctuation test framework as well as from the F test (Chow test) framework. It introduces a unified approach for implementing these tests and presents how these ideas have been realized in an R package called strucchange. Enhancing the standard significance test approach, the package contains methods to fit, plot and test empirical fluctuation processes (like CUSUM, MOSUM and estimates-based processes) and to compute, plot and test sequences of F statistics with the supF, aveF and expF tests. Thus, it makes powerful tools available to display information about structural changes in regression relationships and to assess their significance. Furthermore, it is described how incoming data can be monitored.
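strucchange itself is an R package; purely as a language-neutral illustration of one of the tests it implements, the following Python sketch computes an OLS-based CUSUM empirical fluctuation process and compares its maximum absolute value with the asymptotic 5% boundary for the supremum of a Brownian bridge (about 1.358). This is a sketch of the underlying idea, not the package's API.

```python
import numpy as np

def ols_cusum_test(y, X, crit=1.358):
    """OLS-based CUSUM empirical fluctuation process for a linear regression.

    Returns the process and whether its maximum absolute value exceeds the
    asymptotic 5% critical value of the supremum of a Brownian bridge."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta                               # OLS residuals
    sigma = u.std(ddof=X.shape[1])
    process = np.cumsum(u) / (sigma * np.sqrt(n))  # empirical fluctuation process
    return process, np.abs(process).max() > crit

# Mean shift halfway through a constant-mean model (y ~ 1).
rng = np.random.default_rng(4)
y = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(1.0, 1.0, 100)])
X = np.ones((200, 1))
proc, reject = ols_cusum_test(y, X)
print(reject)
```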
70

The application of frequency domain methods to two statistical problems

Potgieter, Gert Diedericks Johannes 10 September 2012 (has links)
D.Phil. / We propose solutions to two statistical problems using the frequency domain approach to time series analysis. In both problems the data at hand can be described by the well-known signal-plus-noise model. The first problem addressed is the estimation of the underlying variance of a process, for use in a Shewhart or CUSUM control chart, when the mean of the process may be changing. We propose an estimator for the underlying variance based on the periodogram of the observed data. Such estimators have properties which make them superior to some estimators currently used in Statistical Quality Control. We also present a CUSUM chart for monitoring the variance which is based upon the periodogram-based estimator of the variance. The second problem, stimulated by a specific problem in Variable Star Astronomy, is to test whether or not the mean of a bivariate time series is constant over the span of observations. We consider two periodogram-based tests for constancy of the mean, derive their asymptotic distributions under the null hypothesis and under local alternatives, and show how consistent estimators for the unknown parameters in the proposed model can be found.
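The idea behind a periodogram-based variance estimator can be sketched as follows: for a series whose mean shifts slowly or in steps, the mean changes load mainly onto the lowest-frequency periodogram ordinates, so averaging the remaining ordinates estimates the noise variance with little bias from the changing mean. The skip fraction and simulated data below are illustrative assumptions, not the estimator derived in the thesis.

```python
import numpy as np

def periodogram_variance(x, skip_fraction=0.1):
    """Estimate the noise variance from periodogram ordinates, ignoring the
    lowest frequencies where shifts in the process mean concentrate."""
    n = len(x)
    periodogram = np.abs(np.fft.rfft(x)) ** 2 / n   # I(f_j), j = 0 .. n/2
    j_min = max(1, int(skip_fraction * len(periodogram)))
    return periodogram[j_min:].mean()

rng = np.random.default_rng(5)
# White noise with variance 4 around a mean that shifts mid-series.
x = np.concatenate([rng.normal(0.0, 2.0, 500), rng.normal(3.0, 2.0, 500)])
print(periodogram_variance(x), x.var(ddof=1))  # periodogram estimate vs naive variance
```

In this example the naive sample variance is inflated by the mean shift, while the periodogram-based estimate stays close to the true noise variance, which is exactly the property needed when the chart's centre line may be drifting.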
