11
Monitoramento de processo seis sigma por gráficos de controle de Shewhart / Monitoring of six sigma process by Shewhart control charts
Marques, Caio Augusto Nunes, 02 August 2013 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / Developed at Motorola in 1987, the Six Sigma methodology seeks, by reducing the variability of key processes, to obtain critical-to-quality characteristics (CTQs) with defect probabilities close to zero. A process is Six Sigma when the distance between the CTQ's target value and its nearest specification limit is equal to or greater than six standard deviations (σ). In practice, however much attention is paid to the process, the mean of the CTQ's probability distribution may shift by as much as 1.5σ from the target value and the process will still be considered Six Sigma. There is therefore an interval between 4.5 and 6σ within which the process can vary without losing the quality level regarded as world class. This study aimed to establish recommendations for designing Shewhart X̄ and R control charts for monitoring Six Sigma processes. To this end, a reference performance was defined in which the joint false-alarm probability was required to be at most 0.01, and the joint true-alarm probability was required to grow as the process sigma level falls: from 0 for 6σ processes to 0.10 for 5σ processes, reaching 0.90 for 4.5σ processes and approaching one for processes at 3σ and below. Designs combining n = 2, 3, 4 and 5 with k = 2.5, 2.6, 2.7, 2.8, 2.9 and 3.0 were investigated. The pair of charts performed well when the process was affected only by a mean shift, and lost performance when an increase in variation was the only disturbance present or when both anomalies acted together. It was also observed that a mean shift is the most common problem, the simultaneous occurrence of both anomalies is less frequent, and an exclusive increase in variation is rare. The design with n = 5 and k = 2.9 was therefore recommended for monitoring practical Six Sigma processes (i.e., those with a sigma level between 4.5 and 6σ); good performance of this design is expected when the process is mainly under the effect of a mean shift. It is therefore likely that the process quality level can fall without these control charts signalling the quality loss caused by an increase in variation, with or without an accompanying mean shift.
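As an illustration of the kind of design evaluation described above, the following sketch estimates, by Monte Carlo simulation, the joint false-alarm probability of a combined X̄ and R scheme for candidate (n, k) pairs. It assumes a normally distributed CTQ and builds the R-chart limits from simulated range constants; the limit formulas and constants are illustrative, not the thesis's exact procedure.

```python
"""Monte Carlo sketch: joint false-alarm probability of combined X-bar and R charts.

Assumptions (not from the thesis): the in-control CTQ is normally distributed,
both charts use symmetric k-sigma-style limits, and the R-chart limits are
built from range constants d2, d3 estimated by simulation.
"""
import numpy as np

rng = np.random.default_rng(0)

def joint_false_alarm(n, k, reps=200_000):
    # Estimate the range constants d2 = E[R/sigma], d3 = SD[R/sigma] for sample size n.
    calib = rng.standard_normal((reps, n))
    ranges = calib.max(axis=1) - calib.min(axis=1)
    d2, d3 = ranges.mean(), ranges.std()

    # Control limits for an in-control process with mean 0 and sigma 1.
    xbar_limit = k / np.sqrt(n)                      # alarm if |X-bar| > k*sigma/sqrt(n)
    r_ucl = d2 + k * d3                              # upper limit of the R chart
    r_lcl = max(0.0, d2 - k * d3)                    # lower limit, truncated at zero

    # Fresh in-control samples: signal if either chart alarms.
    samples = rng.standard_normal((reps, n))
    xbar = samples.mean(axis=1)
    r = samples.max(axis=1) - samples.min(axis=1)
    alarm = (np.abs(xbar) > xbar_limit) | (r > r_ucl) | (r < r_lcl)
    return alarm.mean()

if __name__ == "__main__":
    for n in (2, 3, 4, 5):
        for k in (2.5, 2.7, 2.9, 3.0):
            print(f"n={n}, k={k}: joint false-alarm prob ≈ {joint_false_alarm(n, k):.4f}")
```

Running the sketch over the full grid of n and k shows which candidate designs keep the joint false-alarm probability at or below 0.01, the reference level used in the study.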
12
Detection and Tracking of Human Targets using Ultra-Wideband Radar
Östman, Andreas, January 2016 (has links)
The purpose of this thesis was to assess the plausibility of using two Ultra-Wideband radars for detecting and tracking human targets. The detection has been performed by two different types of methods: constant false-alarm rate methods and a type of CLEAN algorithm. For tracking the targets, multiple hypothesis tracking has been studied. Particle filtering has been used for the state prediction, to account for the significant uncertainty in the motion model used in this thesis project. The detection and tracking methods have been implemented in MATLAB. Tracking in the cases of a single target and multiple targets has been investigated in simulation and experiment. The results in these cases were compared with accurate ground truth data obtained using a VICON optical tracking system. The detection methods showed poor performance when using data that had been collected by the two radars and post-processed to enhance target features. For single targets, the detections were accurate enough to continuously track a target moving randomly in a controlled area. In the multiple-target cases, the tracker was not able to distinguish between the moving subjects.
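The thesis does not state which constant false-alarm-rate variant was used; as a rough illustration of the idea, the sketch below implements a basic cell-averaging CFAR on a one-dimensional range profile, assuming square-law-detected (exponentially distributed) noise so that the scale factor alpha = N(Pfa^(-1/N) - 1) yields the desired false-alarm probability. All parameter values are invented for the demo.

```python
"""Illustrative cell-averaging CFAR detector for a 1-D range profile."""
import numpy as np

def ca_cfar(power, num_train=16, num_guard=2, pfa=1e-3):
    """Return indices of cells whose power exceeds the local CFAR threshold."""
    n = len(power)
    half = num_train // 2
    alpha = num_train * (pfa ** (-1.0 / num_train) - 1.0)   # valid for exponential noise
    detections = []
    for i in range(half + num_guard, n - half - num_guard):
        lead = power[i - num_guard - half : i - num_guard]   # training cells before the CUT
        lag = power[i + num_guard + 1 : i + num_guard + 1 + half]  # training cells after
        noise_est = (lead.sum() + lag.sum()) / num_train
        if power[i] > alpha * noise_est:
            detections.append(i)
    return detections

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    profile = rng.exponential(scale=1.0, size=500)   # clutter/noise only
    profile[120] += 25.0                             # injected "human" echo
    profile[300] += 15.0
    print("detections at range cells:", ca_cfar(profile))
```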
13
On Development and Performance Evaluation of Some Biosurveillance Methods
Zheng, Hongzhang, 09 August 2011 (has links)
This study examines three applications of control charts used for monitoring syndromic data with different characteristics. The first part develops a seasonal autoregressive integrated moving average (SARIMA) based surveillance chart, and compares it with the CDC Early Aberration Reporting System (EARS) W2c method using both authentic and simulated data. After successfully removing the long-term trend and the seasonality involved in syndromic data, the performance of the SARIMA approach is shown to be better than the performance of the EARS method in terms of two key surveillance characteristics, the false alarm rate and the average time to detect the outbreaks.
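A minimal sketch of the SARIMA-based surveillance idea: fit a seasonal ARIMA model to a baseline window of daily counts, then flag days whose one-step-ahead forecast error exceeds a control limit. The model order (1, 0, 1) x (0, 1, 1, 7), the 3-sigma limit, and the simulated data are illustrative choices, not the dissertation's fitted model or alarm rule.

```python
"""Sketch of a SARIMA-based surveillance chart for daily syndromic counts."""
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)

# Simulated counts: weekly seasonality plus a small injected outbreak.
days = 250
base = 50 + 10 * np.sin(2 * np.pi * np.arange(days) / 7)
counts = rng.poisson(base)
counts[200:205] += 40          # injected outbreak

# Fit on a baseline window, then monitor the remaining days one step ahead.
train = 180
model = SARIMAX(counts[:train], order=(1, 0, 1), seasonal_order=(0, 1, 1, 7))
fit = model.fit(disp=False)

sigma = np.std(fit.resid[7:])  # skip the initial seasonal-differencing transient
for t in range(train, days):
    forecast = fit.get_forecast(steps=1).predicted_mean[0]
    if counts[t] - forecast > 3 * sigma:
        print(f"day {t}: count {counts[t]} exceeds forecast {forecast:.1f} + 3 sigma")
    fit = fit.append(counts[t : t + 1])   # extend the fitted model with the new observation
```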
In the second part, we propose a generalized likelihood ratio (GLR) control chart to detect a wide range of shifts in the mean of Poisson-distributed biosurveillance data. The application of a sign function to the original GLR chart statistics leads to downward-sided, upward-sided, and two-sided GLR chart statistics in a unified framework. To facilitate the use of such charts in practice, we provide detailed guidance on developing and implementing the GLR chart. Under the steady-state framework, this study indicates that the overall GLR chart performance in detecting a range of shifts of interest is superior to the performance of traditional control charts, including the EARS method, Shewhart charts, EWMA charts, and CUSUM charts.
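As a hedged illustration of a GLR chart for Poisson counts, the sketch below computes an upward one-sided GLR statistic over a window of candidate change points with a known in-control mean. The dissertation's sign-function construction of the downward, upward, and two-sided statistics is analogous but not reproduced exactly, and the control limit h used here is arbitrary.

```python
"""Sketch of an upward one-sided GLR chart statistic for Poisson counts."""
import numpy as np

def glr_statistic_up(x, lambda0, window=50):
    """Max over candidate change points of the Poisson GLR, restricted to upward shifts."""
    t = len(x)
    best = 0.0
    for m in range(1, min(window, t) + 1):      # m = number of post-change observations
        seg = x[t - m:]
        lam1 = max(seg.mean(), lambda0)         # MLE constrained to lam1 >= lambda0
        llr = seg.sum() * np.log(lam1 / lambda0) - m * (lam1 - lambda0)
        best = max(best, llr)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    lambda0, h = 5.0, 6.0                        # in-control mean and an illustrative limit
    counts = list(rng.poisson(lambda0, size=80))             # in-control phase
    counts += list(rng.poisson(lambda0 + 2.0, size=20))      # shifted phase
    for t in range(1, len(counts) + 1):
        g = glr_statistic_up(np.array(counts[:t]), lambda0)
        if g > h:
            print(f"signal at observation {t}, GLR = {g:.2f}")
            break
```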
There is often an excessive number of zeros involved in health care-related data. Zero-inflated Poisson (ZIP) models are more appropriate than Poisson models for describing such data. The last part of the dissertation considers the GLR chart for ZIP data under a research framework similar to the second part. Because small sample sizes may influence the estimation of ZIP parameters, the efficiency of the MLEs is investigated in depth, followed by suggestions for improvement. Numerical approaches to solving for the MLEs are discussed as well. Statistics for a set of GLR charts are derived, followed by modifications changing them from two-sided statistics to one-sided statistics. Although this is not a complete study of GLR charts for ZIP processes, owing to limited time and resources, suggestions for future work are proposed at the end of this dissertation. / Ph. D.
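For the zero-inflated Poisson part, the following sketch obtains the ZIP maximum-likelihood estimates by direct numerical maximization of the log-likelihood with scipy; the dissertation's treatment of small-sample MLE behaviour and its numerical approach may differ from this simple illustration.

```python
"""Sketch: maximum-likelihood estimation of zero-inflated Poisson (ZIP) parameters."""
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def zip_negloglik(params, x):
    p, lam = params                        # p = zero-inflation probability, lam = Poisson mean
    zero = (x == 0)
    ll_zero = np.log(p + (1 - p) * np.exp(-lam))                  # log P(X = 0)
    ll_pos = np.log(1 - p) - lam + x * np.log(lam) - gammaln(x + 1)  # log P(X = x), x >= 1
    return -(zero.sum() * ll_zero + ll_pos[~zero].sum())

def zip_mle(x):
    res = minimize(zip_negloglik, x0=np.array([0.3, max(x.mean(), 0.1)]),
                   args=(x,), bounds=[(1e-6, 1 - 1e-6), (1e-6, None)],
                   method="L-BFGS-B")
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    n, p_true, lam_true = 200, 0.4, 3.0
    counts = np.where(rng.random(n) < p_true, 0, rng.poisson(lam_true, n))
    p_hat, lam_hat = zip_mle(counts)
    print(f"p_hat = {p_hat:.3f}, lambda_hat = {lam_hat:.3f}")
```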
14
Dynamic Probability Control Limits for Risk-Adjusted Bernoulli Cumulative Sum Charts
Zhang, Xiang, 12 December 2015 (has links)
The risk-adjusted Bernoulli cumulative sum (CUSUM) chart developed by Steiner et al. (2000) is an increasingly popular tool for monitoring clinical and surgical performance. In practice, however, use of a fixed control limit for the chart leads to quite variable in-control average run length (ARL) performance for patient populations with different risk score distributions. To overcome this problem, simulation-based dynamic probability control limits (DPCLs) are determined patient by patient for the risk-adjusted Bernoulli CUSUM chart in this study. By maintaining the probability of a false alarm at a constant level conditional on no false alarm for previous observations, the risk-adjusted CUSUM charts with DPCLs have consistent in-control performance at the desired level, with approximately geometrically distributed run lengths. Simulation results demonstrate that the proposed method does not rely on any information or assumptions about the patients' risk distributions. The use of DPCLs for risk-adjusted Bernoulli CUSUM charts allows each chart to be designed for the particular sequence of patients for a surgeon or hospital. The effect of estimation error on the performance of the risk-adjusted Bernoulli CUSUM chart with DPCLs is also examined. Our simulation results show that the in-control performance of the risk-adjusted Bernoulli CUSUM chart with DPCLs is affected by estimation error. The most influential factors are the specified desired in-control average run length, the Phase I sample size, and the overall adverse event rate. However, the effect of estimation error is uniformly smaller for the risk-adjusted Bernoulli CUSUM chart with DPCLs than for the corresponding chart with a constant control limit under various realistic scenarios. In addition, there is a substantial reduction in the standard deviation of the in-control run length when DPCLs are used. Therefore, use of DPCLs has yet another advantage when designing a risk-adjusted Bernoulli CUSUM chart. This research is the result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech). Moreover, DPCLs are adapted to design the risk-adjusted CUSUM charts for multiresponses developed by Tang et al. (2015). It is shown that the in-control performance of the charts with DPCLs can be controlled for different patient populations because these limits are determined for each specific sequence of patients. Thus, the risk-adjusted CUSUM chart for multiresponses with DPCLs is more practical and should be applied to effectively monitor surgical performance by hospitals and healthcare practitioners. This part is the result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech) and Mr. Justin Loda (Department of Statistics, Virginia Tech). / Ph. D.
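The sketch below outlines the simulation-based DPCL idea for a risk-adjusted Bernoulli CUSUM with Steiner-type likelihood-ratio weights: at each patient, the control limit is set to the conditional (1 − α) quantile of simulated in-control chart paths that have not yet alarmed. The risk scores, the odds-ratio shift R_A = 2, and α are illustrative assumptions, and the published algorithm is followed only in outline.

```python
"""Sketch of simulation-based dynamic probability control limits (DPCLs)
for a risk-adjusted Bernoulli CUSUM."""
import numpy as np

rng = np.random.default_rng(5)

def cusum_weight(y, p, ra=2.0):
    """Log-likelihood-ratio weight for outcome y given risk-adjusted probability p."""
    denom = 1.0 - p + ra * p
    return y * np.log(ra / denom) + (1 - y) * np.log(1.0 / denom)

def dynamic_limits(p_seq, alpha=0.001, n_paths=20_000, ra=2.0):
    """One control limit per patient so that P(false alarm | no alarm so far) = alpha."""
    paths = np.zeros(n_paths)            # simulated in-control CUSUM paths
    alive = np.ones(n_paths, bool)       # paths that have not yet alarmed
    limits = []
    for p in p_seq:
        y = rng.random(n_paths) < p      # in-control outcomes with patient-specific risk p
        paths = np.maximum(0.0, paths + cusum_weight(y.astype(float), p, ra))
        h = np.quantile(paths[alive], 1.0 - alpha)   # conditional (1 - alpha) quantile
        limits.append(h)
        alive &= paths <= h              # drop paths that would have alarmed
    return np.array(limits)

if __name__ == "__main__":
    patient_risks = rng.uniform(0.05, 0.5, size=100)   # hypothetical risk scores
    h_t = dynamic_limits(patient_risks)
    print("first ten dynamic limits:", np.round(h_t[:10], 3))
```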
15
UHF-SAR and LIDAR Complementary Sensor Fusion for Unexploded Buried Munitions Detection
Depoy, Randy S., Jr., January 2012 (has links)
No description available.
16
Employing Multiple Kernel Support Vector Machines for Counterfeit Banknote Recognition
Su, Wen-pin, 29 July 2008 (has links)
Finding an efficient method to detect counterfeit banknotes is imperative. In this study, we propose a multiple kernel weighted support vector machine for counterfeit banknote recognition. A variation of the SVM that optimizes the false alarm rate, called FARSVM, is proposed, which minimizes the false negative rate and the false positive rate. Each banknote is divided into m × n partitions, and each partition comes with its own kernels. The optimal weight for each kernel matrix in the combination is obtained through the semidefinite programming (SDP) learning method. The amount of time and space required by the original SDP is very demanding. We focus on this framework and adopt two strategies to reduce the time and space requirements. The first strategy is to assume the non-negativity of the kernel weights, and the second strategy is to set the sum of the weights equal to 1. Experimental results show that regions with zero kernel weights are easy to imitate with today's digital imaging technology, and regions with nonzero kernel weights are difficult to imitate. In addition, these results show that the proposed approach outperforms the single kernel SVM and the standard SVM with SDP on Taiwanese banknotes.
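A rough sketch of the kernel-combination step: each partition contributes its own RBF kernel, and the combined kernel is a weighted sum with non-negative weights summing to one, used as a precomputed kernel in an SVM. The SDP weight learning itself is not reproduced here; the weights are fixed and uniform, and the data are random stand-ins for banknote features.

```python
"""Sketch: combining per-partition kernels into one SVM kernel."""
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(6)

m, n_parts, feat = 3, 4, 32            # 3 x 4 partitions, 32 features per partition
n_samples = 200
# X[i, j] holds the feature vector of partition j for banknote i.
X = rng.normal(size=(n_samples, m * n_parts, feat))
y = rng.integers(0, 2, size=n_samples) # 1 = genuine, 0 = counterfeit (synthetic labels)

weights = np.full(m * n_parts, 1.0 / (m * n_parts))   # non-negative, sum to 1

def combined_kernel(A, B, w):
    K = np.zeros((A.shape[0], B.shape[0]))
    for j, wj in enumerate(w):
        K += wj * rbf_kernel(A[:, j, :], B[:, j, :], gamma=0.1)
    return K

train, test = slice(0, 150), slice(150, None)
K_train = combined_kernel(X[train], X[train], weights)
K_test = combined_kernel(X[test], X[train], weights)

clf = SVC(kernel="precomputed").fit(K_train, y[train])
print("held-out accuracy on synthetic data:", clf.score(K_test, y[test]))
```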
17
CONSTANT FALSE ALARM RATE PERFORMANCE OF SOUND SOURCE DETECTION WITH TIME DELAY OF ARRIVAL ALGORITHM
Wang, Xipeng, 01 January 2017 (has links)
Time Delay of Arrival (TDOA) based algorithms and Steered Response Power (SRP) based algorithms are the two most commonly used methods for sound source detection and localization. SRP is more robust under high reverberation and multi-target conditions, while TDOA is less computationally intensive. This thesis introduces a modified TDOA algorithm, TDOA delay table search (TDOA-DTS), which has more stable performance than the original TDOA and requires only 4% of the SRP computation load for a 3-dimensional space of a typical room. A two-step adaptive thresholding procedure is applied, based on a Weibull distribution for the noise peaks of the cross-correlations and a binomial distribution for combining potential peaks over all microphone pairs into the final detection. The first threshold limits the potential target peaks in the microphone-pair cross-correlations to a user-defined false-alarm (FA) rate. This initial false-positive peak rate can be set higher than the desired final FA target rate, so that high accuracy is not required of the probability distribution model (model errors matter most when the threshold is set deep in the tail of the curve). The final FA rate is then lowered to the desired value using an M-out-of-N (MON) rule on significant correlation peaks from different microphone pairs associated with a point in the space of interest. The algorithm is tested with simulated and real recorded data to verify that the resulting FA rates are consistent with the user-defined rates down to 10⁻⁶.
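The two-step threshold design can be sketched as follows: a per-pair threshold is read off an assumed Weibull model of the noise correlation peaks at a deliberately loose per-pair false-alarm rate, and the number M of agreeing pairs is then chosen so that the binomial M-out-of-N tail probability meets the final target rate. The Weibull shape and scale and the pair count are placeholders, not fitted values from the thesis.

```python
"""Sketch of the two-step threshold design: Weibull per-pair threshold, then M-of-N rule."""
from scipy.stats import weibull_min, binom

# Step 1: per-pair threshold for a chosen (deliberately loose) per-pair FA rate.
shape, scale = 1.8, 0.12        # assumed Weibull fit to noise correlation peaks
p_pair = 1e-2                   # per-pair false-alarm probability
threshold = weibull_min.isf(p_pair, shape, scale=scale)
print(f"per-pair correlation threshold: {threshold:.3f}")

# Step 2: choose M so that an M-of-N vote meets the final target FA rate.
n_pairs = 15                    # e.g. 6 microphones -> 15 pairs
target_fa = 1e-6
for m in range(1, n_pairs + 1):
    # Probability that at least m of the n pairs exceed the threshold by chance.
    fa = binom.sf(m - 1, n_pairs, p_pair)
    if fa <= target_fa:
        print(f"require {m} of {n_pairs} pairs; resulting FA rate ≈ {fa:.2e}")
        break
```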
18
ISSUES IMPACTING CONTINUOUS PULSE OXIMETRY MONITORING AND WIRELESS CLINICIAN NOTIFICATION SYSTEM AFTER SURGERY / EVALUATION OF ISSUES IMPACTING WIRELESS CLINICIAN NOTIFICATION SYSTEM IN A RANDOMIZED CONTROL TRIAL INVOLVING POSTOPERATIVE VITAL SIGNS MONITORING AND CONTINUOUS PULSE OXIMETRY
Harsha, Prathiba, January 2019 (has links)
Background: The VItal siGns monItoring with continuous puLse oximetry And wireless cliNiCian notification aftEr surgery (VIGILANCE) study was a randomized controlled trial designed to assess the impact of continuous vital sign monitoring with alerts to nursing staff on the incidence of postoperative complications in surgical ward patients. Multiple factors interfered with the eHealth intervention implementation and the conduct of the VIGILANCE study. Through examination of these challenges, the overall aim of this thesis was to help foster an understanding of the difficulties related to eHealth intervention implementation. The specific objectives were to identify issues related to the implementation of the intervention system of the VIGILANCE study, and to evaluate the influence of these issues on intervention adoption.
Methods: During the VIGILANCE study, issues affecting the implementation of the intervention were documented on case report forms, alarm event forms, and a nursing feedback questionnaire. In this thesis, the issues were identified and evaluated using the Clinical Adoption Framework.
Results: The key issues identified include nursing workflow changes, patient withdrawal, wireless network connectivity, false alarms, monitor malfunction, probe issues, and wireless network standards. These issues affected the service, system and information quality. As a result, these issues impacted ‘access’ through decreased ability of nurses to make complete use of the monitors; ‘care quality’ of the trial intervention through decreased effectiveness; and ‘productivity’ through interference in the coordination of care, and thus decreased clinical adoption of the monitoring system.
Conclusion: Patient monitoring with eHealth technology in surgical wards has the potential to improve patient outcomes. However, proper planning that includes engagement of front-line nurses, installation of appropriate wireless network infrastructure, and use of comfortable cableless devices is required to maximize the potential of continuous monitoring. / Thesis / Master of Science (MSc) / The VIGILANCE study was a randomized controlled trial assessing the impact of continuous vital signs monitoring with alerts to nurses on the incidence of postoperative complications in surgical ward patients. This thesis identified and evaluated issues with the implementation of wireless monitoring systems in the hospital. During the VIGILANCE study, issues affecting the intervention implementation were documented on case report forms, alarm event forms, and nursing questionnaires. Data related to these issues were explored using the Clinical Adoption Framework. Identified issues included nursing workflow changes, patient withdrawal, wireless network connectivity, false alarms, monitor malfunction, probe issues, and wireless network standards. The issues affected ‘access’ through decreased ability of nurses to make complete use of the monitors; ‘care quality’ of the intervention through decreased effectiveness; and ‘productivity’ by interfering with care coordination. Future studies should aim to include front-line nurses, an appropriate wireless network, and comfortable cableless devices in their planning.
19
Analytic Assessment of Collision Avoidance Systems and Driver Dynamic Performance in Rear-End Crashes and Near-Crashes
McLaughlin, Shane Brendan, 10 December 2007 (has links)
Collision avoidance systems (CASs) are being developed and fielded to reduce the number and severity of rear-end crashes. Kinematic algorithms within CASs evaluate sensor input and apply assumptions describing human-response timing and deceleration to determine when an alert should be presented. This dissertation presents an analytic assessment of the dynamic function and performance of CASs, and of the associated driver performance, in preventing automotive rear-end crashes. A method for using naturalistic data in the evaluation of CAS algorithms is described and applied to three algorithms. Time-series parametric data collected during 13 rear-end crashes and 70 near-crashes are input into models of collision avoidance algorithms to determine when the alerts would have occurred. Algorithm performance is measured by estimating how much of the driving population would be able to respond in the time available between when an alert would occur and when braking was needed. A sensitivity analysis was performed to consider the effect of alternative inputs into the assessment method. The algorithms were found to warn in sufficient time to permit 50–70% of the population to avoid collision in similar scenarios. However, the accuracy of this estimate was limited because the tested algorithms were found to alert too frequently to be feasible. The response of the assessment method was most sensitive to differences in assumed response-time distributions and assumed driver braking levels. Low-speed crashes were not addressed by two of the algorithms. Analysis of the events revealed that the necessary avoidance deceleration based on kinematics was generally less than 2 s in duration. At the time of driver response, the time remaining to avoid collision using a 0.5g average deceleration ranged from −1.1 s to 2.1 s. In 10 of 13 crashes, no driver response deceleration was present. Mean deceleration for the 70 near-crashes was 0.37g and the maximum was 0.72g. A set of the events was developed to measure driver response time. The mean driver response time was 0.7 s to begin braking and 1.1 s to reach maximum deceleration. Implications for collision countermeasures are considered, response-time results are compared to previous distributions, and future work is discussed. / Ph. D.
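A hedged sketch of the kinematic reasoning behind the assessment method: for a simple stationary-lead-vehicle conflict, compute how much time remains before braking at an assumed 0.5 g must begin, and estimate the fraction of drivers able to respond in that time under an assumed lognormal response-time distribution. The geometry and distribution parameters are illustrative, not the dissertation's fitted values.

```python
"""Sketch: time available to respond after an alert, and the population fraction able to respond."""
from scipy.stats import lognorm

G = 9.81  # m/s^2

def time_to_brake(range_m, closing_speed, decel_g=0.5):
    """Seconds from 'now' until braking at decel_g must begin to just avoid impact
    with a stopped lead vehicle (simple constant-deceleration kinematics)."""
    stopping_dist = closing_speed ** 2 / (2 * decel_g * G)
    return (range_m - stopping_dist) / closing_speed

# Assumed driver brake response times: lognormal with median 1.1 s (log-scale sigma 0.3).
rt = lognorm(s=0.3, scale=1.1)

if __name__ == "__main__":
    available = time_to_brake(range_m=70.0, closing_speed=20.0)   # 20 m/s closing, 70 m gap
    frac = rt.cdf(available)
    print(f"time available to respond: {available:.2f} s")
    print(f"estimated fraction of drivers able to respond in time: {frac:.0%}")
```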
20
Radar Target Detection In Non-gaussian Clutter
Doyuran, Ulku, 01 September 2007 (links) (PDF)
In this study, novel methods for high-resolution radar target detection in non-Gaussian clutter environments are proposed. Two approaches are used to address the problem: non-coherent detection, which operates on the envelope-detected signal for thresholding, and coherent detection, which performs clutter suppression, Doppler processing, and thresholding at the same time. The proposed non-coherent detectors, which are designed to operate in non-Gaussian and range-heterogeneous clutter, yield higher performance than the conventional methods that were designed either for Gaussian clutter or for heterogeneous clutter. The proposed coherent detector exploits the information in all the range cells and pulses and performs the clutter reduction and thresholding simultaneously. The design is performed for uncorrelated, partially correlated, and fully correlated clutter among range cells. The performance analysis indicates the superiority of the designed methods over the classical ones in fully correlated and partially correlated situations. In addition, by designing detectors for multiple targets and making corrections to the conventional methods, the target-masking problem of the classical detectors is alleviated.
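To illustrate why detectors designed for Gaussian clutter degrade in non-Gaussian environments, the sketch below sets a threshold that achieves a desired false-alarm probability in Rayleigh-envelope (Gaussian) clutter and measures the false-alarm rate the same threshold actually delivers in spikier Weibull clutter. The shape parameter and design Pfa are illustrative; the thesis's clutter models and detectors are considerably more elaborate.

```python
"""Sketch: Gaussian-designed thresholds over-alarm in spiky non-Gaussian clutter."""
import numpy as np

rng = np.random.default_rng(7)
pfa = 1e-3
n = 2_000_000

# Envelope samples: Rayleigh (Gaussian I/Q) vs a spikier Weibull clutter.
rayleigh = rng.rayleigh(scale=1.0, size=n)
weibull = rng.weibull(a=0.9, size=n)          # shape < 2 -> heavier tail than Rayleigh
weibull *= rayleigh.mean() / weibull.mean()   # match mean levels for a fair comparison

# Threshold that achieves the desired Pfa in Rayleigh clutter...
t_rayleigh = np.quantile(rayleigh, 1 - pfa)
# ...and the false-alarm rate it actually delivers in the Weibull clutter.
actual_fa = np.mean(weibull > t_rayleigh)
print(f"design Pfa = {pfa:.0e}, achieved Pfa in Weibull clutter = {actual_fa:.1e}")

# Threshold re-designed for the Weibull clutter itself.
t_weibull = np.quantile(weibull, 1 - pfa)
print(f"Rayleigh threshold {t_rayleigh:.2f} vs Weibull threshold {t_weibull:.2f}")
```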