About: The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Guidelines for a remedial reading programme for standard one and two pupils

Nel, Norma 01 1900 (has links)
A synopsis of the importance and the nature of reading serves as the point of departure for this study. The pupils involved are learning restrained as well as Group A learning disabled pupils, although learning disabled pupils in Groups B and C can also be involved. A comprehensive reading problem analysis table, compiled for the analysis of individual reading problems, facilitates identification of the remedial reading areas, as well as of the underlying subskills causing the problems, to be accommodated in remedial reading. A control chart, developed for recording information about the pupil's reading problem areas and underlying subskills, facilitates compilation of an integrated remedial reading programme. Existing exercises, selected from the works of various authors and adapted, provide guidelines and exercises for particular remedial reading areas. These guidelines serve as a point of departure for compiling a specific remedial reading programme for a particular pupil with reading problems. Two case studies elucidate how a remedial reading programme can be compiled according to the pupil's background, reading problems and inadequacies in the underlying subskills. The total reading process is illustrated by means of a reading model. The two main components, namely word identification and comprehension, form the basis of this study. The different subcategories featuring in each component are highlighted. This model serves as a framework for the diagnosis and remediation of reading problems. A teaching model is used to illustrate the complexity of teaching. The factors within the teaching model are indicated, as well as the ways they may serve when reading is taught. The reduction and choice of reading content for a specific pupil are set out as important aspects to be taken into consideration in reading remediation.
Determining each pupil's reading levels, namely, his/her independent level, instructional level and frustrational level, enables the teacher to choose the appropriate reading material. / Teacher Education / D. Ed. (Orthopedagogics)
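The three reading levels named here are often operationalized with word-accuracy and comprehension cut-offs. A minimal sketch, assuming the commonly cited informal-reading-inventory thresholds (the thesis's own criteria may differ):

```python
def classify_reading_level(word_accuracy, comprehension):
    """Classify a pupil's reading level for a given text.

    Thresholds follow commonly cited informal-reading-inventory criteria
    (an assumption; the study itself may use different cut-offs).
    """
    if word_accuracy >= 0.95 and comprehension >= 0.90:
        return "independent"    # pupil can read the text unaided
    if word_accuracy >= 0.90 and comprehension >= 0.75:
        return "instructional"  # suitable for teacher-guided remediation
    return "frustrational"      # text too difficult; choose easier material

print(classify_reading_level(0.97, 0.95))  # independent
print(classify_reading_level(0.92, 0.80))  # instructional
```

A teacher would apply this per text to select material at the pupil's instructional level.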
82

利用調適性管制技術同時監控製程平均數和變異數 / Joint Monitoring of Process Means and Variances by Using Adaptive Control Schemes

陳琬昀 Unknown Date (has links)
Recent studies have shown that variable parameters (VP) control charts detect small process shifts faster than traditional Shewhart charts, and many papers have discussed adaptive control schemes for monitoring the process mean and variance simultaneously. In this study, to improve on the detection performance of existing control charts, U-V control charts and Max-M control charts are proposed to monitor the process mean and variance for a single process and for two dependent process steps, respectively. The performance of the proposed charts is measured by the adjusted average time to signal (AATS) and the average number of observations to signal (ANOS), both derived using a Markov chain approach. The application of the proposed charts is illustrated with a numerical example for two dependent process steps, and the performance of the VP U-V, VP Max-M and FP Z(X-bar)-Z(Sx^2) control charts is compared. The numerical analyses show that the VP Max-M control charts perform better than the other two; furthermore, a single chart is easier for engineers to use than two charts. Hence, the Max-M control charts are recommended for real industrial processes.
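The idea of folding the mean and variance into one plotting statistic can be illustrated with a generic max-type statistic: standardize the sample mean, transform the sample variance to a standard normal score, and plot the larger absolute value. This is a sketch of the general technique, not the thesis's U-V or Max-M statistic, and the in-control parameters below are assumptions.

```python
import numpy as np
from scipy import stats

def max_statistic(sample, mu0, sigma0):
    """Combine standardized mean and variance into one plotting statistic."""
    n = len(sample)
    z1 = (np.mean(sample) - mu0) / (sigma0 / np.sqrt(n))        # mean shift
    u = stats.chi2.cdf((n - 1) * np.var(sample, ddof=1) / sigma0**2, df=n - 1)
    z2 = stats.norm.ppf(u)                                      # variance shift
    return max(abs(z1), abs(z2))

rng = np.random.default_rng(1)
in_control = max_statistic(rng.normal(0, 1, 5), mu0=0.0, sigma0=1.0)
shifted = max_statistic(rng.normal(3, 1, 5), mu0=0.0, sigma0=1.0)
# A large value (beyond a chosen control limit, e.g. ~3) signals a shift
# in the mean and/or the variance.
```

Because only one statistic is charted, an engineer monitors a single chart, which is the practical advantage the abstract attributes to the Max-M design.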
83

Systematisk bearbetning, kommunikation och visualisering av mätutfall vid koordinatmätning / Systematic processing, communication and visualization of outcome from coordinate measurement

Wettrén, Freddie January 2019 (has links)
The purpose of this thesis is to develop a method for systematically visualizing coordinate measurements at Saab Aerostructures. The thesis begins by mapping the coordinate measuring process. It then proposes defining key measurement points through quality standards, as this will simplify visualization of the measurements with a control chart in a Visual Management system. To stay at the forefront of the market and retain current customers, Saab wants to increase the quality of its products by reducing the number of measurement deviations. The measurements are currently scarcely visualized to employees, because of the complicated method used today to show which measurements are deviating. The visualization process should be well defined and simple to use, as this will lead to fewer deviations and contribute to continuous improvement of the measuring process. The methods used in this thesis include semi-structured interviews, observations and studies of internal documents, which were triangulated to map the process in detail. However, no employee knew either the coordinate measuring process or the visualization process by heart, so the author carried out a Process Walk. This method helped map both processes, and through it problems were identified that would otherwise hinder the visualization of measurements. A literature study was also conducted on quality management systems, as these set out criteria for Saab's products, and on Visual Management systems with a focus on information boards, which fit Saab's existing use of information boards. From the mapping of today's process, problems were identified that would hinder Saab from systematically visualizing its coordinate measuring process, and improvement suggestions were proposed.
Firstly, it was concluded that too much data is hindering the transparency of the process. Based on quality standards, it is proposed to clearly define key measurement points and document them, by means of a SIPOC and an FMEA. Furthermore, it is proposed to analyze which measurement data should be saved, as a major part of the data consists of irrelevant information. As visualizing the coordinate measuring process was timed at 90 hours, it is also proposed to create templates that systematically export and sort data in Excel format via Saab's ERP system, instead of converting from XML as is done today. These templates can then be used directly in Excel to visualize the outcome of the coordinate measurements in a Visual Management system. The system uses bar charts to highlight which processes produce the most deviations, and the process outcome is visualized in a control chart that shows how process variation affects the measurement outcome. The last improvement suggestion is that Saab should use the control chart to stabilize its processes, leading to capable processes whose capability can be demonstrated to customers.
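The control chart proposed for the measurement outcome can be sketched as an individuals chart with 3-sigma limits estimated from the average moving range; the deviation values and the d2 constant for subgroups of two are standard, but the data below are invented for illustration and are not Saab's.

```python
import numpy as np

def individuals_chart_limits(measurements):
    """Return (center, lower, upper) limits for an individuals control chart."""
    x = np.asarray(measurements, dtype=float)
    center = x.mean()
    mr_bar = np.mean(np.abs(np.diff(x)))  # average moving range
    sigma_hat = mr_bar / 1.128            # d2 constant for subgroups of size 2
    return center, center - 3 * sigma_hat, center + 3 * sigma_hat

deviations_mm = [0.02, 0.01, 0.03, 0.02, 0.01, 0.02, 0.03, 0.02]
center, lcl, ucl = individuals_chart_limits(deviations_mm)
out_of_control = [v for v in deviations_mm if not lcl <= v <= ucl]
```

Points outside the limits would be the ones highlighted on the Visual Management board as signals of an unstable measuring process.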
84

Avaliação da efetividade de cartas de controle multivariadas na detecção de suspeitas de fraude financeira / Evaluation of the effectiveness of multivariate control charts in detecting suspected financial fraud

Souza, Davenilcio Luiz de 13 March 2017 (has links)
Money laundering crimes cause large losses to countries and their financial systems, and the volume of digital transaction data makes this type of crime difficult to detect. Audits of financial data are limited in identifying fraud: they are still largely performed on sampled data and are unable to identify crime situations in real time. To help address this gap, this work proposes a statistical monitoring method based on multivariate control charts and Benford's Law for detecting suspected fraud in financial records, including fraud due to money laundering. A conceptual model was first defined under the assumption that financial records adhere to the Benford's Law distribution; an empirical distribution, estimated from the data themselves, was then also considered. Two procedures were tested for flagging suspected money-laundering fraud through assessment of the first significant digits: the multivariate chi-square control chart and the multivariate Hotelling's T2 control chart. Data were simulated with the R-Project software until the occurrence of the 50,000th signal, and both simulated and real cases were evaluated to illustrate the operation of the method. 
From the simulation, the two control charts were assessed by their ARL, the average number of observations until the chart signals that the series has moved to an out-of-control state, which indicates suspected fraudulent entries. Applying the retrospective analysis to the financial records of the 2016 mayoral campaigns in five Brazilian capitals, based on the first-digit proportions expected under Benford's Law, no suspected fraud was evidenced in the data obtained from the Tribunal Superior Eleitoral (TSE) website. In a data set from a financial institution, signals of divergence between the observed and expected first-digit frequencies were found; the points beyond the control limits occurred in a nearby period in all three analyses performed, concentrating the period on which a financial audit should focus. The academic contribution is the development of an application model combining multivariate control charts and Benford's Law, an innovative statistical-process-control approach for the financial area, using accessible, easy-to-process, reliable and accurate computational resources that are open to refinement by further academic work. The contribution to society lies in the use of the model by entities that handle financial transactions, and by the community on data of civil and state organizations published in information channels, enabling citizens to analyze and verify the soundness of facts and data.
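The first-digit screen at the heart of this method can be sketched directly: Benford's Law gives the expected proportion of each leading digit d as log10(1 + 1/d). The sketch below uses a plain chi-square distance rather than the thesis's multivariate chi-square or Hotelling T2 charts, and the amounts are fabricated for illustration.

```python
import math
from collections import Counter

# Expected first-digit proportions under Benford's Law.
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(value):
    """Leading significant digit of a positive amount."""
    s = str(abs(value)).lstrip("0.")
    return int(s[0])

def benford_chi_square(amounts):
    """Chi-square distance between observed and Benford first-digit counts."""
    n = len(amounts)
    observed = Counter(first_digit(a) for a in amounts)
    return sum((observed.get(d, 0) - n * p) ** 2 / (n * p)
               for d, p in BENFORD.items())

amounts = [123.45, 187.20, 1042.00, 310.99, 57.80, 219.00, 164.30, 980.10]
stat = benford_chi_square(amounts)
# A large statistic (e.g. above the 5% chi-square critical value with
# 8 degrees of freedom, about 15.5) would flag the batch for audit.
```

The multivariate charts in the thesis monitor the same digit proportions, but as a vector statistic plotted over time rather than a single retrospective test.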
85

Gráficos de controle fuzzy para o monitoramento da média e amplitude de processos univariados / Fuzzy control charts for monitoring the mean and range of univariate processes

Mendes, Amanda dos Santos January 2019 (has links)
Advisor: Marcela Aparecida Guerreiro Machado Freitas / Abstract: Quality control, mainly through the use of control charts, is essential in industry to ensure a process free from special causes of variability. As sample data may include uncertainties arising from human subjectivity and measurement systems, fuzzy set theory can be applied to control charts when precise data are not available. For this purpose, the quality characteristic values are fuzzified by inserting uncertainties and transformed into representative values for better comparison with the traditional control chart. 
This work proposes applying fuzzy logic to control charts that monitor the mean and range of univariate processes, as well as two fuzzy control charts based on special run rules: Synthetic and Side Sensitive Synthetic. The performance of the control charts is measured by the average run length (ARL). The work verifies that the fuzzy control charts are more efficient than the traditional control charts for lower values of α-cut, that is, greater uncertainty inserted in the process, and for scenarios with a larger difference between the uncertainty bounds of the fuzzy numbers. / Master's
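The fuzzification step described here can be sketched with a triangular fuzzy number cut at level α and reduced to a representative value; the midrange of the α-cut used below is one common choice of representative value, not necessarily the transformation used in this work.

```python
def alpha_cut_midrange(a, b, c, alpha):
    """Representative value of triangular fuzzy number (a, b, c) at level alpha.

    a, b, c are the lower bound, mode and upper bound; alpha is in [0, 1].
    """
    lower = a + alpha * (b - a)  # left end of the alpha-cut interval
    upper = c - alpha * (c - b)  # right end of the alpha-cut interval
    return (lower + upper) / 2

# alpha = 1 recovers the crisp mode; smaller alpha admits more uncertainty.
print(alpha_cut_midrange(9.0, 10.0, 12.0, 1.0))  # 10.0
print(alpha_cut_midrange(9.0, 10.0, 12.0, 0.0))  # 10.5
```

These representative values are what get plotted against the (also fuzzified) control limits, which is why the chart's behavior depends on the chosen α-cut.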
86

時間序列在品質管制上的應用 / Apply time series to quality control

陳繼書, Chen, Gi Sue Unknown Date (has links)
Traditionally, quality control procedures such as the Shewhart control chart or the cumulative-sum (CUSUM) chart assume that the observations are drawn independently from a normal distribution with mean μ and standard deviation σ. When the independence assumption fails, that is, when the product characteristic exhibits autocorrelation, such charts can give misleading results and a more reliable decision method is needed. This thesis applies time series models to the control charting of correlated observations. It also uses nonlinear time series models, together with the structural-change concept, and the special-cause control chart to examine whether Taiwan's business-cycle monitoring indicators are in a state of control, and discusses the run length distribution of the special-cause control chart. The concluding case study introduces the concept of automatic process control as an explanation of this nonlinear process.
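The standard remedy sketched in this abstract is a residual chart: fit a time series model to the autocorrelated observations, then apply ordinary 3-sigma limits to the residuals, which are approximately independent if the model fits. A minimal AR(1) version, with simulated data standing in for a real series:

```python
import numpy as np

def ar1_residuals(x):
    """Estimate the AR(1) coefficient by least squares and return residuals."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    phi = (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])  # lag-1 regression slope
    return x[1:] - phi * x[:-1], phi

rng = np.random.default_rng(7)
e = rng.normal(0, 1, 500)
x = np.zeros(500)
for t in range(1, 500):            # simulate an AR(1) process with phi = 0.8
    x[t] = 0.8 * x[t - 1] + e[t]

resid, phi_hat = ar1_residuals(x)
limit = 3 * resid.std()            # chart the residuals against +/- limit
signals = np.sum(np.abs(resid) > limit)
```

Charting `x` directly with Shewhart limits would produce the misleading behavior the abstract warns about; charting `resid` restores the independence assumption the limits rely on.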
87

System Availability Maximization and Residual Life Prediction under Partial Observations

Jiang, Rui 10 January 2012 (has links)
Many real-world systems experience deterioration with usage and age, which often leads to low product quality, high production cost, and low system availability. Most previous maintenance and reliability models in the literature do not incorporate condition monitoring information for decision making, which often results in poor failure prediction for partially observable deteriorating systems. For that reason, the development of fault prediction and control scheme using condition-based maintenance techniques has received considerable attention in recent years. This research presents a new framework for predicting failures of a partially observable deteriorating system using Bayesian control techniques. A time series model is fitted to a vector observation process representing partial information about the system state. Residuals are then calculated using the fitted model, which are indicative of system deterioration. The deterioration process is modeled as a 3-state continuous-time homogeneous Markov process. States 0 and 1 are not observable, representing healthy (good) and unhealthy (warning) system operational conditions, respectively. Only the failure state 2 is assumed to be observable. Preventive maintenance can be carried out at any sampling epoch, and corrective maintenance is carried out upon system failure. The form of the optimal control policy that maximizes the long-run expected average availability per unit time has been investigated. It has been proved that a control limit policy is optimal for decision making. The model parameters have been estimated using the Expectation Maximization (EM) algorithm. The optimal Bayesian fault prediction and control scheme, considering long-run average availability maximization along with a practical statistical constraint, has been proposed and compared with the age-based replacement policy. The optimal control limit and sampling interval are calculated in the semi-Markov decision process (SMDP) framework. 
Another Bayesian fault prediction and control scheme has been developed based on the average run length (ARL) criterion. Comparisons with traditional control charts are provided. Formulae for the mean residual life and the distribution function of system residual life have been derived in explicit forms as functions of a posterior probability statistic. The advantage of the Bayesian model over the well-known 2-parameter Weibull model in system residual life prediction is shown. The methodologies are illustrated using simulated data, real data obtained from the spectrometric analysis of oil samples collected from transmission units of heavy hauler trucks in the mining industry, and vibration data from a planetary gearbox machinery application.
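The Bayesian control idea can be illustrated in a simplified discrete-time form: at each sampling epoch, update the posterior probability that the hidden state is "unhealthy" (1) rather than "healthy" (0) given the observed residual, and trigger preventive maintenance when the posterior crosses a control limit. The transition probabilities, observation densities and data below are all invented for illustration and do not reproduce the thesis's continuous-time model or SMDP computations.

```python
import math

P = {0: {0: 0.95, 1: 0.05},  # healthy tends to stay healthy
     1: {0: 0.00, 1: 1.00}}  # deterioration is not self-repairing

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def update_posterior(pi, residual):
    """One Bayes step: predict with P, then weight by the state likelihoods."""
    pred1 = pi * P[1][1] + (1 - pi) * P[0][1]  # prior probability of warning
    lik0 = normal_pdf(residual, 0.0, 1.0)      # residuals near 0 when healthy
    lik1 = normal_pdf(residual, 2.0, 1.0)      # shifted when deteriorating
    num = pred1 * lik1
    return num / (num + (1 - pred1) * lik0)

pi = 0.0
for r in [0.1, -0.3, 0.2, 1.8, 2.2, 2.5]:      # residuals drifting upward
    pi = update_posterior(pi, r)
# pi is now close to 1: a control-limit policy would call for maintenance.
```

The thesis's result that a control-limit policy is optimal means the decision rule really is this simple at run time: compare the posterior statistic with a single threshold computed offline.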
89

應用資料包絡法降低電源轉換器溫升之研究 / Applying Data Envelopment Analysis to Reduce the Temperature Rise of Power Converters

廖合, Liao, Ho Unknown Date (has links)
From a performance point of view, quality and cost are conceptually consistent, so performance management should use quality and cost as the criteria for whether its goals are met. This study takes a performance perspective on the quality-cost dilemma faced by a company. As electronic products become more versatile, heat problems follow: heat density keeps rising, and the demands on thermal design receive ever more attention. The subject of this study is a power converter whose design is complete and has passed UL safety certification in the United States, but whose temperature rise and its variation are large; reducing both is therefore an urgent problem. The aim is to find the optimal combination of external components that is robust to uncontrollable (noise) factors, yields small product variation, and minimizes the temperature rise and losses of each component. Experiments were planned and conducted, and data collected, using Taguchi methods and design of experiments. A weighted (multi-response) S/N ratio was adopted, with the weights determined by (1) a control chart method and (2) the CCR assurance region model of data envelopment analysis (the CCR-AR model), in order to select the control factors and their levels. The Mahalanobis-Taguchi System (MTS) was applied to the matrix-experiment data to screen out the more important characteristic variables, and the data for those variables were then analyzed with (1) a back-propagation neural network combined with data envelopment analysis and (2) data envelopment analysis combined with principal component analysis, yielding the optimal factor combination of case drilling-hole shape and silicone pad size. The confirmation experiment after improvement shows that although the mean temperature rise decreased only slightly, the standard deviation of the temperature rise at most measurement points became significantly smaller, so this study clearly reduced the variation in the converter's temperature rise.
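The weighted multi-response S/N ratio used above can be sketched directly. Temperature rise is a smaller-the-better characteristic, for which the standard Taguchi S/N ratio is -10·log10 of the mean squared response; the weights and measurements below are invented for illustration, whereas in the study the weights come from the control chart or CCR-AR method.

```python
import math

def sn_smaller_the_better(values):
    """Taguchi S/N ratio (dB) for a smaller-the-better response."""
    return -10 * math.log10(sum(v * v for v in values) / len(values))

def weighted_sn(responses, weights):
    """Combine per-response S/N ratios with normalized weights."""
    total = sum(weights)
    return sum(w / total * sn_smaller_the_better(r)
               for r, w in zip(responses, weights))

temp_rise = [[30.1, 29.8, 30.5], [12.2, 12.0, 12.4]]  # two measurement points
score = weighted_sn(temp_rise, weights=[0.6, 0.4])
# A higher (less negative) weighted S/N means lower, more consistent
# temperature rise, so factor levels are chosen to maximize this score.
```

Because the S/N ratio penalizes both the mean and the spread of the response, maximizing it targets exactly the outcome the confirmation experiment reports: smaller variation even when the mean barely moves.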
90

Development of statistical methods for the surveillance and monitoring of adverse events which adjust for differing patient and surgical risks

Webster, Ronald A. January 2008 (has links)
The research in this thesis has been undertaken to develop statistical tools for monitoring adverse events in hospitals that adjust for varying patient risk. The studies involved a detailed literature review of risk adjustment scores for patient mortality following cardiac surgery, comparison of institutional performance, the performance of risk adjusted CUSUM schemes for varying risk profiles of the populations being monitored, the effects of uncertainty in the estimates of expected probabilities of mortality on the performance of risk adjusted CUSUM schemes, and the instability of the estimated average run lengths of risk adjusted CUSUM schemes found using the Markov chain approach. The literature review of cardiac surgical risk found that the number of risk factors in a risk model and its discriminating ability were independent, that the risk factors could be classified into their "dimensions of risk", and that a risk score could not be generalized to populations remote from its developmental database if accurate predictions of patients' probabilities of mortality were required. The conclusions were that an institution could use an "off the shelf" risk score, provided it was recalibrated, or it could construct a customized risk score with risk factors that provide at least one measure for each dimension of risk. The use of report cards to publish adverse outcomes as a tool for quality improvement has been criticized in the medical literature. An analysis of the report cards for cardiac surgery in New York State showed that the institutions' outcome rates appeared overdispersed compared to the model used to construct confidence intervals, and that the uncertainty associated with the estimation of institutions' outcome rates could be mitigated with trend analysis.
A second analysis of the mortality of patients admitted to coronary care units demonstrated the use of notched box plots, fixed and random effect models, and risk adjusted CUSUM schemes as tools to identify outlying hospitals. An important finding from the literature review was that the primary reason for publication of outcomes is to ensure that health care institutions are accountable for the services they provide. A detailed review of the risk adjusted CUSUM scheme was undertaken and the use of average run lengths (ARLs) to assess the scheme, as the risk profile of the population being monitored changes, was justified. The ARLs for in-control and out-of-control processes were found to increase markedly as the average outcome rate of the patient population decreased towards zero. A modification of the risk adjusted CUSUM scheme, where the step size for in-control to out-of-control outcome probabilities were constrained to no less than 0.05, was proposed. The ARLs of this "minimum effect" CUSUM scheme were found to be stable. The previous assessment of the risk adjusted CUSUM scheme assumed that the predicted probability of a patient's mortality is known. A study of its performance, where the estimates of the expected probability of patient mortality were uncertain, showed that uncertainty at the patient level did not affect the performance of the CUSUM schemes, provided that the risk score was well calibrated. Uncertainty in the calibration of the risk model appeared to cause considerable variation in the ARL performance measures. The ARLs of the risk adjusted CUSUM schemes were approximated using simulation because the approximation method using the Markov chain property of CUSUMs, as proposed by Steiner et al. (2000), gave unstable results. The cause of the instability was the method of computing the Markov chain transition probabilities, where probability is concentrated at the midpoint of its Markov state. 
If probability was assumed to be uniformly distributed over each Markov state, the ARLs were stabilized, provided that the scores for the patients' risk of adverse outcomes were discrete and finite.
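The risk adjusted CUSUM at the center of these studies scores each patient with a log-likelihood ratio that depends on the patient's predicted probability of death p, testing an in-control odds ratio of 1 against an out-of-control odds ratio R (the Steiner et al. construction cited above). A minimal sketch, with risks, outcomes and the decision limit h invented for illustration:

```python
import math

def ra_cusum_weight(p, died, R=2.0):
    """Log-likelihood-ratio weight for one patient with predicted risk p."""
    if died:
        return math.log(R / (1 - p + R * p))
    return math.log(1 / (1 - p + R * p))

def run_cusum(patients, h=4.5, R=2.0):
    """Return the CUSUM path; a value above h signals deteriorating care."""
    s, path = 0.0, []
    for p, died in patients:
        s = max(0.0, s + ra_cusum_weight(p, died, R))
        path.append(s)
    return path

path = run_cusum([(0.05, False), (0.20, True), (0.10, False), (0.40, True)])
```

A death for a low-risk patient adds a large positive weight while a survival subtracts a small one, which is exactly how the scheme adjusts the evidence for each patient's pre-operative risk; the thesis's "minimum effect" modification constrains the implied probability step to stabilize the scheme's ARLs.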
