31

On Development and Performance Evaluation of Some Biosurveillance Methods

Zheng, Hongzhang 09 August 2011 (has links)
This study examines three applications of control charts used for monitoring syndromic data with different characteristics. The first part develops a seasonal autoregressive integrated moving average (SARIMA) based surveillance chart and compares it with the CDC Early Aberration Reporting System (EARS) W2c method using both authentic and simulated data. After the long-term trend and seasonality in the syndromic data are removed, the SARIMA approach is shown to outperform the EARS method in terms of two key surveillance characteristics: the false alarm rate and the average time to detect outbreaks. In the second part, we propose a generalized likelihood ratio (GLR) control chart to detect a wide range of shifts in the mean of Poisson distributed biosurveillance data. The application of a sign function to the original GLR chart statistics leads to downward-sided, upward-sided, and two-sided GLR chart statistics in a unified framework. To facilitate the use of such charts in practice, we provide detailed guidance on developing and implementing the GLR chart. Under the steady-state framework, this study indicates that the overall GLR chart performance in detecting a range of shifts of interest is superior to the performance of traditional control charts, including the EARS method, Shewhart charts, EWMA charts, and CUSUM charts. Health care related data often contain an excessive number of zeros, and zero-inflated Poisson (ZIP) models are more appropriate than Poisson models for describing such data. The last part of the dissertation considers the GLR chart for ZIP data under a research framework similar to the second part. Because small sample sizes may influence the estimation of ZIP parameters, the efficiency of the MLEs is investigated in depth, followed by suggestions for improvement. Numerical approaches to solving for the MLEs are discussed as well. Statistics for a set of GLR charts are derived, followed by modifications changing them from two-sided statistics to one-sided statistics. Although this is not a complete study of GLR charts for ZIP processes, due to limited time and resources, suggestions for future work are proposed at the end of this dissertation. / Ph. D.
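As a rough illustration of the upward-sided GLR statistic for Poisson counts described in this abstract, the sketch below scans a window of candidate change points and maximizes the Poisson log-likelihood ratio over them. It is not the dissertation's exact formulation; the window length, control limit, and simulated counts are assumptions.

```python
import numpy as np

def poisson_glr_upward(counts, mu0, window=50, h=5.0):
    """Upward-sided GLR monitoring statistic for a Poisson mean.

    counts : sequence of syndromic counts over time
    mu0    : in-control mean (assumed known)
    window : number of past candidate change points examined
    h      : control limit; in practice it would be chosen by simulation
             to meet a target false alarm rate and time to signal
    Returns the GLR statistic at each time and the first alarm index (or None).
    """
    x = np.asarray(counts, dtype=float)
    stats = np.zeros(len(x))
    alarm = None
    for n in range(1, len(x) + 1):
        best = 0.0
        for k in range(max(0, n - window), n):        # candidate change points
            seg = x[k:n]
            mu1 = max(mu0, seg.mean())                # restricted MLE for an upward shift
            llr = seg.sum() * np.log(mu1 / mu0) - len(seg) * (mu1 - mu0)
            best = max(best, llr)
        stats[n - 1] = best
        if alarm is None and best > h:
            alarm = n - 1
    return stats, alarm

# Example: in-control mean 4, shifting to 7 after day 60 (simulated data).
rng = np.random.default_rng(0)
counts = np.concatenate([rng.poisson(4, 60), rng.poisson(7, 30)])
print(poisson_glr_upward(counts, mu0=4.0)[1])
```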
32

Detection of the Change Point and Optimal Stopping Time by Using Control Charts on Energy Derivatives

AL, Cihan, Koroglu, Kubra January 2011 (has links)
No description available.
33

Anomaly Detection for Portfolio Risk Management : An evaluation of econometric and machine learning based approaches to detecting anomalous behaviour in portfolio risk measures / Avvikelsedetektering för Riskhantering av Portföljer : En utvärdering utav ekonometriska och maskininlärningsbaserade tillvägagångssätt för att detektera avvikande beteende hos portföljriskmått

Westerlind, Simon January 2018 (has links)
Financial institutions manage numerous portfolios whose risk must be managed continuously, and the large amount of data that has to be processed makes this a considerable effort. As such, a system that autonomously detects anomalies in the risk measures of financial portfolios would be of great value. To this end, two econometric models, ARMA-GARCH and EWMA, and two machine learning based algorithms, LSTM and HTM, were evaluated for the task of performing unsupervised anomaly detection on streaming time series of portfolio risk measures. Three datasets of returns and Value-at-Risk series were synthesized, and one dataset of real-world Value-at-Risk series had labels handcrafted for the experiments in this thesis. The results revealed that the LSTM has great potential in this domain, due to its ability to adapt to different types of time series and its effectiveness at finding a wide range of anomalies. The EWMA had the benefit of being faster and more interpretable, but lacked the ability to capture anomalous trends. The ARMA-GARCH was found to have difficulty finding a good fit to the time series of risk measures, resulting in poor performance, and the HTM was outperformed by the other algorithms in every regard, due to an inability to learn the autoregressive behaviour of the time series. / Finansiella institutioner hanterar otaliga portföljer vars risk måste hanteras kontinuerligt, och den stora mängden data som måste processeras gör detta till en omfattande uppgift. Därför skulle ett system som autonomt kan upptäcka avvikelser i de finansiella portföljernas riskmått vara av stort värde. I detta syfte undersöks två ekonometriska modeller, ARMA-GARCH och EWMA, samt två maskininlärningsmodeller, LSTM och HTM, för ändamålet att kunna utföra så kallad oövervakad avvikelsedetektering på strömmande tidsseriedata av portföljriskmått. Tre dataset syntetiserades med avkastningar och Value-at-Risk-serier, och ett dataset med verkliga Value-at-Risk-serier fick handgjorda etiketter till experimenten i denna avhandling. Resultaten visade att LSTM har stor potential i denna domän, tack vare sin förmåga att anpassa sig till olika typer av tidsserier och för att effektivt lyckas finna varierade sorters anomalier. Däremot hade EWMA fördelen av att vara den snabbaste och enklaste att tolka, men den saknade förmågan att finna avvikande trender. ARMA-GARCH hade svårigheter med att modellera tidsserier utav riskmått, vilket resulterade i att den presterade dåligt. HTM blev utpresterad utav de andra algoritmerna i samtliga hänseenden, på grund utav dess oförmåga att lära sig tidsseriernas autoregressiva beteende.
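A minimal sketch of the EWMA-style detector evaluated in this thesis: each new point in a risk-measure series is compared against an exponentially weighted estimate of the mean and variance, and flagged if it deviates by more than k standard deviations. The smoothing weight, threshold, and synthetic series are assumptions, not the thesis's settings.

```python
import numpy as np

def ewma_anomaly_flags(series, lam=0.1, k=3.0):
    """Flag points that deviate from an EWMA forecast by more than k estimated
    standard deviations (illustrative; lam and k are assumed tuning parameters)."""
    x = np.asarray(series, dtype=float)
    mean, var = x[0], 0.0
    flags = [False]
    for obs in x[1:]:
        resid = obs - mean
        flags.append(bool(var > 0 and abs(resid) > k * np.sqrt(var)))
        # update the exponentially weighted mean and variance with the new point
        mean = (1 - lam) * mean + lam * obs
        var = (1 - lam) * (var + lam * resid ** 2)
    return np.array(flags)

# Example on a synthetic Value-at-Risk series with one injected jump.
rng = np.random.default_rng(2)
var_series = np.abs(rng.normal(1.0, 0.05, 200))
var_series[150] += 0.5
print(np.where(ewma_anomaly_flags(var_series))[0])
```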
34

Evaluation of US and European hedge funds and associated international markets : a risk-performance measure approach / Wilhelmine Helena Brand

Brand, Wilhelmine Helena January 2014 (has links)
The 2007–2009 financial crisis led to a decrease in consumer and investor confidence worldwide (SARB, 2008:2). Along with the weakened business sentiment and consumer demand, tightened funding conditions in financial markets, increased inflationary pressures, and declining global manufacturing activities, the world economic recession that followed the collapse of the world financial sector led to an estimated wealth destruction of approximately US$50 trillion (SARB, 2008:2; Aisen & Franken, 2010:3; Karunanayake et al., 2010). Apart from this estimate, the International Monetary Fund (IMF) also projected that the global bank balance sheets in advanced countries suffered losses of approximately US$4 trillion during the period 2009–2010 (Aisen & Franken, 2010:3). As a result, investors have become more risk-averse (Guiso et al., 2013:1), and the consequences of the financial crisis made securing profitable investment decisions extremely difficult, as market volatility tends to increase during crisis periods (Karunanayake et al., 2010; Schwert, 1989:83). With the financial environment in distress, some fund managers consider equities the preferred asset class for protecting the purchasing power of their clients (Ivan, 2013). However, the studies of Ennis and Sebastian (2003) and Nicholas (2004) found evidence that hedge funds will outperform equity markets during a downswing in financial markets. In addition, hedge funds are considered market-neutral due to these investment funds' unrestricted investment flexibility and more efficient market timing abilities (Ennis & Sebastian, 2003). Hedge funds are also considered more unconventional assets for improving portfolio diversification (Lamm, 1999:87), where the variety of investment strategies available in a hedge fund can satisfy investors with several different risk preferences (Shin, 2012). Still, a number of previous studies have presented conflicting evidence regarding the performance of hedge funds and their persistence in outperforming other markets. This led to the objective of this study: to evaluate the risk-adjusted performance of US and EU hedge funds compared to the associated world equity markets over the 2007–2009 financial crisis. The evidence from this study confirmed the dominance of hedge funds over the CAC 40, DAX, S&P 500 and Dow Jones from 2004 to 2011, emphasising that the performance of the US and EU hedge funds would overshadow a normal buy-and-hold strategy on the world equity markets under investigation. Overall, the Sharpe, Sortino, Jensen's alpha, Treynor and Calmar ratios illustrated that US hedge funds outperformed both EU hedge funds and the associated equity markets over this period. The presence of non-normality among the return distributions led to the use of the Omega ratio as the proper benchmark, which also confirmed the outperformance of US hedge funds over EU hedge funds and associated world equity markets. / MCom (Risk Management), North-West University, Vaal Triangle Campus, 2014
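For reference, a sketch of how three of the risk-adjusted measures named in this abstract (Sharpe, Sortino, and Omega) are commonly computed from a return series; the risk-free rate and threshold are assumed values, and the remaining ratios (Jensen's alpha, Treynor, Calmar) require benchmark or drawdown data not shown here.

```python
import numpy as np

def risk_performance_ratios(returns, rf=0.0, threshold=0.0):
    """Illustrative Sharpe, Sortino, and Omega ratios for a single return series."""
    r = np.asarray(returns, dtype=float)
    excess = r - rf
    sharpe = excess.mean() / excess.std(ddof=1)
    downside = np.minimum(r - threshold, 0.0)            # returns below the threshold
    sortino = (r.mean() - threshold) / np.sqrt(np.mean(downside ** 2))
    gains = np.maximum(r - threshold, 0.0).sum()
    losses = -downside.sum()
    omega = gains / losses
    return {"Sharpe": sharpe, "Sortino": sortino, "Omega": omega}

# Example with simulated monthly returns and an assumed monthly risk-free rate.
rng = np.random.default_rng(1)
print(risk_performance_ratios(rng.normal(0.01, 0.04, 96), rf=0.003))
```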
36

Studies in the electrocardiogram monitoring indices.

Guo, Chin-yuan 16 July 2004 (has links)
A recent finding shows that heart rate data possess a self-similar property, characterized by a parameter H, as well as a long-range dependence parameter d. In the first part, we estimate H by the EBP (embedded branching process) method to derive the fractional parameter d. The heart rate and R-R interval data are found to have a high differencing parameter (d = 0.8–0.9) and to violate the normality assumption. Thus the heart rate and R-R interval data are first fractionally differenced of order 0.5 to achieve stationarity. In the second part, we analyze the R-R interval data from PhysioNet and obtain the long-range parameters. After fractional differencing of order 0.5, the EBP method is applied to estimate the long-range parameter d. EWMA and EWRMS control charts of the I(d) processes are constructed to monitor the heart rate mean level and variability, respectively, for the 18 R-R interval data sets from PhysioNet. For the EWMA control chart, the out-of-control percentages are close to the nominal probability. However, for the EWRMS control charts the out-of-control percentages are affected by the skewness and kurtosis of the process distribution. Generally speaking, the I(d)-EWMA and I(d)-EWRMS control charts provide a proper monitoring system for heart rate mean level and variability.
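A sketch of the two steps this abstract describes, fractional differencing to order 0.5 followed by an EWMA chart on the differenced series, using textbook parameter choices rather than the thesis's settings (the EWRMS variability chart is not reproduced).

```python
import numpy as np

def frac_diff(x, d=0.5, n_weights=100):
    """Fractionally difference x to order d using truncated binomial weights
    (a sketch of the I(d) preprocessing step; the truncation length is assumed)."""
    w = np.zeros(n_weights)
    w[0] = 1.0
    for k in range(1, n_weights):
        w[k] = -w[k - 1] * (d - k + 1) / k
    return np.convolve(np.asarray(x, dtype=float), w)[:len(x)]

def ewma_chart(z, lam=0.2, L=3.0):
    """EWMA statistics and control limits for the (approximately stationary)
    differenced series; lam and L are common textbook choices, not thesis values."""
    z = np.asarray(z, dtype=float)
    mu, sigma = z.mean(), z.std(ddof=1)
    e = np.empty(len(z))
    e[0] = lam * z[0] + (1 - lam) * mu
    for t in range(1, len(z)):
        e[t] = lam * z[t] + (1 - lam) * e[t - 1]
    t = np.arange(1, len(z) + 1)
    width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    return e, mu - width, mu + width
```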
37

Online transaction simulation system of the Taiwan Stock Exchange

Liu, Hui-Wen 23 July 2008 (has links)
The Taiwan securities market is a typical order-driven market, and transactions have been matched through an electronic trading system since 1988. In this work, we study the joint distributions of tick-size changes of the bid price and ask price, bid volume, and ask volume for each matching order on the Taiwan Stock Exchange (TSEC). The exponentially weighted moving average (EWMA) method is adopted to update the joint distribution of the aforementioned incoming-order variables. We propose five methods to determine the update timing and consider three different initial matrices for the joint distributions. In the empirical study, the daily matching data of two companies, Uni-President Enterprises Corporation and Formosa Plastics Corporation, in April 2005 are considered. The goodness of fit of the joint distributions is assessed with the chi-square goodness-of-fit test. The results show that the EWMA method provides a good fit for most of the daily transaction data.
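A minimal sketch of the EWMA update of an empirical joint distribution described here: each incoming matched order shifts probability mass toward the cell it falls in. The grid discretisation and smoothing weight are assumptions, and the thesis's five update-timing rules are not reproduced.

```python
import numpy as np

def ewma_update(joint_prob, new_cell, lam=0.05):
    """One EWMA update of an empirical joint distribution after a matching order.

    joint_prob : array of cell probabilities (e.g. bid tick change x ask tick change),
                 an assumed discretisation
    new_cell   : index tuple of the cell the incoming observation falls in
    lam        : weight given to the newest observation (an assumed value)
    """
    indicator = np.zeros_like(joint_prob)
    indicator[new_cell] = 1.0
    updated = (1 - lam) * joint_prob + lam * indicator
    return updated / updated.sum()      # keep it a proper probability distribution

# Example: a 5 x 5 grid of tick-change combinations, initially uniform.
p = np.full((5, 5), 1 / 25)
p = ewma_update(p, (2, 3))
print(p[2, 3])
```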
38

Rizika použití VAR modelů při řízení portfolia / Risks of using VaR models for portfolio management

Antonenko, Zhanna January 2014 (has links)
The diploma thesis Risks of using VaR models for portfolio management focuses on estimation of the portfolio VaR using basic and modified methods. The goal of the thesis is to point out some weaknesses of the basic methods and to demonstrate the estimation of VaR using improved methods that overcome these problems. The analysis will be performed both theoretically and in practice. Only market risk will be the subject of the study. Several simulation and parametric methods will be introduced.
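As a point of reference for the basic methods the thesis starts from, here is a sketch of two standard one-day VaR estimators, historical simulation and the normal parametric (variance-covariance) method; the confidence level and simulated returns are assumptions.

```python
import numpy as np
from scipy.stats import norm

def value_at_risk(returns, alpha=0.99):
    """Two basic one-day VaR estimates from a portfolio return series (illustrative)."""
    r = np.asarray(returns, dtype=float)
    hist_var = -np.quantile(r, 1 - alpha)                          # historical simulation
    param_var = -(r.mean() + r.std(ddof=1) * norm.ppf(1 - alpha))  # normal approximation
    return hist_var, param_var

# Example on simulated daily portfolio returns.
rng = np.random.default_rng(3)
print(value_at_risk(rng.normal(0.0005, 0.012, 500)))
```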
39

A Naive, Robust and Stable State Estimate

Remund, Todd Gordon 18 June 2008 (has links) (PDF)
A naive approach to filtering for feedback control of dynamic systems that is robust and stable is proposed. Simulations are run on the filters presented to investigate the robustness properties of each filter, and the filters are compared using the usual mean squared error. The filters included are the classic Kalman filter, the Krein space Kalman filter, two adjustments to the Krein filter (with input modeling and a second uncertainty parameter), a newly developed filter called the Naive filter, the bias-corrected Naive filter, the exponentially weighted moving average (EWMA) Naive filter, and the bias-corrected EWMA Naive filter.
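For orientation, a sketch of two of the baselines named in this abstract in their textbook scalar forms, the classic Kalman filter (with a random-walk state model) and an EWMA smoother; the noise variances and smoothing weight are assumed values, and the dissertation's Naive and Krein-space filters are not reproduced here.

```python
import numpy as np

def kalman_1d(y, q=1e-3, r=1e-1, x0=0.0, p0=1.0):
    """Textbook scalar Kalman filter with a random-walk state model."""
    x, p, est = x0, p0, []
    for obs in y:
        p = p + q                  # predict step (random-walk state)
        k = p / (p + r)            # Kalman gain
        x = x + k * (obs - x)      # update with the new observation
        p = (1 - k) * p
        est.append(x)
    return np.array(est)

def ewma_estimate(y, lam=0.3):
    """EWMA smoother of the same measurements, for comparison."""
    est = [y[0]]
    for obs in y[1:]:
        est.append(lam * obs + (1 - lam) * est[-1])
    return np.array(est)
```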
40

Process Monitoring with Multivariate Data: Varying Sample Sizes and Linear Profiles

Kim, Keunpyo 01 December 2003 (has links)
Multivariate control charts are used to monitor a process when more than one quality variable associated with the process is being observed. The multivariate exponentially weighted moving average (MEWMA) control chart is one of the most commonly recommended tools for multivariate process monitoring. The standard practice, when using the MEWMA control chart, is to take samples of fixed size at regular sampling intervals for each variable. In the first part of this dissertation, MEWMA control charts based on sequential sampling schemes with two possible stages are investigated. When sequential sampling with two possible stages is used, observations at a sampling point are taken in two groups, and the number of groups actually taken is a random variable that depends on the data. The basic idea is that sampling starts with a small initial group of observations, and no additional sampling is done at this point if there is no indication of a problem with the process. But if there is some indication of a problem with the process, then an additional group of observations is taken at this sampling point. The performance of the sequential sampling (SS) MEWMA control chart is compared to the performance of standard control charts. It is shown that the SS MEWMA chart is substantially more efficient in detecting changes in the process mean vector than standard control charts that do not use sequential sampling. The situation is also considered where different variables may have different measurement costs. MEWMA control charts with unequal sample sizes based on differing measurement costs are investigated in order to improve the performance of process monitoring. Sequential sampling plans are applied to MEWMA control charts with unequal sample sizes and compared to the standard MEWMA control charts with a fixed sample size. The steady-state average time to signal (SSATS) is computed using simulation and compared for some selected sets of sample sizes. When different variables have significantly different measurement costs, using unequal sample sizes can be more cost effective than using the same fixed sample size for each variable. In the second part of this dissertation, control chart methods are proposed for process monitoring when the quality of a process or product is characterized by a linear function. In the historical analysis of Phase I data, methods including the use of a bivariate T² chart to check for stability of the regression coefficients in conjunction with a univariate Shewhart chart to check for stability of the variation about the regression line are recommended. The use of three univariate control charts in Phase II is recommended. These three charts are used to monitor the Y-intercept, the slope, and the variance of the deviations about the regression line, respectively. A simulation study shows that this type of Phase II method can detect sustained shifts in the parameters better than competing methods in terms of average run length (ARL) performance. The monitoring of linear profiles is also related to the control charting of regression-adjusted variables and other methods. / Ph. D.
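A minimal sketch of the standard MEWMA statistic that the dissertation builds on, computed for fixed sample sizes; the sequential-sampling and unequal-sample-size schemes studied here are not reproduced, and the smoothing constant and simulated data are assumptions.

```python
import numpy as np

def mewma_statistics(X, mean, cov, lam=0.1):
    """MEWMA chart statistics for multivariate observations X (one row per sample).
    An alarm would be signalled when the statistic exceeds a control limit h."""
    X = np.asarray(X, dtype=float)
    cov = np.asarray(cov, dtype=float)
    n, p = X.shape
    z = np.zeros(p)
    t2 = np.zeros(n)
    for t in range(n):
        z = lam * (X[t] - mean) + (1 - lam) * z
        # exact covariance of the MEWMA vector at time t + 1
        factor = lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1)))
        t2[t] = z @ np.linalg.solve(factor * cov, z)
    return t2

# Example: bivariate in-control data followed by a mean shift in the first variable.
rng = np.random.default_rng(4)
data = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal([1.0, 0.0], 1, (20, 2))])
print(mewma_statistics(data, mean=np.zeros(2), cov=np.eye(2))[-5:])
```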
