1

Implementation of Anomaly Detection on a Time-series Temperature Data set

Novacic, Jelena, Tokhi, Kablai January 2019 (has links)
Today's society has become more aware of its surroundings, and the focus has shifted towards green technology. The need for a better environmental impact in all areas is growing rapidly, and energy consumption is one of them. A simple solution for automatically controlling the energy consumption of smart homes is through software. With today's IoT technology and machine learning models, the movement towards software-based eco-living is growing. In order to control the energy consumption of a household, sudden abnormal behavior must be detected and adjusted to avoid unnecessary consumption. This thesis uses a time-series data set of temperature data for the implementation of anomaly detection.
Four models were implemented and tested: a Linear Regression model, the Pandas EWM function, an exponentially weighted moving average (EWMA) model and finally a probabilistic exponentially weighted moving average (PEWMA) model. Each model was tested using data sets from nine different apartments covering the same time period. Each model was then evaluated in terms of Precision, Recall and F-measure, with an additional R^2-score evaluation for Linear Regression. The results of this thesis show that, in terms of accuracy, PEWMA outperformed the other models. The EWMA model was slightly better than the Linear Regression model, followed by the Pandas EWM model.
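A detector of the kind the thesis describes can be sketched in a few lines: track an exponentially weighted mean and variance and flag points that fall too far from the running forecast. The smoothing factor, threshold rule and sample temperatures below are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def ewma_anomalies(values, alpha=0.3, threshold=3.0):
    """Flag points whose deviation from the EWMA forecast exceeds
    `threshold` times the exponentially weighted standard deviation."""
    mean = values[0]
    var = 0.0
    flags = [False]  # the first point has no forecast to compare against
    for x in values[1:]:
        resid = x - mean
        std = var ** 0.5
        flags.append(std > 0 and abs(resid) > threshold * std)
        # Update the exponentially weighted mean and variance.
        mean = alpha * x + (1 - alpha) * mean
        var = alpha * resid ** 2 + (1 - alpha) * var
    return flags

temps = [21.0, 21.2, 20.9, 21.1, 21.0, 27.5, 21.1]  # hypothetical readings
print([i for i, f in enumerate(ewma_anomalies(temps)) if f])  # → [5]
```

The PEWMA variant the thesis prefers additionally dampens the update by the probability of the observation, so a single spike inflates the running statistics less.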
2

Comparing different exchange traded funds in South Africa based on volatility and returns / Wiehan Henri Peyper

Peyper, Wiehan Henri January 2014 (has links)
The increasing sophistication of exchange traded fund (ETF) indexation methods required that a comparison be drawn between the various methodologies. A performance and risk evaluation of four pre-selected ETF indexation categories was conducted to establish the diversification benefits that each contains. Fundamentally weighted, equally weighted and leveraged ETFs were compared to traditional market capitalisation weighted ETFs on the basis of risk and return. While a literature review presented the theory on ETFs and the various statistical measures used for this study, the main findings were obtained empirically from a sample of South African and American ETFs. Several risk-adjusted performance measures were employed to assess the risk and return of each indexation category. Special emphasis was placed on the Omega ratio due to its unique interpretation of the return series' distribution characteristics. The risk of each ETF category was evaluated using the exponentially weighted moving average (EWMA), while the diversification potential was determined by means of a regression analysis based on the single index model. According to the findings, fundamentally weighted ETFs perform best during an upward moving market when compared by standard risk-adjusted performance measures. However, the Omega ratio analysis revealed the inherent unsystematic risk of alternatively indexed ETFs and ranked market capitalisation weighted ETFs as the best performing category. Equally weighted ETFs delivered consistently poor rankings, while leveraged ETFs exhibited a high level of risk associated with the amplified returns of this category. The diversification measurement concurred with the Omega ratio analysis and highlighted the market capitalisation weighted ETFs as the most diversified ETFs in the selection.
Alternatively indexed ETFs consequently deliver higher absolute returns by incurring greater unsystematic risk, while simultaneously reducing the level of diversification in the fund. / MCom (Risk Management), North-West University, Vaal Triangle Campus, 2014
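The Omega ratio the study emphasises can be computed directly from a return series: it is the probability-weighted gains above a threshold divided by the probability-weighted losses below it. A minimal sketch (the sample returns are made up for illustration):

```python
import numpy as np

def omega_ratio(returns, threshold=0.0):
    """Omega ratio: sum of excess returns above the threshold divided by
    the magnitude of the shortfalls below it."""
    r = np.asarray(returns, dtype=float) - threshold
    gains = r[r > 0].sum()
    losses = -r[r < 0].sum()
    return gains / losses if losses > 0 else float("inf")

monthly = [0.02, -0.01, 0.03, -0.02, 0.01]  # hypothetical ETF returns
print(round(omega_ratio(monthly, threshold=0.0), 6))  # → 2.0
```

Because the ratio uses the whole distribution rather than just mean and variance, it penalises the fat-tailed unsystematic risk that, per the study, standard risk-adjusted measures miss in alternatively indexed ETFs.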
4

Monitoring High Quality Processes: A Study of Estimation Errors on the Time-Between-Events Exponentially Weighted Moving Average Schemes

Ozsan, Guney 01 September 2008 (has links) (PDF)
In some production environments the defect rates are considerably low, such that the measurement of the fraction of nonconforming items reaches the parts-per-million level. In such environments, monitoring the number of conforming items between consecutive nonconforming items, namely the time between events (TBE), is often suggested. However, a common practice in the design of control charts for TBE monitoring is the assumption of known process parameters. Nevertheless, in many applications the true values of the process parameters are not known. Their estimates must be determined from a sample obtained from the process at a time when it is expected to operate in a state of statistical control. The additional variability introduced through sampling may significantly affect the performance of a control chart. In this study, the effect of parameter estimation on the performance of Time Between Events Exponentially Weighted Moving Average (TBE EWMA) schemes is examined. Conditional performance is evaluated to show the effect of estimation. Marginal performance is analyzed in order to make recommendations on sample size requirements. A Markov chain approach is used for evaluating the results.
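The two-phase setting studied here can be sketched roughly as follows: a lower-sided TBE EWMA chart signals when the gaps between defects get too short, and the in-control mean is estimated from a Phase I sample rather than known, which is exactly the extra variability this study quantifies. The chart constants and the asymptotic control limit below are illustrative choices, not the thesis's design.

```python
import numpy as np

rng = np.random.default_rng(0)

def tbe_ewma_signal(tbe, mu0, lam=0.1, L=2.0):
    """Lower-sided EWMA on time-between-events data: shrinking gaps
    between nonconforming items mean the defect rate has risen.
    Signals when the EWMA statistic drops below the lower limit."""
    # Asymptotic lower limit for exponential data (mean mu0, std mu0).
    lcl = mu0 - L * mu0 * np.sqrt(lam / (2 - lam))
    z = mu0
    for i, t in enumerate(tbe):
        z = (1 - lam) * z + lam * t
        if z < lcl:
            return i  # index of the first signalling observation
    return None

# Phase I: estimate mu0 from an in-control sample -- the estimation
# step whose effect on chart performance the thesis examines.
phase1 = rng.exponential(scale=100.0, size=50)
mu0_hat = phase1.mean()
# Phase II: the process deteriorates (mean TBE drops from ~100 to 20).
phase2 = rng.exponential(scale=20.0, size=200)
print(tbe_ewma_signal(phase2, mu0_hat))
```

Rerunning this with many Phase I samples shows how the spread of `mu0_hat` translates into the spread of chart performance, the "conditional performance" the abstract refers to.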
5

Surveillance of Poisson and Multinomial Processes

Ryan, Anne Garrett 18 April 2011 (has links)
As time passes, change occurs. With this change comes the need for surveillance. One may be a technician on an assembly line in need of a surveillance technique to monitor the number of defective components produced. On the other hand, one may be an administrator of a hospital in need of surveillance measures to monitor the number of patient falls in the hospital or to monitor surgical outcomes to detect changes in surgical failure rates. A natural choice for ongoing surveillance is the control chart; however, the chart must be constructed in a way that accommodates the situation at hand. Two scenarios involving attribute control charting are investigated here. The first scenario involves Poisson count data where the area of opportunity changes. A modified exponentially weighted moving average (EWMA) chart is proposed to accommodate the varying sample sizes. The performance of this method is compared with the performance of several competing control chart techniques, and recommendations are made regarding the best performing control chart method. This research is a result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech). The second scenario involves monitoring a process where items are classified into more than two categories and the results for these classifications are readily available. A multinomial cumulative sum (CUSUM) chart is proposed to monitor these types of situations. The multinomial CUSUM chart is evaluated through comparisons of performance with competing control chart methods. This research is a result of joint work with Mr. Lee J. Wells (Grado Department of Industrial and Systems Engineering, Virginia Tech) and Dr. William H. Woodall (Department of Statistics, Virginia Tech). / Ph. D.
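The multinomial CUSUM idea can be sketched as a cumulative log-likelihood-ratio score comparing out-of-control category proportions against in-control ones; the proportions, counts and decision limit below are invented for illustration and are not the dissertation's design.

```python
import numpy as np

def multinomial_cusum(counts, p0, p1, h=5.0):
    """CUSUM on per-sample classification counts: each sample is scored
    by the log-likelihood ratio of proportions p1 (out of control)
    versus p0 (in control); signal when the running score crosses h."""
    weights = np.log(np.asarray(p1) / np.asarray(p0))
    s = 0.0
    for i, c in enumerate(counts):
        s = max(0.0, s + float(np.dot(c, weights)))  # reset at zero
        if s > h:
            return i  # index of the first signalling sample
    return None

p0 = [0.90, 0.08, 0.02]  # in-control proportions (good / minor / major)
p1 = [0.80, 0.12, 0.08]  # shift the chart is tuned to detect
samples = [[18, 2, 0]] * 5 + [[15, 3, 2]] * 10  # 20 items per sample
print(multinomial_cusum(samples, p0, p1))  # → 7
```

In-control samples push the score toward its floor at zero, so the chart accumulates evidence only while the classification mix actually drifts toward `p1`.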
6

Adaptive Threshold Method for Monitoring Rates in Public Health Surveillance

Gan, Linmin 07 June 2010 (has links)
We examine some of the methodologies implemented by the Centers for Disease Control and Prevention's (CDC) BioSense program. The program uses data from hospitals and public health departments to detect outbreaks using the Early Aberration Reporting System (EARS). The EARS method W2 allows one to monitor syndrome counts (W2count) from each source and the proportion of counts of a particular syndrome relative to the total number of visits (W2rate). In this dissertation research we investigate the performance of the W2rate method designed using an empiric recurrence interval (RI). An adaptive threshold monitoring method is introduced based on fitting sample data to the underlying distributions, then converting the current value to a Z-score through a p-value. We compare the upper thresholds on the Z-scores required to obtain given values of the recurrence interval for different sets of parameter values. We then simulate one-week outbreaks in our data and calculate the proportion of times these methods correctly signal an outbreak using Shewhart and exponentially weighted moving average (EWMA) charts. Our results indicate that the adaptive threshold method gives more consistent statistical performance across different parameter sets and amounts of baseline historical data used for computing the statistics. In the power analysis, the EWMA chart is superior to its Shewhart counterpart in nearly all cases, and the adaptive threshold method tends to outperform the W2rate method. Two modified W2rate methods proposed in the dissertation also tend to outperform the W2rate method in terms of the RI threshold functions and in the power analysis. / Ph. D.
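The fit-then-convert step can be sketched compactly: fit a distribution to baseline data, compute the upper-tail p-value of the current value, and map that p-value back to a standard-normal Z-score. A normal fit is used here for brevity; the dissertation fits the counts' underlying distribution, which is where the p-value detour earns its keep. The baseline counts are invented for illustration.

```python
from statistics import NormalDist

def adaptive_zscore(baseline_counts, current):
    """Adaptive threshold: distribution fit -> p-value -> Z-score."""
    fitted = NormalDist.from_samples(baseline_counts)  # fit to baseline
    p = 1 - fitted.cdf(current)        # chance of a value at least this large
    return NormalDist().inv_cdf(1 - p) # equivalent standard-normal quantile

baseline = [10, 12, 11, 9, 13, 10, 11, 12]  # hypothetical daily counts
print(round(adaptive_zscore(baseline, 16), 2))
```

The resulting Z-scores are on a common scale regardless of which distribution was fitted, which is what makes a single signalling threshold comparable across data sources.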
7

Controlling High Quality Manufacturing Processes: A Robustness Study of the Lower-sided TBE EWMA Procedure

Pehlivan, Canan 01 September 2008 (has links) (PDF)
In quality control applications, Time-Between-Events (TBE) type observations may be monitored by using Exponentially Weighted Moving Average (EWMA) control charts. A widely accepted model for TBE processes is the exponential distribution, and hence TBE EWMA charts are designed under this assumption. Nevertheless, practical applications do not always conform to the theory, and it is common that the observations do not fit the exponential model. Therefore, control charts that are robust to departures from the assumed distribution are desirable in practice. In this thesis, the robustness of the lower-sided TBE EWMA charts to the assumption of exponentially distributed observations has been investigated. Weibull and lognormal distributions are considered in order to represent the departures from the assumed exponential model, and a Markov chain approach is utilized for evaluating the performance of the chart. By analyzing the performance results, design settings are suggested in order to achieve robust lower-sided TBE EWMA charts.
8

Market Risk: Exponential Weighting in the Value-at-Risk Calculation

Broll, Udo, Förster, Andreas, Siebe, Wilfried 03 September 2020 (has links)
When measuring market risk, credit institutions and Alternative Investment Fund Managers may deviate from equally weighting historical data in their Value-at-Risk calculation and instead use an exponential time series weighting. Exponential weighting in the Value-at-Risk calculation is very popular because it takes changes in market volatility into account immediately, so the VaR can adapt quickly. In less volatile market phases, this leads to a reduction in VaR and thus to lower own funds requirements for credit institutions. However, with exponential weighting a high volatility in the past is quickly forgotten and the VaR may be underestimated. To prevent this, credit institutions and Alternative Investment Fund Managers are not completely free to choose a weighting (decay) factor. This article describes the legal requirements and deals with the calculation of the permissible weighting factor. As an example we use the exchange rate between the Euro and the Polish zloty to estimate the Value-at-Risk. We show the calculation of the weighting factor with two different approaches. The article also discusses exceptions to the general legal requirements.
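The mechanics of exponential weighting can be sketched as a RiskMetrics-style recursion: an observation k steps old carries weight (1 − λ)λ^k, so a small decay factor λ forgets past volatility quickly, which is the underestimation risk the article describes. The parameter values and the normal-quantile scaling below are illustrative assumptions, not the article's prescribed calculation.

```python
from statistics import NormalDist

def ewma_var(returns, lam=0.94, confidence=0.99):
    """One-day Value-at-Risk from an exponentially weighted volatility
    estimate.  lam is the decay (weighting) factor whose permissible
    range is constrained by regulation."""
    sigma2 = returns[0] ** 2
    for x in returns[1:]:
        # Newest squared return gets weight (1 - lam); older ones decay.
        sigma2 = lam * sigma2 + (1 - lam) * x ** 2
    return NormalDist().inv_cdf(confidence) * sigma2 ** 0.5

# Sanity check: constant 1% daily moves give sigma = 1%, so the
# one-day 99% VaR is about 2.33%.
print(round(ewma_var([0.01] * 100), 4))
```

Lowering `lam` toward, say, 0.9 makes the estimate track a calm recent window even more closely, shrinking the reported VaR; that is exactly why supervisors bound the factor from below.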
9

Efficient Sampling Plans for Control Charts When Monitoring an Autocorrelated Process

Zhong, Xin 15 March 2006 (has links)
This dissertation investigates the effects of autocorrelation on the performance of various sampling plans for control charts in detecting special causes that may produce sustained or transient shifts in the process mean and/or variance. Observations from the process are modeled as a first-order autoregressive process plus a random error. Combinations of two Shewhart control charts and combinations of two exponentially weighted moving average (EWMA) control charts based on both the original observations and on the process residuals are considered. Three types of sampling plans are investigated: samples of n = 1, samples of n > 1 observations taken together at one sampling point, or samples of n > 1 observations taken at different times. In comparing these sampling plans it is assumed that the sampling rate in terms of the number of observations per unit time is fixed, so taking samples of n = 1 allows more frequent plotting. The best overall performance of sampling plans for control charts in detecting both sustained and transient shifts in the process is obtained by taking samples of n = 1 and using an EWMA chart combination with an observations chart for the mean and a residuals chart for the variance. The Shewhart chart combination with the best overall performance, though inferior to the EWMA chart combination, is based on samples of n > 1 taken at different times and with an observations chart for the mean and a residuals chart for the variance. / Ph. D.
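The residual-chart idea rests on a simple transformation: fit the autoregressive parameter and chart the one-step residuals, which are approximately uncorrelated. A sketch under simplified assumptions (a pure AR(1) with an assumed coefficient of 0.7; the dissertation's model adds a random measurement error on top):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a first-order autoregressive process.
phi, n = 0.7, 500
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Fit phi by lag-1 least squares and form the residuals a residuals
# chart would plot.
phi_hat = x[1:] @ x[:-1] / (x[:-1] @ x[:-1])
resid = x[1:] - phi_hat * x[:-1]

# The raw series is strongly autocorrelated; the residuals are not,
# which is what restores the independence assumption behind the chart.
lag1_raw = np.corrcoef(x[1:], x[:-1])[0, 1]
lag1_resid = np.corrcoef(resid[1:], resid[:-1])[0, 1]
print(round(lag1_raw, 2), round(lag1_resid, 2))
```

Charting the residuals for variance while keeping an observations chart for the mean, as the best-performing combinations above do, exploits the fact that a mean shift is visible in the observations but contaminates consecutive residuals only briefly.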
10

Value at Risk: A Standard Tool in Measuring Risk: A Quantitative Study on Stock Portfolios

Ofe, Hosea, Okah, Peter January 2011 (has links)
The role of risk management has gained momentum in recent years, most notably after the recent financial crisis. This thesis uses a quantitative approach to evaluate the theory of Value at Risk (VaR), which is considered a benchmark for measuring financial risk. The thesis makes use of both parametric and non-parametric approaches to evaluate the effectiveness of VaR as a standard tool in measuring the risk of a stock portfolio. The study uses the normal distribution, Student's t-distribution, historical simulation and the exponentially weighted moving average at the 95% and 99% confidence levels on the stock returns of Sony Ericsson, the three-month Swedish Treasury bill (STB3M) and Nordea Bank. The evaluations of the VaR models are based on the Kupiec (1995) test. From a general perspective, the results of the study indicate that VaR as a proxy of risk measurement has some imprecision in its estimates. However, this imprecision is not the same for all the approaches. The results indicate that models which assume normality of the return distribution perform worse at both confidence levels than models which assume fatter tails or have leptokurtic characteristics. Another interesting finding is that during periods of high volatility, such as the financial crisis of 2008, the imprecision of the VaR estimates increases. For the parametric approaches, the t-distribution VaR estimates were accurate at the 95% confidence level, while the normal distribution approach produced inaccurate estimates at the 95% confidence level. However, both approaches were unable to provide accurate estimates at the 99% confidence level. For the non-parametric approaches, the exponentially weighted moving average outperformed the historical simulation approach at the 95% confidence level, while at the 99% confidence level both approaches tend to perform equally. The results of this study thus question the reliability of VaR as a standard tool in measuring the risk of a stock portfolio.
It also suggests that more research should be done to improve the accuracy of the VaR approaches, given that the role of risk management in today's business environment is greater than ever before. The study suggests that VaR should be complemented with other risk measures, such as extreme value theory and stress testing, and that more than one backtesting technique should be used to test the accuracy of VaR.
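The Kupiec (1995) proportion-of-failures test used for the backtesting fits in a few lines: it is a likelihood-ratio statistic on the number of VaR exceedances, compared against a chi-square(1) critical value. The day counts in the example are illustrative, not the thesis's data.

```python
import math

def kupiec_pof(n, x, p):
    """Kupiec proportion-of-failures LR statistic for x VaR exceedances
    in n days at exceedance probability p (0.01 for a 99% VaR).
    Values above 3.84 reject the model at the 5% level."""
    if x == 0:
        return -2 * n * math.log(1 - p)
    phat = x / n
    log_l0 = x * math.log(p) + (n - x) * math.log(1 - p)        # model
    log_l1 = x * math.log(phat) + (n - x) * math.log(1 - phat)  # observed
    return -2 * (log_l0 - log_l1)

# 250 trading days of a 99% VaR: ~2.5 exceedances are expected, so 3 is
# consistent with the model while 10 is not.
print(round(kupiec_pof(250, 3, 0.01), 2), round(kupiec_pof(250, 10, 0.01), 2))
```

Because the test looks only at the exceedance count, it cannot detect clustering of exceedances in time; that limitation is one reason the thesis recommends combining several backtesting techniques.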
