  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Exponentially Accurate Error Estimates of Quasiclassical Eigenvalues

Toloza, Julio Hugo 16 December 2002 (has links)
We study the behavior of truncated Rayleigh-Schrödinger series for the low-lying eigenvalues of the time-independent Schrödinger equation in the semiclassical limit of small Planck's constant. Under certain hypotheses on the potential energy, we prove that, for any given small value of Planck's constant, there is an optimal truncation of the series for the approximate eigenvalues, such that the difference between an approximate and actual eigenvalue is smaller than an exponentially small function of Planck's constant. We also prove analogous results concerning the eigenfunctions. / Ph. D.
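The flavor of such a result can be sketched schematically (the symbols and constants here are illustrative, not the thesis's exact statement): truncating the Rayleigh-Schrödinger series at an ħ-dependent optimal order leaves only an exponentially small remainder,

```latex
E(\hbar) \;=\; \sum_{k=0}^{N(\hbar)} E_k \,\hbar^{k} \;+\; \mathcal{O}\!\left(e^{-c/\hbar}\right),
\qquad N(\hbar) \sim \frac{C}{\hbar},
```

where $c, C > 0$ are constants depending on the potential. The point of "optimal truncation" is that the series is generally divergent, so one stops at the order where successive terms are smallest rather than summing indefinitely.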
2

On Exponentially Perfect Numbers Relatively Prime to 15

Kolenick, Joseph F., Jr. 03 December 2007 (has links)
No description available.
3

Implementation of Anomaly Detection on a Time-series Temperature Data set

Novacic, Jelena, Tokhi, Kablai January 2019 (has links)
Today's society has become more aware of its surroundings and the focus has shifted towards green technology. The need for better environmental impact in all areas is rapidly growing and energy consumption is one of them. A simple solution for automatically controlling the energy consumption of smart homes is through software. With today's IoT technology and machine learning models, the movement towards software-based eco-living is growing. In order to control the energy consumption of a household, sudden abnormal behavior must be detected and adjusted to avoid unnecessary consumption. This thesis uses a time-series data set of temperature data for the implementation of anomaly detection.
Four models were implemented and tested: a Linear Regression model, the Pandas EWM function, an exponentially weighted moving average (EWMA) model and finally a probabilistic exponentially weighted moving average (PEWMA) model. Each model was tested using data sets from nine different apartments, from the same time period. An evaluation of each model was then conducted in terms of Precision, Recall and F-measure, with an additional evaluation for Linear Regression using the R^2 score. The results of this thesis show that in terms of accuracy, PEWMA outperformed the other models. The EWMA model was slightly better than the Linear Regression model, followed by the Pandas EWM model.
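An EWMA-style detector of the kind this abstract describes can be sketched in a few lines. This is a minimal illustration, not the thesis's implementation: the smoothing factor, the threshold multiplier, and the sample readings are all arbitrary choices made here.

```python
def ewma_anomalies(series, alpha=0.3, k=3.0):
    """Flag points whose deviation from the running EWMA forecast
    exceeds k times the exponentially weighted standard deviation."""
    mean = series[0]
    var = 0.0
    flags = [False]
    for x in series[1:]:
        resid = x - mean
        # signal only once a variance estimate has accumulated
        flags.append(var > 0 and abs(resid) > k * var ** 0.5)
        # update exponentially weighted mean and variance
        mean = alpha * x + (1 - alpha) * mean
        var = alpha * resid ** 2 + (1 - alpha) * var
    return flags

temps = [21.0, 21.2, 20.9, 21.1, 21.0, 27.5, 21.1, 21.0]
print(ewma_anomalies(temps))  # flags only the 27.5 spike
```

The PEWMA variant evaluated in the thesis additionally scales the smoothing weight by the probability of each observation, so that outliers pull the running mean less strongly.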
4

Comparing different exchange traded funds in South Africa based on volatility and returns / Wiehan Henri Peyper

Peyper, Wiehan Henri January 2014 (has links)
Increasing sophistication of exchange traded fund (ETF) indexation methods required that a comparison be drawn between the various methodologies. A performance and risk evaluation of four pre-selected ETF indexation categories was conducted to establish the diversification benefits that each contains. Fundamentally weighted, equally weighted and leveraged ETFs were compared to traditional market capitalisation weighted ETFs on the basis of risk and return. While a literature review presented the theory on ETFs and the various statistical measures used for this study, the main findings were obtained empirically from a sample of South African and American ETFs. Several risk-adjusted performance measures were employed to assess the risk and return of each indexation category. Special emphasis was placed on the Omega ratio due to its unique interpretation of the return series' distribution characteristics. The risk of each ETF category was evaluated using the exponentially weighted moving average (EWMA), while the diversification potential was determined by means of a regression analysis based on the single index model. According to the findings, fundamentally weighted ETFs perform the best during an upward moving market when compared by standard risk-adjusted performance measures. However, the Omega ratio analysis revealed the inherent unsystematic risk of alternatively indexed ETFs and ranked market capitalisation weighted ETFs as the best performing category. Equally weighted ETFs delivered consistently poor rankings, while leveraged ETFs exhibited the high level of risk associated with their amplified returns. The diversification measurement concurred with the Omega ratio analysis and highlighted the market capitalisation weighted ETFs as the most diversified in the selection.
Alternatively indexed ETFs consequently deliver higher absolute returns by incurring greater unsystematic risk, while simultaneously reducing the level of diversification in the fund. / MCom (Risk Management), North-West University, Vaal Triangle Campus, 2014
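The two statistics this abstract leans on are both easy to state. A minimal sketch, assuming the textbook definitions rather than the thesis's exact estimation choices: the Omega ratio at a threshold is the sum of returns above it divided by the sum of shortfalls below it, and EWMA volatility recursively down-weights older squared returns (the 0.94 decay is the common RiskMetrics convention, used here only as an illustrative default).

```python
def omega_ratio(returns, threshold=0.0):
    """Omega ratio: expected gain above the threshold divided by
    expected shortfall below it."""
    gains = sum(max(r - threshold, 0.0) for r in returns)
    losses = sum(max(threshold - r, 0.0) for r in returns)
    return gains / losses if losses else float("inf")

def ewma_volatility(returns, lam=0.94):
    """EWMA volatility: exponentially down-weight older squared
    returns with decay factor lam."""
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return var ** 0.5
```

Because Omega uses the whole return distribution rather than just its first two moments, it can rank a leveraged or alternatively indexed fund differently from a Sharpe-style measure, which is exactly the divergence the study reports.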
6

Separatrix splitting for the extended standard family of maps

Wronka, Agata Ewa January 2011 (has links)
This thesis presents a two-dimensional discrete dynamical system, the extended standard family of maps, which approximates homoclinic bifurcations of continuous dissipative systems. The main subject of study is the problem of separatrix splitting, which was first discovered by Poincaré in the context of the n-body problem. Separatrix splitting leads to chaotic behaviour of the system on an exponentially small region in parameter space. To estimate the size of the region, the dissipative map is extended to complex variables and approximated by a differential equation on a specific domain. This approach was proposed by Lazutkin to study separatrix splitting for Chirikov's standard map. Furthermore, the complex nearly periodic function is used to estimate the width of the exponentially small region where chaos prevails, and the map is related to the semistandard map. Numerical computations require solving a complex differential equation and provide the constants involved in the asymptotic formula for the size of the region. Another problem studied in this thesis is the prevalence of resonance for the dissipative standard map on a specific invariant set, which for a one-dimensional map corresponds to a circle. The regions in parameter space where periodic behaviour occurs on the invariant set are known as Arnold tongues. The width of an Arnold tongue is studied, and numerical results obtained by iterating the map and solving a differential equation are related to the semistandard map.
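Chirikov's standard map, the reference point for the family studied above, can be iterated in a few lines. This is a generic sketch of the classic area-preserving map in one common convention, not the extended or dissipative family the thesis actually analyses.

```python
import math

def standard_map(theta, p, K, n):
    """Iterate Chirikov's standard map n times:
       p'     = p + K * sin(theta)
       theta' = (theta + p') mod 2*pi
    K is the stochasticity parameter; K = 0 gives pure rotation."""
    for _ in range(n):
        p = p + K * math.sin(theta)
        theta = (theta + p) % (2 * math.pi)
    return theta, p
```

For K = 0 the momentum is conserved and the angle advances linearly; as K grows, resonance zones overlap and chaotic orbits appear, which is the regime where separatrix-splitting estimates matter.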
7

Monitoring High Quality Processes: A Study Of Estimation Errors On The Time-between-events Exponentially Weighted Moving Average Schemes

Ozsan, Guney 01 September 2008 (has links) (PDF)
In some production environments the defect rates are so low that the fraction of nonconforming items reaches the parts-per-million level. In such environments, monitoring the number of conforming items between consecutive nonconforming items, namely the time between events (TBE), is often suggested. However, a common practice in the design of control charts for TBE monitoring is to assume known process parameters. In many applications the true values of the process parameters are not known; their estimates must be determined from a sample obtained from the process at a time when it is expected to operate in a state of statistical control. The additional variability introduced through sampling may significantly affect the performance of a control chart. In this study, the effect of parameter estimation on the performance of time-between-events exponentially weighted moving average (TBE EWMA) schemes is examined. Conditional performance is evaluated to show the effect of estimation, and marginal performance is analyzed in order to make recommendations on sample size requirements. A Markov chain approach is used for evaluating the results.
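A lower-sided TBE EWMA chart of the kind discussed above can be sketched as follows. This is a minimal illustration under an assumed exponential TBE model (so the in-control standard deviation equals the mean); the smoothing constant, control-limit width, and the idea of plugging in an estimated mean are illustrative, not the thesis's design.

```python
def tbe_ewma_signals(times, mu0, lam=0.1, L=2.5):
    """One-sided (lower) EWMA chart on time-between-events data.
    mu0 is the in-control mean TBE; in practice it is often an
    estimate from a Phase I sample, which is exactly the source of
    the estimation error studied in the thesis."""
    sigma = mu0  # exponential TBE: std dev equals the mean
    lcl = mu0 - L * sigma * (lam / (2 - lam)) ** 0.5
    z = mu0  # start the EWMA at the in-control mean
    signals = []
    for t in times:
        z = lam * t + (1 - lam) * z
        signals.append(z < lcl)  # short TBEs => process deteriorated
    return signals
```

When mu0 is replaced by an estimate, both z and the limit shift together, and the chart's run-length distribution becomes conditional on the Phase I sample, which is why the thesis separates conditional from marginal performance.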
8

Surveillance of Poisson and Multinomial Processes

Ryan, Anne Garrett 18 April 2011 (has links)
As time passes, change occurs. With this change comes the need for surveillance. One may be a technician on an assembly line in need of a surveillance technique to monitor the number of defective components produced. On the other hand, one may be an administrator of a hospital in need of surveillance measures to monitor the number of patient falls in the hospital or to monitor surgical outcomes to detect changes in surgical failure rates. A natural choice for on-going surveillance is the control chart; however, the chart must be constructed in a way that accommodates the situation at hand. Two scenarios involving attribute control charting are investigated here. The first scenario involves Poisson count data where the area of opportunity changes. A modified exponentially weighted moving average (EWMA) chart is proposed to accommodate the varying sample sizes. The performance of this method is compared with the performance of several competing control chart techniques and recommendations are made regarding the best performing control chart method. This research is a result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech). The second scenario involves monitoring a process where items are classified into more than two categories and the results for these classifications are readily available. A multinomial cumulative sum (CUSUM) chart is proposed to monitor these types of situations. The multinomial CUSUM chart is evaluated through comparisons of performance with competing control chart methods. This research is a result of joint work with Mr. Lee J. Wells (Grado Department of Industrial and Systems Engineering, Virginia Tech) and Dr. William H. Woodall (Department of Statistics, Virginia Tech). / Ph. D.
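The multinomial CUSUM idea can be sketched compactly. This is a generic log-likelihood-ratio CUSUM, not the specific chart proposed in the dissertation: the in-control probabilities p0, the shifted alternative p1, and the decision limit h are all hypothetical values chosen for illustration.

```python
import math

def multinomial_cusum(observations, p0, p1, h):
    """CUSUM on categorical observations: accumulate the
    log-likelihood ratio of shifted (p1) versus in-control (p0)
    category probabilities, clipped at zero; signal when the
    statistic exceeds the decision limit h."""
    c = 0.0
    signals = []
    for x in observations:
        c = max(0.0, c + math.log(p1[x] / p0[x]))
        signals.append(c > h)
    return signals

# Hypothetical surgical-outcome categories and probabilities:
p0 = {"good": 0.90, "minor": 0.08, "major": 0.02}  # in control
p1 = {"good": 0.80, "minor": 0.12, "major": 0.08}  # shifted
```

Common categories contribute a small negative increment (absorbed by the clip at zero), while rare adverse categories contribute large positive jumps, so a short run of "major" outcomes drives the statistic over the limit quickly.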
9

Adaptive Threshold Method for Monitoring Rates in Public Health Surveillance

Gan, Linmin 07 June 2010 (has links)
We examine some of the methodologies implemented by the Centers for Disease Control and Prevention's (CDC) BioSense program. The program uses data from hospitals and public health departments to detect outbreaks using the Early Aberration Reporting System (EARS). The EARS method W2 allows one to monitor syndrome counts (W2count) from each source and the proportion of counts of a particular syndrome relative to the total number of visits (W2rate). In this dissertation research, we investigate the performance of the W2rate method designed using an empirical recurrence interval (RI). An adaptive threshold monitoring method is introduced based on fitting sample data to the underlying distributions, then converting the current value to a Z-score through a p-value. We compare the upper thresholds on the Z-scores required to obtain given values of the recurrence interval for different sets of parameter values. We then simulate one-week outbreaks in our data and calculate the proportion of times these methods correctly signal an outbreak using Shewhart and exponentially weighted moving average (EWMA) charts. Our results indicate the adaptive threshold method gives more consistent statistical performance across different parameter sets and amounts of baseline historical data used for computing the statistics. For the power analysis, the EWMA chart is superior to its Shewhart counterpart in nearly all cases, and the adaptive threshold method tends to outperform the W2rate method. Two modified W2rate methods proposed in the dissertation also tend to outperform the W2rate method in terms of the RI threshold functions and in the power analysis. / Ph. D.
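The p-value-to-Z-score conversion at the core of the adaptive threshold method can be sketched as follows. For illustration only, the baseline is fitted to a normal distribution here; the dissertation fits the appropriate underlying distribution for the data, and the history values below are hypothetical.

```python
from statistics import NormalDist, mean, stdev

def adaptive_z(history, current):
    """Fit a baseline distribution to historical values (normal here,
    purely for illustration), convert the current value to an
    upper-tail p-value, then map that p-value to a standard-normal
    Z-score so that a single threshold applies across data streams."""
    base = NormalDist(mean(history), stdev(history))
    p = 1.0 - base.cdf(current)           # upper-tail p-value
    return NormalDist().inv_cdf(1.0 - p)  # equivalent Z-score
```

The payoff of the two-step transform is that Z-scores from streams with very different baselines live on a common scale, so one recurrence-interval-based threshold can serve all of them.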
10

Algorithmic Trading : Hidden Markov Models on Foreign Exchange Data

Idvall, Patrik, Jonsson, Conny January 2008 (has links)
In this master's thesis, hidden Markov models (HMMs) are evaluated as a tool for forecasting movements in a currency cross. With an ever increasing electronic market making way for more automated, or so-called algorithmic, trading, there is a constant need for new trading strategies that try to find alpha, the excess return, in the market. HMMs are based on the well-known theory of Markov chains, but the states are assumed hidden, governing some observable output. HMMs have mainly been used for speech recognition and communication systems, but have lately also been applied to financial time series with encouraging results. Both discrete and continuous versions of the model are tested, as well as single- and multivariate input data. In addition to the basic framework, two extensions are implemented in the belief that they will further improve the prediction capabilities of the HMM. The first is a Gaussian mixture model (GMM), where each state is assigned a set of single Gaussians that are weighted together to replicate the density function of the stochastic process. This opens up for modelling the non-normal distributions often assumed for foreign exchange data. The second is an exponentially weighted expectation maximization (EWEM) algorithm, which takes time attenuation into consideration when re-estimating the parameters of the model. This allows old trends to be kept in mind while more recent patterns are at the same time given more attention. Empirical results show that the HMM using continuous emission probabilities can, for some model settings, generate acceptable returns with Sharpe ratios well over one, whilst the discrete version in general performs poorly. The GMM therefore seems to be a highly needed complement to the HMM. The EWEM, however, does not improve results as one might have expected. 
Our general impression is that the predictor using HMMs that we have developed and tested is too unstable to be adopted as a trading tool on foreign exchange data, with too many factors influencing the results. More research and development is called for.
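The basic HMM machinery the thesis builds on can be illustrated with the forward algorithm for the discrete-emission case. This is a textbook sketch, not the thesis's implementation: states are indexed implicitly, pi is the initial state distribution, A the transition matrix, and B the per-state emission probabilities.

```python
def forward_likelihood(obs, pi, A, B):
    """Forward algorithm: likelihood of an observation sequence
    under a discrete-emission HMM. alpha[s] holds the probability
    of the observations so far AND being in state s now."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        # propagate through transitions, then weight by emission
        alpha = [B[s][o] * sum(alpha[r] * A[r][s] for r in range(n))
                 for s in range(n)]
    return sum(alpha)
```

Summing the hidden state out at every step is what makes the likelihood computable in time linear in the sequence length; real forecasting implementations additionally rescale alpha at each step to avoid numerical underflow on long series.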
