  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

Investiční modely v prostředí finančních trhů / The Investment Models in an Environment of Financial Markets

Bezděk, Petr January 2016 (has links)
This master's thesis deals with the creation of an automated trading system to be applied to a real trading account, mainly on the currency-pair markets. The thesis is divided into several parts. The theoretical part introduces the problem of trading on financial markets. The following part analyses the needs of a small trader on the financial markets and selects suitable instruments to be used in the automated trading system. The design part then builds the automated trading system itself, which is applied to a broker's demo account and tested mainly on historical data. Based on the test results the system is optimized and, if the results are usable, deployed on a real trading account with a trading company.
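The abstract does not disclose the concrete trading rules of the system, so the sketch below is only a hedged illustration of the kind of rule such a system typically automates and backtests on historical data: a moving-average crossover on synthetic currency-pair prices. The window lengths, price series and helper names are assumptions, not the author's design.

```python
import numpy as np
import pandas as pd

def ma_crossover_signals(close: pd.Series, fast: int = 20, slow: int = 50) -> pd.Series:
    """Return +1 (long) / -1 (short) positions from a moving-average crossover rule."""
    fast_ma = close.rolling(fast).mean()
    slow_ma = close.rolling(slow).mean()
    # Go long when the fast MA is above the slow MA, short otherwise.
    position = np.where(fast_ma > slow_ma, 1.0, -1.0)
    return pd.Series(position, index=close.index).where(slow_ma.notna())

def backtest(close: pd.Series, position: pd.Series) -> pd.Series:
    # Daily strategy returns: yesterday's position times today's price change.
    returns = close.pct_change()
    return (position.shift(1) * returns).fillna(0.0)

# Example on synthetic EUR/USD-like data (a placeholder for real broker history).
prices = pd.Series(1.10 + 0.01 * np.cumsum(np.random.randn(500)),
                   index=pd.date_range("2015-01-01", periods=500, freq="D"))
signals = ma_crossover_signals(prices)
equity = (1.0 + backtest(prices, signals)).cumprod()
print(f"Final equity multiple: {equity.iloc[-1]:.3f}")
```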
92

Store attraction management : an application of the MCI model in Vietnam. / La gestion de l’attraction des points de vente : une application du modèle MCI au Vietnam

Pham, Ngoc Duc 13 October 2014 (has links)
Although Vietnam has dropped out of the world's top 30 countries in terms of retail attractiveness, the Vietnamese retail market is still considered to hold great potential for foreign retailers to exploit. Many Vietnamese retailers recognize this as a threat and are doing their best to establish their brands in the country. In this difficult situation, however, the formation and development of local supermarkets has been largely spontaneous (Nhieu, 2006). The managers of local supermarkets do not carefully study consumer patronage behavior or the competition from foreign supermarkets. These problems motivate us to build a support system as a retailing management tool. With this support system, retail managers can not only gain insight into their consumers' behavior by managing store attraction, but also take the competition from their rivals into account. The primary objective is to identify the dimensions of store attraction and to develop predictive models that explain the importance of store-attribute variables in predicting store attraction through store patronage. Although the choice of retail outlet by consumers has been the subject of a considerable amount of research, there is still considerable debate on the significance and direction of the determinants of store choice. We investigate this controversial issue in the thesis, with the aim of providing a more comprehensive view of retail patronage. A further objective of our research is to study consumers' store-choice behavior and the retailers' commercial policy through promotion. The theory of retailers' promotional policy leads us to a global approach: we consider all elements of the retailers' commercial policy, such as product quality, price level and, of course, promotion. In addition, we take into account the competition from rivals within the geographical areas studied.
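As a hedged illustration of the MCI (Multiplicative Competitive Interaction) model named in the title, under its common form a store's patronage probability is a product of its attributes raised to estimated elasticities, normalised over competing stores. The Python sketch below uses made-up store attributes and coefficients, not anything estimated in the thesis.

```python
import numpy as np

# Illustrative MCI sketch: the attributes, their values and the beta
# coefficients below are placeholders, not estimates from the thesis.
attributes = np.array([
    # [floor area (1000 m2), price index, distance (km)]
    [4.0, 0.95, 2.0],   # store 1
    [2.5, 0.90, 1.0],   # store 2
    [6.0, 1.05, 4.5],   # store 3
])
betas = np.array([0.8, -2.0, -1.5])  # positive elasticities attract, negative repel

# Attraction of each store: product over attributes of attribute^beta.
attraction = np.prod(attributes ** betas, axis=1)
# MCI patronage probability: attraction normalised over all competitors.
patronage_prob = attraction / attraction.sum()
print(np.round(patronage_prob, 3))
```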
93

Parabolische Randanfangswertaufgaben mit zufälliger Anfangs- und Randbedingung / Parabolic initial-boundary value problems with random initial and boundary conditions

Kandler, Anne 20 December 2006 (has links)
This thesis deals with the problem of random heat propagation in bounded domains. The phenomenon is described by a linear parabolic initial-boundary value problem in which the initial condition and the Neumann boundary condition are assumed to be random fields with a given probability distribution. The random fields are furthermore assumed to be homogeneous and epsilon-correlated with a small correlation length epsilon > 0 and to have smooth realizations. To solve the initial-boundary value problem, both the classical formulation and the variational formulation are used, and in this context the Fourier method and the finite element method are considered. Both the finite element method and the Fourier method lead to an explicit functional relationship between the random solution of the initial-boundary value problem and the random inputs, so that moment functions can be derived from it. The main interest of this thesis lies in the computation of these moment functions, which are determined by the chosen properties of the stochastic inputs. Based on the finite element and the Fourier approach, various approximation possibilities, in particular for the correlation function, are discussed. In addition, the simulation of the random initial-boundary value problem is considered; here the random inputs are simulated using the theory of moving average fields. The last part of the thesis is devoted to comparing the analytical results obtained by means of concrete numerical examples.
94

Stochastische Charakteristiken von Lösungen parabolischer Randanfangswertprobleme mit zufälligen Koeffizienten / Stochastic characteristics of solutions of parabolic initial-boundary value problems with random coefficients

Hähnel, Holger 28 April 2010 (has links)
This work focuses on the stochastic behavior of solutions of parabolic initial-boundary value problems with random coefficients. Problems of this kind arise, for example, when modeling heat conduction processes in materials whose thermal conductivity can be regarded as a random variable or a random function. The stochastic influences are modeled, among other approaches, by epsilon-correlated functions. To approximately determine stochastic characteristics such as the expectation, correlation and variance functions of the problem's solution, the finite element method (FEM), the Fourier method and Monte Carlo simulation are chosen. The first two methods are combined with perturbation techniques, leading to expansions of the characteristics up to second order with respect to a perturbation parameter. Concrete results are obtained for simple one- and two-dimensional domains, and the applicability of the perturbation approach is justified analytically for the FEM-based solution. The Monte Carlo simulation approximates the input random functions by moving-average fields. Explicit formulas are given for evaluating the integrals that appear when applying the FEM. The work ends with numerical examples for the one- and two-dimensional case, a graphical presentation of the results and a comparison of the different methods.
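As a hedged sketch of the moving-average-field idea used for the simulations in both of the preceding theses, the snippet below generates a homogeneous, epsilon-correlated 1D random field by convolving i.i.d. Gaussian noise with a compactly supported kernel; the grid size, correlation length and kernel shape are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def moving_average_field(n_points: int, dx: float, eps: float, n_samples: int):
    """Simulate homogeneous, epsilon-correlated 1D random fields as moving
    averages of i.i.d. Gaussian noise (kernel support = correlation length eps)."""
    half_width = int(round(eps / (2 * dx)))
    kernel = np.ones(2 * half_width + 1)
    kernel /= np.linalg.norm(kernel)          # unit variance of the resulting field
    noise = rng.standard_normal((n_samples, n_points + len(kernel) - 1))
    return np.array([np.convolve(row, kernel, mode="valid") for row in noise])

# Fields on a grid of 200 points with spacing 0.01 and correlation length 0.1.
fields = moving_average_field(n_points=200, dx=0.01, eps=0.1, n_samples=2000)
# Empirical correlation between points closer / farther than eps apart:
print(np.corrcoef(fields[:, 100], fields[:, 105])[0, 1])   # within eps: correlated
print(np.corrcoef(fields[:, 100], fields[:, 140])[0, 1])   # beyond eps: near zero
```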
95

Virtual Power Plant Simulation and Control Scheme Design

Chen, Zhenwei January 2012 (has links)
A Virtual Power Plant (VPP) is a concept that aggregates Distributed Energy Resources (DER), aiming to overcome the capacity limits of a single DER and the intermittent nature of renewable energy sources such as wind and solar. From the system's point of view, the whole aggregation can be treated as a single large-capacity power plant. In this project, a literature review of the VPP concept, architecture and existing projects, together with a survey of VPPs in Sweden, is conducted first. Secondly, a simplified VPP model is built in MATLAB/Simulink. The simplified system contains a wind farm, a hydro power plant, a dynamic system load and an infinite bus representing the large transmission grid. During simulation, the generation and consumption units run according to real historical data stored in an external database. Thirdly, optimized control schemes for the hydro unit, intended to decrease its effect on the transmission grid, are implemented in the Simulink model; at the same time, the hydro turbine should be controlled without large turbulence. Basically, the hydro power plant is responsible for balancing the active power between the wind farm and the dynamic load. Since the hydro turbine output is limited, the remaining power shortage or surplus must be compensated by the grid. This is the fundamental control scheme, the so-called run-time control scheme. The advanced control schemes are based on a moving average control method and a forecast compensation control method; the latter uses 24-hour-ahead load forecasts generated by an artificial neural network. An analysis of these three control schemes is then presented. The last part of the project compares their control results and draws conclusions about the different schemes.
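A minimal Python sketch of the run-time balancing idea described above, with moving-average smoothing of the hydro setpoint: hydro covers load minus wind within its turbine limits and the grid absorbs the remainder. The capacities and profiles are made-up placeholders standing in for the Simulink model and the historical database.

```python
import numpy as np

def run_time_control(load, wind, hydro_min=5.0, hydro_max=45.0, smooth_window=4):
    """Toy dispatch sketch: hydro balances load minus wind within its limits,
    the grid covers the remainder; a moving average smooths the hydro setpoint."""
    residual = load - wind                      # power the VPP must still supply (MW)
    hydro_raw = np.clip(residual, hydro_min, hydro_max)
    # Moving-average control: smooth the hydro setpoint to limit turbine turbulence.
    kernel = np.ones(smooth_window) / smooth_window
    hydro = np.clip(np.convolve(hydro_raw, kernel, mode="same"), hydro_min, hydro_max)
    grid = residual - hydro                     # shortfall (+) or surplus (-) taken by the grid
    return hydro, grid

# Hourly example with made-up profiles (MW) over one day.
hours = np.arange(24)
load = 40 + 10 * np.sin(2 * np.pi * (hours - 8) / 24)
wind = np.clip(15 + 8 * np.random.default_rng(1).standard_normal(24), 0, None)
hydro, grid = run_time_control(load, wind)
print(np.round(hydro, 1))
print(np.round(grid, 1))
```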
96

Univariate parametric and nonparametric double generally weighted moving average control charts

Masoumi Karakani, Hossein January 2020 (has links)
Statistical process control (SPC) is a collection of scientific tools developed to diagnose unnecessary variation in the output of a production process and to eliminate it, or perhaps accommodate it, by adjusting process settings. The task of quality control (QC) is of fundamental importance in manufacturing processes: when a change in the process causes misleading results, the alteration should be detected and corrected as soon as possible. Statistical QC charts, originated by Dr. W. A. Shewhart in the late 1920s, provide a powerful tool for monitoring production lines in manufacturing industries. They have also been implemented in various other disciplines, such as sequential monitoring of internet traffic flows and health care systems. Shewhart-type charts are effective in detecting large shifts in the process but ineffective in detecting small to moderate shifts. This blind spot allows small shifts (smaller than one standard deviation) to continue undetected in the process, thereby incurring larger total costs for manufacturers. This thesis addresses the issue by augmenting current time-weighted charts (charts that use all the information from the start of a process until the most recent sample/observation) with a Double Generally Weighted Moving Average (DGWMA) chart, leading to more effective process monitoring. The objective of this thesis is to provide the fundamentals and introduce the researcher/practitioner to the essentials of the univariate DGWMA chart from both parametric and nonparametric perspectives. Numerous concepts and characteristics of the proposed DGWMA charts are discussed comprehensively, and theoretical expressions and detailed calculations are provided to help the interested reader study the topic more thoroughly. The thesis paints a bigger picture of the DGWMA chart in the sense that other time-weighted charts, such as the Generally Weighted Moving Average (GWMA), Exponentially Weighted Moving Average (EWMA), Double Exponentially Weighted Moving Average (DEWMA) and Cumulative Sum (CUSUM) charts, fall under this umbrella. Both real-life data and simulated examples are embedded throughout the thesis. The R and Mathematica software packages are used to calculate numerical results related to the run length distribution and its associated characteristics. Only control charts for monitoring the process location parameter are considered; however, the conclusions and recommendations extend to the process dispersion parameter. The DGWMA chart is treated as the main chart, with the EWMA, DEWMA, GWMA and CUSUM charts as special cases. The thesis consists of the following chapters, with a short description of each. Chapter 1 provides a brief introduction to SPC concepts and gives a literature review as background for the research conducted in this thesis; the scope and objectives of the research are highlighted in detail. Chapter 2 provides an overview and a theoretical background on the design and implementation of the DGWMA chart derived from the SPC literature. The properties of the DGWMA chart, including the plotting statistic, the structure of the weights and the control limits (exact/steady-state), are considered in detail. The weighting structure of the DGWMA chart and its special cases is discussed and illustrated to emphasize the impact of the weights on the detection capability of time-weighted charts.
Three approaches are described and investigated for calculating the run length distribution and its associated characteristics for the DGWMA chart and its special case, the DEWMA chart: (i) an exact approach; (ii) a Markov chain approach; (iii) Monte Carlo simulation. In Chapter 3 we develop a one-sided generalized parametric chart (denoted DGWMA-TBE) for monitoring the time between events (TBE) of nonconforming items originating from high-yield processes when the underlying process distribution is gamma and the parameter of interest is known (Case K) or unknown (Case U). A Markov chain approach is implemented to derive the run length distribution and its associated characteristics for the DGWMA and DEWMA charts, and an exact approach is used to derive closed-form expressions for the run length distribution of the proposed chart. A performance analysis provides a comparative study with several existing time-weighted charts. The proposed chart encompasses the one-sided GWMA-TBE, EWMA-TBE, DEWMA-TBE and Shewhart-type charts as limiting or special cases; the CUSUM-TBE chart is also included in the comparison. The necessary design parameters are provided to aid the implementation of the proposed chart and to find the optimal and near-optimal designs useful for practitioners. Alternative discrete distributions are considered for the weights of the GWMA-TBE chart, and the connection between the weights originating from the suggested distributions and the chart's capability of detecting shifts is discussed. As a result, one can design an optimal GWMA-TBE chart by using weights from the discrete Weibull distribution without implementing the double exponential smoothing technique. Chapter 4 focuses on developing a two-sided nonparametric (distribution-free) DGWMA control chart based on the exceedance (EX) statistic, denoted DGWMA-EX, when the parameter of interest is unknown (Case U) and the underlying process distribution is continuous and symmetric. An exact approach and a Markov chain approach are considered to calculate the run length distribution and its associated characteristics for the proposed chart. A performance comparison is undertaken with other nonparametric time-weighted charts available in the SPC literature. The proposed chart encompasses the two-sided GWMA-EX, EWMA-EX, DEWMA-EX and Shewhart-type charts as limiting or special cases; the CUSUM-EX chart is also included in the comparison. The performance of the proposed DGWMA-EX chart is also evaluated under different symmetric and skewed distributions in comparison with its main counterparts, and results and recommendations are provided for practitioners to design an optimal chart. Chapter 5 concludes the thesis with a summary of the research conducted and provides remarks concerning future research opportunities. / Thesis (PhD (Mathematical Statistics))--University of Pretoria, 2020. / This research was supported in part by the National Research Foundation (NRF) under Grant Number 71199 and a postgraduate research bursary from the University of Pretoria. Any findings, opinions, and conclusions expressed in this thesis are those of the author and do not necessarily reflect the views of these parties. / Statistics / PhD (Mathematical Statistics) / Unrestricted
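As a hedged sketch of the double smoothing that defines the DEWMA chart (a special case of the DGWMA family discussed above), the snippet below applies the EWMA recursion z_t = λx_t + (1−λ)z_{t−1} twice and flags a small injected shift. The smoothing constant, the simulation-based estimate of the statistic's spread and the multiplier L are placeholder choices, not the designs tabulated in the thesis.

```python
import numpy as np

def ewma(x, lam=0.1, z0=0.0):
    """EWMA recursion: z_t = lam*x_t + (1-lam)*z_{t-1}."""
    z = np.empty_like(x, dtype=float)
    prev = z0
    for t, xt in enumerate(x):
        prev = lam * xt + (1 - lam) * prev
        z[t] = prev
    return z

def dewma(x, lam=0.1, z0=0.0):
    """Double EWMA: smooth the EWMA sequence a second time."""
    return ewma(ewma(x, lam, z0), lam, z0)

# In-control N(0,1) data with a 0.5-sigma mean shift injected at t = 100.
rng = np.random.default_rng(42)
x = rng.standard_normal(200)
x[100:] += 0.5
stat = dewma(x, lam=0.1)

# Placeholder limit L * sigma_stat; sigma_stat is estimated by simulation here,
# and L would normally be tuned so the in-control ARL hits a target such as 370.
sims = np.array([dewma(rng.standard_normal(200), lam=0.1)[-1] for _ in range(2000)])
limit = 2.7 * sims.std()
first = np.argmax(np.abs(stat) > limit)
print("first signal at t =", int(first) if np.abs(stat[first]) > limit else None)
```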
97

[en] MOVING AVERAGE REVERSION IN THE BRAZILIAN STOCK MARKET: A TECHNICAL ANALYSIS APPROACH UNDER THE OPTICS OF BEHAVIORAL FINANCE / [pt] REVERSÃO À MÉDIA MÓVEL DE CURTÍSSIMO PRAZO NO MERCADO ACIONÁRIO BRASILEIRO: ABORDAGEM DA ANÁLISE TÉCNICA SOB A ÓTICA DAS FINANÇAS COMPORTAMENTAIS

THIAGO JOSE STRECK DEL GRANDE 08 September 2016 (has links)
[en] The goal of this study is to investigate the possibility of obtaining abnormal returns in the Brazilian stock market, using the period between January 2005 and December 2014 as the sample. The main hypothesis examined is reversion to the 21-day moving average for the securities of the Índice Brasil 100 (IBrX-100). Contrarian strategies were built and tested for the period, with portfolios long in stocks whose prices were below the moving average and short in stocks whose prices were above it. No evidence was found in favor of reversion to the 21-day moving average, or of the possibility of abnormal returns, in the period studied.
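A hedged Python sketch of the contrarian rule tested in the dissertation — long stocks below their 21-day moving average, short those above — run here on synthetic prices standing in for the IBrX-100 constituents; the weighting scheme, universe and data are illustrative assumptions only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
# Synthetic daily prices for a handful of tickers standing in for IBrX-100 members.
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(0.01 * rng.standard_normal((750, 5)), axis=0)),
    columns=["A", "B", "C", "D", "E"],
)

ma21 = prices.rolling(21).mean()
# Contrarian weights: long stocks below their 21-day MA, short stocks above it.
raw = np.sign(ma21 - prices)                      # +1 below the MA, -1 above
weights = raw.div(raw.abs().sum(axis=1), axis=0)  # equal-weight long/short book

daily_ret = prices.pct_change()
strategy_ret = (weights.shift(1) * daily_ret).sum(axis=1)
print(f"annualised mean return: {252 * strategy_ret.mean():.2%}")
```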
98

Portfolio Performance Optimization Using Multivariate Time Series Volatilities Processed With Deep Layering LSTM Neurons and Markowitz / Portföljprestanda optimering genom multivariata tidsseriers volatiliteter processade genom lager av LSTM neuroner och Markowitz

Andersson, Aron, Mirkhani, Shabnam January 2020 (has links)
The stock market is a non-linear field, but many of the best-known portfolio optimization algorithms are based on linear models. In recent years, the rapid development of machine learning has produced flexible models capable of complex pattern recognition. In this paper, we propose two different methods of portfolio optimization: one based on the development of a multivariate time-dependent neural network, the long short-term memory (LSTM), capable of finding long- and short-term price trends; the other based on the linear Markowitz model, where we add an exponential moving average to the input price data to capture underlying trends. The input data to our neural network are daily prices, volumes and market indicators such as the volatility index (VIX). The output variables are the prices predicted for each asset the following day, which are then further processed to produce metrics such as expected returns, volatilities and prediction error, used to design a portfolio allocation that optimizes a custom utility function like the Sharpe ratio. The LSTM model produced a portfolio with a return and risk close to the actual market conditions for the date in question, but with a high error value, indicating that our LSTM model is insufficient as a sole forecasting tool. However, its ability to predict upward and downward trends was somewhat better than expected, and we therefore conclude that multiple neural networks could be used as indicators, each responsible for some specific aspect of what is to be analysed, with a conclusion drawn from the combined result. The findings also suggest that the input data should be chosen more carefully, as the prediction accuracy depends on the choice of variables and the external information used for training.
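As a hedged sketch of the Markowitz branch described above — an exponential moving average applied to the input prices before a mean-variance allocation — the snippet below uses synthetic prices and an unconstrained tangency-portfolio formula. The EMA span, the assets and the absence of constraints are assumptions, not the authors' setup.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(0.01 * rng.standard_normal((500, 4)), axis=0)),
    columns=["AssetA", "AssetB", "AssetC", "AssetD"],
)

# Exponential moving average of the input prices, as in the Markowitz variant above.
smoothed = prices.ewm(span=20).mean()
rets = smoothed.pct_change().dropna()

mu = rets.mean().to_numpy()                       # expected daily returns
sigma = np.cov(rets.to_numpy(), rowvar=False)     # sample covariance matrix

# Unconstrained maximum-Sharpe (tangency) weights: w proportional to inv(Sigma) @ mu.
raw_w = np.linalg.solve(sigma, mu)
weights = raw_w / raw_w.sum()
sharpe = (weights @ mu) / np.sqrt(weights @ sigma @ weights)
print(dict(zip(prices.columns, np.round(weights, 3))), round(float(sharpe), 3))
```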
99

Efficient In-Database Maintenance of ARIMA Models

Rosenthal, Frank, Lehner, Wolfgang 25 January 2023 (has links)
Forecasting is an important analysis task, and there is a need to integrate time series models and estimation methods into database systems. The main issue is the computationally expensive maintenance of model parameters when new data is inserted. In this paper, we examine how an important class of time series models, the AutoRegressive Integrated Moving Average (ARIMA) models, can be maintained with respect to inserts. To this end, we propose a novel approach, on-demand estimation, for the efficient maintenance of maximum likelihood estimates from numerically implemented estimators. We present an extensive experimental evaluation on both real and synthetic data, which shows that our approach yields a substantial speedup while sacrificing only a limited amount of predictive accuracy.
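A hedged sketch of the maintenance idea, assuming the statsmodels ARIMA interface rather than the authors' in-database implementation: when new rows arrive, either extend the filter with the old parameter estimates or re-estimate on demand with the optimiser warm-started at those estimates. The order (1,1,1) and the synthetic series are placeholders.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
history = np.cumsum(rng.standard_normal(300))     # stand-in for a stored time series

# Initial (expensive) maximum-likelihood fit of an ARIMA(1,1,1) model.
result = ARIMA(history, order=(1, 1, 1)).fit()

# New rows are inserted into the table: two maintenance options.
new_rows = history[-1] + np.cumsum(rng.standard_normal(10))

# (a) Cheap update: keep the old parameter estimates, just extend the filter state.
appended = result.append(new_rows, refit=False)

# (b) On-demand re-estimation: refit, but warm-start the numerical estimator at
#     the old estimates so it converges in far fewer iterations.
refit = ARIMA(np.concatenate([history, new_rows]), order=(1, 1, 1)).fit(
    start_params=result.params
)
print(result.params.round(3))
print(refit.params.round(3), "after", int(appended.nobs), "observations")
```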
100

Multivariate EWMA Control Chart and Application to a Semiconductor Manufacturing Process

Huh, Ick 09 1900 (has links)
The multivariate cumulative sum (MCUSUM) and the multivariate exponentially weighted moving average (MEWMA) control charts are the two leading methods to monitor a multivariate process. This thesis focuses on the MEWMA control chart. Specifically, using the Markov chain method, we study in detail several aspects of the run length distribution for both the on- and off-target cases. Regarding the on-target run length analysis, we express the probability mass function of the run length distribution, the average run length (ARL), the variance of run length (VRL) and higher moments of the run length distribution in closed mathematical form. In previous studies of the off-target performance of the MEWMA control chart, the process mean shift was usually assumed to take place at the beginning of the process. We extend the classical off-target case and introduce a generalization of the probability mass function of the run length distribution, the ARL and the VRL; what Prabhu and Runger (1996) proposed can be derived from our new model. By evaluating the off-target ARL values for the MEWMA control chart, we determine the optimal smoothing parameters using the partition method, which provides an easy algorithm to find them, and study how they respond as the process mean shift time changes. We compare the ARL performance of the MEWMA control chart with that of the multivariate Shewhart control chart to see whether the MEWMA chart remains effective in detecting a small mean shift as the process mean shift time changes. To apply the model to semiconductor manufacturing processes, we use a bivariate normal distribution to generate sample data and compare the MEWMA control chart with the multivariate Shewhart control chart to evaluate how the MEWMA chart behaves when a delayed mean shift occurs. We also apply the variation transmission model introduced by Lawless et al. (1999) to the semiconductor manufacturing process and show an extension of the model to make our application more realistic. All the programming and calculations were done in R. / Master of Science (MS)
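A hedged Python sketch of the MEWMA statistic described above, T²_t = z_t' Σ_{z_t}^{-1} z_t with z_t = λx_t + (1−λ)z_{t−1} and the exact time-varying covariance, applied to a bivariate process with a delayed mean shift. The control limit h is only a placeholder; in practice it is chosen (for example via the Markov chain method used in the thesis) to achieve a target in-control ARL.

```python
import numpy as np

def mewma_statistics(x, sigma, lam=0.1):
    """MEWMA charting statistics T2_t = z_t' Cov(z_t)^{-1} z_t with
    z_t = lam*x_t + (1-lam)*z_{t-1}, using the exact time-varying covariance."""
    n, p = x.shape
    z = np.zeros(p)
    t2 = np.empty(n)
    for t in range(n):
        z = lam * x[t] + (1 - lam) * z
        cov_z = (lam * (1 - (1 - lam) ** (2 * (t + 1))) / (2 - lam)) * sigma
        t2[t] = z @ np.linalg.solve(cov_z, z)
    return t2

# Bivariate process (as in the semiconductor example): in control for 50 samples,
# then a delayed one-unit mean shift in the first variable.
rng = np.random.default_rng(1)
sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
x = rng.multivariate_normal([0.0, 0.0], sigma, size=100)
x[50:, 0] += 1.0

t2 = mewma_statistics(x, sigma, lam=0.1)
h = 8.66   # placeholder limit; tune via the Markov chain method for a target in-control ARL
print("first signal at sample", int(np.argmax(t2 > h)) + 1)
```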
