51 |
Η προσέγγιση της τεχνικής ανάλυσης στη σύγχρονη χρηματοοικονομική. Μία εφαρμογή στο Χρηματιστήριο Αθηνών / The technical analysis approach in modern finance. An application to the Athens Stock Exchange. Μενύχτα, Ολυμπία. 25 January 2012
This work examines the profitability of the moving average (MA) trading rule in the Athens Stock Exchange. Specifically, daily data on the FTSE 20 stock index are used over the period 2005 to 2011. According to the main results, buy returns do not differ from sell returns; consequently, excess profits cannot be generated by adopting this technical rule. It is observed, however, that the market has been in decline since 2008, so that in many cases a significant difference appears between the variance of buy returns and the variance of sell returns. Naturally, this has no effect on the finding of non-profitability. The results also show that the probability of profiting through technical analysis is minimized, or even reduced to zero, in the long run.
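The MA trading rule the thesis tests can be sketched in a few lines; the price series, window length, and buy/sell classification below are illustrative assumptions, not the FTSE 20 data or the exact rule variant used in the study.

```python
def moving_average_rule(prices, window):
    """Split next-day returns into 'buy' returns (close above its MA)
    and 'sell' returns (close below or at its MA)."""
    buy, sell = [], []
    for t in range(window, len(prices) - 1):
        ma = sum(prices[t - window:t]) / window   # MA of the previous `window` days
        ret = prices[t + 1] / prices[t] - 1.0     # next-day simple return
        (buy if prices[t] > ma else sell).append(ret)
    return buy, sell

# toy prices, not the FTSE 20 series
prices = [100, 101, 103, 102, 104, 103, 101, 99, 98, 100, 102, 104]
buy, sell = moving_average_rule(prices, window=3)
mean_buy = sum(buy) / len(buy)
mean_sell = sum(sell) / len(sell)
```

Comparing the mean (and, as the abstract notes, the variance) of the two return buckets is exactly the kind of test the study performs.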
|
52 |
Simulações de pesos espaciais para o modelo STARMA e aplicações / Simulations of spatial weights for the STARMA model and applications. Guilherme Biz. 01 August 2014
Spatio-temporal process modeling is of great importance for climatological data, since climate is subject to both temporal and spatial influence. The STARMA class of models, spatio-temporal autoregressive moving-average models, is well suited to such processes; however, there is no study in the literature on the best method for quantifying spatial dependence, and it is not known whether the choice of method matters for these models. This work therefore carries out a simulation study of the STAR model using different ways of obtaining the spatial weights. After the simulations, a STARIMA model is fitted to a dataset of monthly means of daily minimum temperatures collected in a mesoregion in the west of the state of Paraná. The work is organized as two papers, both carried out using the statistical software R. The first is the simulation study, which concludes that the method used to determine spatial dependence affects the modeling results and depends on the region under study. The second paper concludes that inverse distance is the best option for the weight matrix and that a seasonal STARIMA model gives the best fit for the dataset in question.
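The inverse-distance weighting the second paper favors can be sketched as follows; the station coordinates are hypothetical, and row normalization is one common convention rather than the thesis's exact construction.

```python
import math

def inverse_distance_weights(coords):
    """Row-normalized inverse-distance weight matrix:
    w_ij = (1 / d_ij) / sum_k (1 / d_ik), with w_ii = 0."""
    n = len(coords)
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        inv = [1.0 / math.dist(coords[i], coords[j]) if j != i else 0.0
               for j in range(n)]
        s = sum(inv)
        for j in range(n):
            W[i][j] = inv[j] / s
    return W

# hypothetical station coordinates (e.g. projected easting/northing)
stations = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
W = inverse_distance_weights(stations)
```

Each row sums to one, and nearer stations receive larger weights, which is what makes this matrix a plug-in choice for the spatial lag terms of a STAR/STARIMA model.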
|
53 |
Fundamentální a technická analýza akcie Telefonica O2 Czech Republic, a. s. / Fundamental and technical analysis of the Telefonica O2 Czech Republic, a.s. share. Kálal, Tomáš. January 2009
The first part of this thesis, "Fundamental and technical analysis of the Telefonica O2 Czech Republic, a.s. equity", takes a theoretical approach, characterizing the company Telefonica O2 and its competitors at both the national and regional level. This description should give the reader a better understanding of the telecommunications sector. The second part is an empirical study, first from the fundamental perspective and then from the technical one. These two parts aim to arrive at a "buy" or "sell" recommendation for a real investor. Each method is first described, and then a brief comment on the results is given.
|
54 |
Technická analýza / Technical Analysis. Záděra, David. January 2013
This thesis deals with trading using technical analysis. Attention is mainly paid to shares traded on the Prague Stock Exchange. The practical part describes a computer program that gives recommendations for the purchase and sale of shares based on moving averages, the moving average convergence/divergence (MACD) method, and the relative strength index (RSI). The conclusion presents a financial comparison of the methods.
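The two indicators the program combines with moving averages can be sketched as follows; the parameter defaults (12/26/9 and 14) are the conventional choices, and the program's exact recommendation logic is not reproduced here.

```python
def ema(values, span):
    """Exponential moving average with smoothing factor 2 / (span + 1)."""
    alpha = 2.0 / (span + 1)
    out = [values[0]]
    for v in values[1:]:
        out.append(alpha * v + (1 - alpha) * out[-1])
    return out

def macd(prices, fast=12, slow=26, signal=9):
    """MACD line = EMA(fast) - EMA(slow); signal line = EMA of the MACD."""
    line = [f - s for f, s in zip(ema(prices, fast), ema(prices, slow))]
    return line, ema(line, signal)

def rsi(prices, period=14):
    """Relative strength index over the last `period` price changes."""
    changes = [b - a for a, b in zip(prices, prices[1:])]
    gains = sum(c for c in changes[-period:] if c > 0)
    losses = -sum(c for c in changes[-period:] if c < 0)
    if losses == 0:
        return 100.0
    return 100.0 - 100.0 / (1.0 + gains / losses)
```

A typical rule of the kind the thesis describes buys when the MACD line crosses above its signal line, or when RSI falls below an oversold threshold such as 30.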
|
55 |
Moving-Average approximations of random epsilon-correlated processes. Kandler, Anne; Richter, Matthias; vom Scheidt, Jürgen; Starkloff, Hans-Jörg; Wunderlich, Ralf. 31 August 2004
The paper considers approximations of time-continuous epsilon-correlated random processes by interpolation of time-discrete Moving-Average processes. These approximations are helpful for Monte Carlo simulations of the response of systems containing random parameters described by epsilon-correlated processes. The paper focuses on the approximation of stationary epsilon-correlated processes with a prescribed correlation function. Numerical results are presented.
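A time-discrete Moving-Average process is epsilon-correlated in the sense used above: its correlation function vanishes beyond a finite lag. A minimal sketch, with hypothetical MA weights and unit noise variance rather than the paper's construction:

```python
import random

def ma_process(coeffs, n, seed=0):
    """Simulate X_t = sum_k b_k * e_{t-k} with i.i.d. N(0, 1) noise e_t.
    The result is epsilon-correlated: dependence dies out beyond
    lag q = len(coeffs) - 1."""
    rng = random.Random(seed)
    q = len(coeffs) - 1
    e = [rng.gauss(0.0, 1.0) for _ in range(n + q)]
    return [sum(b * e[t + q - k] for k, b in enumerate(coeffs))
            for t in range(n)]

def ma_autocovariance(coeffs, lag):
    """Exact autocovariance gamma(h) = sum_k b_k * b_{k+h};
    identically zero for every lag beyond the MA order."""
    if lag >= len(coeffs):
        return 0.0
    return sum(b * coeffs[k + lag] for k, b in enumerate(coeffs[:len(coeffs) - lag]))

coeffs = [0.5, 0.3, 0.2]          # hypothetical MA(2) weights
x = ma_process(coeffs, n=1000)
```

Interpolating such a time-discrete sequence yields a time-continuous approximation of the kind used in the Monte Carlo simulations the abstract describes.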
|
56 |
Lösung parabolischer Differentialgleichungen mit zufälligen Randbedingungen mittels FEM / Solution of parabolic differential equations with random boundary conditions using FEM. Kandler, Anne; vom Scheidt, Jürgen; Unger, Roman. 31 August 2004
In this work, stochastic characteristics of the solution of parabolic differential equations with random Neumann boundary conditions are derived using the finite element method. Particular attention is paid to the computation of the correlation and variance functions. By applying FEM techniques, the stochastic initial-boundary value problem is approximated by a system of ordinary differential equations with stochastic inhomogeneous terms. Modeling the stochastic input parameters as epsilon-correlated fields allows expansions of the solution characteristics in terms of the correlation length. Numerical examples include a comparison between analytical results and simulation results.
|
57 |
Market Risk: Exponential Weighting in the Value-at-Risk Calculation. Broll, Udo; Förster, Andreas; Siebe, Wilfried. 03 September 2020
When measuring market risk, credit institutions and Alternative Investment Fund Managers may deviate from equally weighting historical data in their Value-at-Risk (VaR) calculation and instead use an exponential time-series weighting. Exponential weighting is very popular in the VaR calculation because it takes changes in market volatility into account immediately, so the VaR can adapt quickly. In less volatile market phases, this leads to a reduction in VaR and thus to lower own-funds requirements for credit institutions. Under exponential weighting, however, high volatility in the past is quickly forgotten and the VaR may be underestimated. To prevent this, credit institutions and Alternative Investment Fund Managers are not completely free in their choice of the weighting (decay) factor. This article describes the legal requirements and deals with the calculation of the permissible weighting factor. As an example, we use the exchange rate between the euro and the Polish zloty to estimate the Value-at-Risk, showing the calculation of the weighting factor with two different approaches. The article also discusses exceptions to the general legal requirements.
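The exponential weighting scheme described above can be sketched as follows; the decay factor 0.94 (the RiskMetrics convention) and the 99 % normal quantile are illustrative assumptions, not the legally permissible factor the article derives.

```python
import math

def ewma_var(returns, lam=0.94, z=2.326):
    """VaR from exponentially weighted volatility (RiskMetrics-style):
    sigma_t^2 ~ (1 - lam) * sum_i lam**i * r_{t-i}**2, newest return first.
    z is the normal quantile for the confidence level (2.326 ~ 99 %)."""
    weights = [(1 - lam) * lam ** i for i in range(len(returns))]
    # pair the largest weight with the most recent return
    var2 = sum(w * r * r for w, r in zip(weights, reversed(returns)))
    sigma = math.sqrt(var2 / sum(weights))  # renormalize the finite sum
    return z * sigma

flat = ewma_var([0.01] * 50)                  # constant volatility
recent_spike = ewma_var([0.0] * 49 + [0.05])  # shock yesterday
old_spike = ewma_var([0.05] + [0.0] * 49)     # shock 50 days ago
```

The last two calls illustrate the "quick forgetting" the article warns about: the same shock produces a much larger VaR when it is recent than when it lies far in the past.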
|
58 |
SARIMA Short to Medium-Term Forecasting and Stochastic Simulation of Streamflow, Water Levels and Sediments Time Series from the HYDAT Database. Stitou, Adnane. 28 October 2019
This study investigates short-to-medium-term forecasting and simulation of streamflow, water levels, and sediments in Canada using Seasonal Autoregressive Integrated Moving Average (SARIMA) time series models. The methodology can account for linear trends in the time series that may result from climate and environmental changes. A universal Canadian forecasting application with a Python web interface was developed to generate short-term forecasts using SARIMA. The Akaike information criterion was used as the performance criterion for selecting efficient SARIMA models, and the developed models were validated by analyzing the residuals. Several stations from the Canadian Hydrometric Database (HYDAT) displaying a linear upward or downward trend were identified to validate the methodology. Trends were detected using the Mann-Kendall test.
The Nash-Sutcliffe efficiency coefficients (Nash and Sutcliffe, 1970) of the developed models indicate that they are acceptable. The models can be used for short-term (1 to 7 days) and medium-term (7 days to six months) forecasting of streamflow, water levels, and sediments at all Canadian hydrometric stations. Such forecasts can support water resources management and help mitigate the effects of floods and droughts. The models can also be used to generate long time series for testing the performance of water resources systems.
Finally, we have automated the analysis, model-building, and forecasting of streamflow, water levels, and sediments by building an easily extendable and user-friendly Python-based application. Automating SARIMA calibration and forecasting for all Canadian stations in the HYDAT database should prove a very useful tool for decision-makers and other entities in the field of hydrology.
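The Nash-Sutcliffe efficiency used to judge the models above is a one-liner; the observed and simulated flows below are hypothetical values for illustration.

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1 is a perfect fit; 0 means no better than predicting the mean flow."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [3.0, 4.0, 6.0, 5.0, 7.0]              # hypothetical daily flows
perfect = nash_sutcliffe(obs, obs)           # identical forecast
baseline = nash_sutcliffe(obs, [5.0] * 5)    # forecasting the mean flow
```

Values above 0 indicate the model beats the climatological mean, which is the usual acceptability threshold for this coefficient.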
|
59 |
Prediction of traffic flow in cloud computing at a service provider. Sekwatlakwatla, Prince. 11 1900
M. Tech. (Department of Information Technology, Faculty of Applied and Computer Sciences) Vaal University of Technology. / Cloud computing provides improved and simplified IT management and maintenance capabilities through central administration of resources. Companies of all shapes and sizes are adapting to this new technology. Although cloud computing is an attractive concept to the business community, it still has some challenges such as traffic management and traffic prediction that need to be addressed. Most cloud service providers experience traffic congestion.
In the absence of effective tools for cloud computing traffic prediction, the allocation of resources to clients will be ineffective, driving away cloud computing users. This research aims to mitigate the effect of traffic congestion on the provision of cloud services by proposing a proactive traffic prediction model that plays an effective role in congestion control and in the estimation of accurate future resource demand, thereby enhancing the accuracy of traffic flow prediction by cloud service providers. The research compares the performance of Auto-Regressive Integrated Moving Average (ARIMA) and Artificial Neural Network (ANN) models as prediction tools for cloud computing traffic.
These two techniques were tested by using simulation to predict traffic flow per month and per year. The dataset was downloaded from the CAIDA database. The two algorithms, Auto-Regressive Integrated Moving Average (ARIMA) and Artificial Neural Networks (ANN), were implemented and tested separately, and experimental results were generated and analyzed to assess the effectiveness of the traffic prediction algorithms. The findings indicated that ARIMA achieved 98 % prediction accuracy while ANN achieved 89 %. It was also observed that both models perform better on monthly data than on yearly data. This study recommends the ARIMA algorithm for data flow prediction in private cloud computing.
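One simple way to express forecast accuracy as a percentage, in the spirit of the 98 % and 89 % figures above, is 100 % minus the mean absolute percentage error; this particular metric is an illustrative assumption, since the study does not state which accuracy measure it used.

```python
def prediction_accuracy(actual, predicted):
    """Accuracy as 100 % minus the mean absolute percentage error (MAPE)."""
    mape = sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)
    return 100.0 * (1.0 - mape)

# hypothetical traffic volumes, not the CAIDA data
acc_perfect = prediction_accuracy([100.0, 200.0], [100.0, 200.0])
acc_off = prediction_accuracy([100.0, 200.0], [90.0, 200.0])
```

Under this metric, two models can be ranked exactly as the study ranks ARIMA and ANN: the one with the smaller average relative error scores higher.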
|
60 |
Handelsstrategier baserade på glidande medelvärden: En studie i marknadens effektivitet / Trading strategies based on moving averages: A study of market efficiency. Brished, Gustav; Roos, Erik. January 2023
Finding the most effective strategy for maximizing returns in the stock market is a question that has interested investors for hundreds of years. This study examines which of the two investment strategies, Golden cross or Buy and hold, was more profitable on the Stockholm Stock Exchange over the period 2004 to 2022, in order to draw conclusions about the efficiency of the market. Measuring the returns of ten stocks from the large-cap list and ten stocks from the mid-cap list, the study found that Buy and hold gave a higher average return than the Golden cross strategy using 50- and 200-day moving averages. This supports the efficient market hypothesis, which states that excess returns cannot be obtained through technical analysis. The study does, however, find support for the view that the buy signal given at the Golden cross created large gains, and that it was rather a delayed sell signal, the death cross, that was the main reason Buy and hold outperformed the Golden cross strategy.
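The crossover signals compared above can be sketched as follows; the 50/200-day defaults are the study's parameters, while the toy price series and the tiny windows in the demonstration are illustrative only.

```python
def crossovers(prices, short=50, long=200):
    """Detect golden crosses (short MA crossing above the long MA, a buy
    signal) and death crosses (crossing below, a sell signal)."""
    def sma(t, w):
        return sum(prices[t - w + 1:t + 1]) / w
    golden, death = [], []
    for t in range(long, len(prices)):
        prev = sma(t - 1, short) - sma(t - 1, long)
        curr = sma(t, short) - sma(t, long)
        if prev <= 0 < curr:
            golden.append(t)
        elif prev >= 0 > curr:
            death.append(t)
    return golden, death

# toy series with tiny windows so the mechanics are visible
golden, death = crossovers([1, 1, 1, 5, 5, 1, 1, 1], short=2, long=3)
```

Note how the death cross (here at index 5) necessarily lags the price peak; this lag in the sell signal is exactly the delay the study identifies as the main reason Buy and hold came out ahead.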
|