61 |
Uma avaliação estatística da análise gráfica no mercado de ações brasileiro à luz da teoria dos mercados eficientes e das finanças comportamentais / An statistical evaluation of the technical analysis in the Brazilian stock market in the light of the efficient market hypothesis and the behavioral finance
Penteado, Marco Antonio de Barros, 27 August 2003
Starting from the principles established by the Efficient Market Hypothesis (EMH), which holds that technical analysis has no value for predicting future security prices, and considering the criticism of the EMH by advocates of so-called behavioral finance, among others, this study sought to detect a relationship between the chart signals observed day by day in the Brazilian stock market and the trends that follow them, over a period of eight years and for a number of securities. The results provide evidence of such a relationship, suggesting that technical analysis was a valid instrument for price forecasting in the Brazilian stock market during the period considered.
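The abstract does not spell out the statistical procedure used, but the kind of relationship it tests can be illustrated with a simple association test between chart signals and the direction of the subsequent trend. The sketch below is only an assumption about how such a test might look; the counts are entirely hypothetical.

```python
# Hedged sketch: testing whether chart signals and subsequent trends are related.
# The thesis does not disclose its exact procedure; this only illustrates one way
# to test such a relationship, with made-up counts.
from scipy.stats import chi2_contingency

# Rows: signal issued ("buy", "sell"); columns: subsequent trend ("up", "down").
# The counts below are hypothetical placeholders, not data from the study.
observed = [[48, 22],   # buy signals followed by up / down trends
            [19, 41]]   # sell signals followed by up / down trends

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p-value = {p_value:.4f}")
# A small p-value would indicate that signals and subsequent trends are not
# independent, i.e., the kind of relationship the study reports.
```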
|
62 |
Implementação de algoritmo metaheurístico simulated annealing para problema de seleção de contingência em análise de segurança de redes elétricas
Tomazi, Fausto Stefanello, 23 September 2016
Power systems play a key role in a nation's economy by providing quality, uninterrupted electricity to the population. To make this possible, large investments are made in the sector to guarantee supply. However, any piece of equipment is subject to failure, and analyzing how equipment failures affect supply is one of the tasks performed by control centers, called security analysis. Control centers are thus responsible for drawing up contingency plans so that, if a piece of equipment goes out of operation, the impact on the network is as small as possible. An important task within security analysis is contingency selection, which identifies the most critical equipment in the system so that prevention plans can be created in case that equipment goes out of operation. The large electrical systems in operation today comprise thousands of pieces of equipment, and a detailed analysis of each one is hard to carry out; it is in this scenario that contingency selection gains importance.
Contingency selection is responsible for searching for and ranking the most important constraints of the network, but for large networks with thousands of items, analyzing the impact of each item can take so long that the calculation cannot be performed during system operation. It is therefore necessary to perform contingency selection efficiently and effectively. This study develops a Simulated Annealing metaheuristic so that contingency selection can be executed within the time constraints imposed by the control centers. The experiments show that, after tuning the parameters for the problem instance considered, the results meet the control center constraints and are slightly better than results reported in the literature for the same problem solved with a Genetic Algorithm metaheuristic.
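As a rough illustration of the approach described above, the sketch below runs a simulated annealing search over subsets of contingencies. The severity scores, subset size, and cooling schedule are placeholders, not the formulation used in the thesis.

```python
# Minimal simulated annealing sketch for contingency selection.
# Assumptions (not from the thesis): the severity of each single-branch outage is
# summarized by a precomputed score, and the goal is to pick the k outages with
# the highest combined severity within a limited run time.
import math
import random

random.seed(0)

n_branches, k = 50, 8
severity = [random.random() for _ in range(n_branches)]  # placeholder severities

def cost(selection):
    # Lower cost = more severe set selected (we minimize the negative severity).
    return -sum(severity[i] for i in selection)

def neighbor(selection):
    # Swap one selected branch for an unselected one.
    new = set(selection)
    new.remove(random.choice(list(new)))
    new.add(random.choice([i for i in range(n_branches) if i not in new]))
    return new

current = set(random.sample(range(n_branches), k))
best, temperature = current, 1.0
while temperature > 1e-3:
    candidate = neighbor(current)
    delta = cost(candidate) - cost(current)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        current = candidate
        if cost(current) < cost(best):
            best = current
    temperature *= 0.95  # geometric cooling; the thesis tunes such parameters

print(sorted(best))
```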
|
63 |
Voluntary Disclosure of Non-Financial Key Performance Indicators during Earnings Releases
Phan, Lan, 01 January 2019
Almost two decades after the burst of the dot-com bubble, investors are divided over whether a new technology bubble has formed in the equities market. As in the late 1990s and early 2000s, many Internet firms today go through an initial public offering without having earned a dollar of profit, yet tout revenue-related performance metrics to investors as promises of future success. Investors are known to hold sentiments sensitive to earnings announcements (Seok, Cho & Ryu, 2019) and to reward firms that meet or beat earnings expectations with higher stock returns (Bartov, Givoly & Hayn, 2002). This raises a question about the content of earnings announcements: besides earnings and cash flow, are there other factors that may influence investors' decisions to trade some Internet stocks?
My primary hypothesis is that the voluntary disclosure of specific non-financial key performance indicators (NFKPI) during earnings announcements by Internet firms influences investors' trading decisions. My motivation for this research is to better understand whether there is a strategic element in the voluntary disclosure of NFKPI by Internet companies and how it may affect investors' decisions. The results could be useful to firms in evaluating whether to release NFKPI or similar information, and to equity research analysts as well as investors in forming their expectations and valuations of the firms' stocks. The intention of the study is not to generalize the findings to the full market, as companies that voluntarily disclose NFKPI are comparatively few; instead, this study examines the effects of NFKPI on the stock returns of the companies that choose to disclose it.
I use event study methodology to test the statistical significance of the disclosure of NFKPIs during earnings announcements. By controlling for earnings surprise and other meaningful financial ratios, I also examine how the signaling effect of NFKPI can be distinguished from the signaling effects of important information released concurrently during earnings announcements. I focus on two types of NFKPI within the Internet industry: Gross Bookings for online booking agency services and Daily Active Users for social media. Because earnings reports and quarterly filings do not necessarily come out on the same date, I hand-collected data to estimate the surprise effect of NFKPI per earnings announcement, using available broker forecasts of the respective NFKPI as a proxy for investors' NFKPI expectations.
The results show that while revenue surprise remains consistently the most influential variable for investors, NFKPI Surprise has a positive, statistically significant relationship with the firm's abnormal returns. Additionally, although it is insignificant when expected earnings are met or beaten, NFKPI Surprise is statistically significant, with a positive relationship to abnormal returns, when expected earnings are missed. In line with existing research on management's motivation to prevent negative earnings surprises (Matsumoto, 2002), these findings imply that firms could employ the voluntary disclosure of NFKPI to manage investors' impressions and to cushion their stock prices against potential negative market reactions when earnings are missed.
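A minimal sketch of the kind of event-study regression described above: announcement-window abnormal returns regressed on NFKPI surprise while controlling for earnings and revenue surprises. The variable names and simulated data are hypothetical stand-ins for the hand-collected sample.

```python
# Hedged sketch of the event-study regression: abnormal returns around the
# announcement regressed on NFKPI surprise, controlling for earnings and revenue
# surprises. Column names and data are hypothetical, not from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200  # placeholder number of earnings-announcement events
events = pd.DataFrame({
    "nfkpi_surprise":    rng.normal(0, 0.05, n),  # (actual - forecast) / forecast
    "earnings_surprise": rng.normal(0, 0.03, n),
    "revenue_surprise":  rng.normal(0, 0.04, n),
})
# Abnormal return = realized return minus a market-model expected return.
events["abnormal_return"] = (0.8 * events["revenue_surprise"]
                             + 0.3 * events["nfkpi_surprise"]
                             + rng.normal(0, 0.02, n))

X = sm.add_constant(events[["nfkpi_surprise", "earnings_surprise", "revenue_surprise"]])
model = sm.OLS(events["abnormal_return"], X).fit()
print(model.summary())
```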
|
64 |
Volatility Interruptions, idiosyncratic risk, and stock return
Alsunbul, Saad A., 23 May 2019
The objective of this paper is to examine the impact of implementing the static and dynamic volatility interruption rule on idiosyncratic volatility and stock returns on Nasdaq Stockholm. Using EGARCH and GARCH models to estimate conditional idiosyncratic volatility, we find that conditional idiosyncratic volatility and stock returns increase when stock prices hit the upper static or dynamic volatility interruption limits. Conversely, we find that conditional idiosyncratic volatility and stock returns decrease when stock prices hit the lower static or dynamic volatility interruption limits. We also find that conditional idiosyncratic volatility is higher when stock prices reach the upper dynamic limit than when they reach the upper static limit. Furthermore, we compare conditional idiosyncratic volatility and stock returns on limit-hit days with the days immediately before and after the limit-hit events and find that both are more volatile on limit-hit days. To test the volatility spill-over hypothesis, we examine a two-day window after limit-hit events and find no evidence of volatility spill-over one or two days after a limit-hit event, indicating that the static and dynamic volatility interruption rule is effective in curbing volatility. Finally, we sort stocks by size and find that small market-cap stocks gain higher returns than larger market-cap stocks upon reaching the upper limits, both static and dynamic.
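The estimation step described above can be illustrated roughly as follows: market-model residuals proxy the idiosyncratic return component, and an EGARCH(1,1) fitted with the arch package yields the conditional idiosyncratic volatility. The data and specification details below are placeholder assumptions, not the Nasdaq Stockholm sample.

```python
# Hedged sketch of estimating conditional idiosyncratic volatility with EGARCH:
# regress stock returns on market returns, then fit an EGARCH(1,1) to the
# residuals. Data below are simulated placeholders.
import numpy as np
import statsmodels.api as sm
from arch import arch_model

rng = np.random.default_rng(42)
market = rng.normal(0, 1.0, 1000)            # daily market returns, in percent
stock = 0.9 * market + rng.normal(0, 1.5, 1000)

# Market-model residuals proxy the idiosyncratic component of returns.
ols = sm.OLS(stock, sm.add_constant(market)).fit()
idio = ols.resid

egarch = arch_model(idio, vol="EGARCH", p=1, o=1, q=1, dist="normal")
result = egarch.fit(disp="off")
cond_idio_vol = result.conditional_volatility  # daily conditional idiosyncratic vol
print(cond_idio_vol[-5:])
```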
|
65 |
Finding Profitability of Technical Trading Rules in Emerging Market Exchange Traded Funds
Hallett, Austin P., 01 January 2012
This thesis further investigates the effectiveness of 15 variable moving average strategies that mimic the trading rules used in the study by Brock, Lakonishok, and LeBaron (1992). Rather than applying these strategies to developed markets, it targets emerging markets, whose unique characteristics offer investors opportunities that warrant further research. Before transaction costs, all 15 variable moving average strategies outperform the naïve benchmark strategy of buying and holding different emerging market ETFs over a volatile period of 858 trading days. However, the variable moving averages perform poorly in the "bubble" market cycle; in fact, sell signals become more unprofitable than buy signals are profitable. Furthermore, variations of four of the five variable moving average strategies show significant potential to deliver consistent abnormal returns after adjusting for transaction costs and risk.
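A hedged sketch of one variable moving average rule in the Brock, Lakonishok, and LeBaron style follows; the (1, 50, 0.01) parameters, the simulated price series, and the signal convention are illustrative assumptions rather than the exact rules tested in the thesis.

```python
# Hedged sketch of a variable moving average (VMA) rule: long when the short MA
# is above the long MA by more than a band, short when it is below. Prices are
# simulated placeholders, not the emerging-market ETF data used in the thesis.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0.0003, 0.015, 858))))

short_window, long_window, band = 1, 50, 0.01  # one (short, long, band) variant
short_ma = prices.rolling(short_window).mean()
long_ma = prices.rolling(long_window).mean()

signal = pd.Series(0, index=prices.index)
signal[short_ma > long_ma * (1 + band)] = 1    # buy signal
signal[short_ma < long_ma * (1 - band)] = -1   # sell signal

daily_returns = prices.pct_change()
strategy_returns = signal.shift(1) * daily_returns  # act on the next day's return
print("VMA rule mean daily return:", strategy_returns.mean())
print("Buy-and-hold mean daily return:", daily_returns.mean())
```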
|
66 |
Inter-Area Data Exchange Performance Evaluation and Complete Network Model Improvement
Su, Chun-Lien, 20 June 2001
A power system is typically one small part of a larger interconnected network and is affected, to a varying degree, by contingencies external to itself as well as by the reaction of the external network to its own contingencies. Thus, the accuracy of the complete interconnected network model affects the results of many transmission-level analyses. In an interconnected power system, real-time network security and power transfer capability analyses require a "real-time" complete network base case solution. In order to accurately assess system security and inter-area transfer capability, it is highly desirable that all available information from all areas be used. With the advent of communications among operations control center computers, real-time telemetered data can be exchanged for complete network modeling. Measurement time skew should be considered in complete network modeling when combining large-area data received via a data communication network.
In this dissertation, several suggestions aimed at improving complete network modeling are offered. A discrete event simulation technique is used to assess the performance of a data exchange scheme that uses an Internet interface to the SCADA system. A performance model of data exchange over the Internet is established, and a quantitative analysis of the data exchange delay is presented. With the prediction mechanisms, the effect of time skew among the data interchanged by utilities can be minimized, and consequently state estimation (SE) can provide accurate real-time complete network models of the interconnected network for security and available transfer capability analyses.
In order to accommodate the effects of the randomly varying arrival of measurement data and to set up a base case for more accurate analyses of network security and transfer capability, an implementation of a stochastic Extended Kalman Filter (EKF) algorithm is proposed to provide optimal estimates of interconnected network states for systems in which some or all measurements are delayed. To obtain an accurate state estimate of a complete network, it is essential to be able to detect bad data in the model. An efficient information debugging methodology based on the stochastic EKF algorithm is used for the detection, diagnosis, and elimination of bad data.
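The filtering idea behind the proposal can be illustrated with a toy predict/update loop in which measurements sometimes fail to arrive on time. The scalar linear Kalman filter below is only a stand-in for the stochastic EKF applied to the full nonlinear network model; all quantities are assumed values.

```python
# Hedged sketch: a predict/update cycle that keeps producing state estimates even
# when a measurement is late or missing. This toy uses a scalar linear Kalman
# filter; the dissertation's stochastic EKF operates on the nonlinear network
# model and handles delayed measurements more rigorously.
import numpy as np

rng = np.random.default_rng(3)

x_true, x_est, p_est = 1.0, 0.0, 1.0   # true state, estimate, estimate variance
q, r = 0.01, 0.04                      # process and measurement noise variances

for step in range(20):
    # The state drifts slowly (e.g., a slowly varying bus quantity).
    x_true += rng.normal(0, np.sqrt(q))

    # Predict.
    p_est += q

    # A measurement arrives only some of the time (delayed/missing otherwise).
    if rng.random() < 0.7:
        z = x_true + rng.normal(0, np.sqrt(r))
        k_gain = p_est / (p_est + r)        # Kalman gain
        x_est += k_gain * (z - x_est)
        p_est *= (1 - k_gain)
    # With no fresh measurement, the prediction alone carries the estimate.

    print(f"step {step:2d}  true={x_true:+.3f}  est={x_est:+.3f}  var={p_est:.3f}")
```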
|
67 |
A Current-Based Preventive Security-Constrained Optimal Power Flow by Particle Swarm Optimization
Zhong, Yi-Shun, 14 February 2008
An Equivalent Current Injection (ECI) based Preventive Security-Constrained Optimal Power Flow (PSCOPF) is presented in this thesis, and a particle swarm optimization (PSO) algorithm is developed for solving non-convex Optimal Power Flow (OPF) problems. The thesis integrates Simulated Annealing Particle Swarm Optimization (SAPSO) and Multiple Particle Swarm Optimization (MPSO), enabling a fast algorithm that finds the global optimum. The optimal power flow is solved with an Equivalent Current Injection based OPF (ECIOPF) algorithm. This OPF handles both continuous and discrete control variables and is therefore a mixed-integer optimal power flow (MIOPF). The continuous control variables modeled are the generator active power outputs and generator-bus voltage magnitudes, while the discrete ones are the shunt capacitor devices. The feasibility of the proposed method is demonstrated on the standard IEEE 30-bus system, and its solution quality is compared with that of other stochastic methods. Security analysis is also conducted: a ranking method is used to highlight the most severe event caused by a specific fault. A preventive algorithm makes use of the contingency information and keeps the system secure, avoiding violations when a fault occurs. Generators are used to adjust the line flows so that tripping the most severe line would not cause a major problem.
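A minimal sketch of the particle swarm optimization loop that drives such a solver is given below. The stand-in cost function, bounds, and coefficient values are assumptions; the real PSCOPF objective and constraints on the IEEE 30-bus system are far richer.

```python
# Hedged sketch of a basic PSO loop applied to a stand-in cost function.
# The actual objective (generation cost plus security constraints) is not shown.
import numpy as np

rng = np.random.default_rng(11)

def cost(x):
    # Placeholder for the OPF objective (e.g., generation cost with penalties).
    return np.sum((x - 0.3) ** 2, axis=1)

n_particles, n_dims, iters = 30, 6, 200
w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration coefficients

pos = rng.uniform(0.0, 1.0, (n_particles, n_dims))   # e.g., normalized control variables
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), cost(pos)
gbest = pbest[np.argmin(pbest_cost)]

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)         # respect simple variable bounds
    c = cost(pos)
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    gbest = pbest[np.argmin(pbest_cost)]

print("best cost:", pbest_cost.min(), "best controls:", np.round(gbest, 3))
```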
|
68 |
A new proposed method of contingency ranking
Gossman, Stephanie Mizzell, 18 May 2010
Security analysis of a power system requires a process called contingency analysis, which analyzes the results of all possible single contingencies (i.e., outages) in the system. Contingency analysis requires defining a parameter, called a performance index, that is used to monitor a certain aspect of the system. The performance index definitions used traditionally have been highly nonlinear, and in some cases the results have not accurately predicted the value of the performance index. These incorrect results are referred to as misrankings, since contingency results are usually placed in order of severity so that the most severe cases are evident.
This thesis considers a new definition for contingency ranking using a more linearized definition of the performance index. Both the newly proposed definition and the classic definition consider the current loading of circuits in the system relative to their rated values. Specifically, the proposed definition measures the difference between the two quantities, while the more nonlinear classic definition uses their ratio, raised to a higher power.
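The contrast between the two styles of index can be illustrated numerically. In the sketch below, the line currents, ratings, and exponent are made-up values, and the exact definitions in the thesis may weight or normalize the terms differently.

```python
# Hedged sketch contrasting the two styles of performance index described above,
# on made-up post-contingency line loadings. Only illustrates "ratio raised to a
# power" versus "difference from the rating"; not the thesis's exact formulas.
import numpy as np

current = np.array([420.0, 610.0, 180.0, 505.0])   # post-contingency line currents (A)
rating  = np.array([500.0, 600.0, 400.0, 500.0])   # line current ratings (A)

n = 2  # exponent of the classic, highly nonlinear index
pi_classic  = np.sum((current / rating) ** (2 * n))   # ratio-based, raised to a power
pi_proposed = np.sum(current - rating)                # difference-based, more linear

print(f"classic (nonlinear) PI: {pi_classic:.3f}")
print(f"proposed (linear)   PI: {pi_proposed:.3f}")
```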
A small, four-bus test system is used to demonstrate the benefits of the new, more linearized definition. The average percent error over all single-line contingencies of the system decreased by over 9.5% using the proposed definition compared with the previous one. This decrease in error allows the performance index to monitor a similar parameter (comparing the current loading of the lines with their current ratings) while achieving a higher degree of accuracy. Further linearization of the proposed definition reduces the average percent error by an additional 22%, so that, compared with the original, highly nonlinear definition, the average error is reduced by almost 30%. Linearizing the definition of the performance index makes the results more accurate and misrankings less likely to arise from the security analysis process.
|
69 |
Uma avaliação estatística da análise gráfica no mercado de ações brasileiro à luz da teoria dos mercados eficientes e das finanças comportamentais / An statistical evaluation of the technical analysis in the Brazilian stock market in the light of the efficient market hypothesis and the behavioral finance
Marco Antonio de Barros Penteado, 27 August 2003
Starting from the principles established by the Efficient Market Hypothesis (EMH), which holds that technical analysis has no value for predicting future security prices, and considering the criticism of the EMH by advocates of so-called behavioral finance, among others, this study sought to detect a relationship between the chart signals observed day by day in the Brazilian stock market and the trends that follow them, over a period of eight years and for a number of securities. The results provide evidence of such a relationship, suggesting that technical analysis was a valid instrument for price forecasting in the Brazilian stock market during the period considered.
|
70 |
Static analysis of implicit control flow: resolving Java reflection and Android intents
SILVA FILHO, Paulo de Barros e, 04 March 2016
Implicit or indirect control flow allows a transfer of control to a procedure without having to call the procedure explicitly in the program. Implicit control flow is a staple design pattern that adds flexibility to system design. However, it is challenging for a static analysis to compute or verify properties about a system that uses implicit control flow.
When a static analysis encounters a procedure call, the analysis usually approximates the call's behavior by a summary, which conservatively generalizes the effects of any target of the call. In previous work, a static analysis that verifies security properties was developed for Android apps, but it failed to achieve high precision in the presence of implicit control flow.
This work presents static analyses for two types of implicit control flow that frequently appear in Android apps: Java reflection and Android intents. In our analyses, the summary of a method is the method's signature. Our analyses help to resolve where control flows and what data is passed. This information improves the precision of downstream analyses, which no longer need to make conservative assumptions about implicit control flow, while maintaining soundness.
We have implemented our techniques for Java. We enhanced an existing security analysis with a more precise treatment of reflection and intents. In a case study involving ten real-world Android apps that use both intents and reflection, the precision of the security analysis was increased on average by two orders of magnitude. The precision of two other downstream analyses was also improved.
|