61

Two Essays in Finance and Economics: “Investment Opportunities in Commodity and Stock Markets for G7 Countries” And “Global and Local Factors Affecting Sovereign Yield Spreads”

Izadi, Selma 18 December 2015 (has links)
In chapter 1, I investigate the return links and dynamic conditional correlations between equity and commodity returns for the G7 countries from 2000:01 to 2014:10. The commodity futures include the BCOM Index, which covers futures and spot prices of 22 commodities; Brent and crude oil futures; gold and silver futures; wheat, corn, and soybean futures; and the CRB index. The findings indicate that over the full sample period GOLD, WHEAT, and CORN have the smallest dynamic conditional correlations with all the equity indexes. In addition, the correlations between the GOLD/equity pairs are negative during the financial crisis, which points to the benefit of hedging stock portfolios with gold futures during periods of stress in financial markets. The results on hedging effectiveness suggest that all the commodity/stock portfolios provide better diversification benefits than the stock-only portfolios. On average, adding CRB, BCOM, and GOLD futures to the stock portfolios yields the highest hedging effectiveness ratios. Chapter 2 investigates the impact of global and local variables on sovereign bond spreads for 22 developed countries in the North America, Europe, and Pacific Rim regions, using monthly data from January 2010 to March 2015. There are several main findings in this chapter. First, global factors are considerably more important in determining the sovereign bond spreads for all regions. Second, for the bond spread of each region over its local government bond, the countries' domestic fundamentals are more influential determinants of the spreads than they are for the spread over the US government bond as a safe-haven benchmark. Third, bond spreads in the Eurozone are less influenced by global factors than those of the other regions. Fourth, the sovereign bond spreads of all regions are positively related to the US corporate high-yield spread as a proxy for market sentiment and to the log of the VIX index as a measure of investor risk aversion. The coefficient on the log of the VIX index shows the strong power of stock market implied volatility in determining yield spreads in the fixed income market.
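The hedging effectiveness comparison described above can be illustrated with a minimal sketch, assuming simulated return series in place of the G7 equity indexes and commodity futures; the minimum-variance hedge ratio and the Ederington-style effectiveness measure are standard, but the numbers below are made up.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated daily returns standing in for an equity index and a commodity future
    # (hypothetical data; the thesis uses G7 equity indexes and commodity futures).
    equity = rng.normal(0.0003, 0.012, 2500)
    commodity = 0.4 * equity + rng.normal(0.0001, 0.010, 2500)

    # Minimum-variance hedge ratio: beta = Cov(equity, commodity) / Var(commodity).
    beta = np.cov(equity, commodity)[0, 1] / np.var(commodity)

    # Hedged portfolio return and hedging effectiveness:
    # HE = 1 - Var(hedged) / Var(unhedged); values closer to 1 mean better risk reduction.
    hedged = equity - beta * commodity
    he = 1.0 - np.var(hedged) / np.var(equity)
    print(f"hedge ratio = {beta:.3f}, hedging effectiveness = {he:.3f}")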
62

Feats and Failures of Corporate Credit Risk, Stock Returns, and the Interdependencies of Sovereign Credit Risk

Isiugo, Uche C 10 August 2016 (has links)
This dissertation comprises two essays; the first investigates sovereign credit risk interdependencies, while the second examines the reaction of corporate credit risk to sovereign credit risk events. The first essay, titled Characterizing Sovereign Credit Risk Interdependencies: Evidence from the Credit Default Swap Market, investigates the relationships that exist among disparate sovereign credit default swaps (CDS) and the implications for sovereign creditworthiness. We exploit emerging market sovereign CDS spreads to examine the reaction of sovereign credit risk to changes in country-specific and global financial factors. Utilizing a VAR model fitted with DCC GARCH, we find that comovements of spreads generally exhibit significant time-varying correlations, suggesting that spreads are commonly affected by global financial factors. We construct 19 country-specific commodity price indexes to instrument for country terms of trade, obtaining significant results. Our commodity price indexes account for significant variation in CDS spreads after controlling for global financial factors. In addition, sovereign spreads are found to be related to U.S. stock market returns and to the VIX volatility risk premium among the global factors. Overall, our results suggest that terms of trade and commodity prices have a statistically and economically significant effect on the sovereign credit risk of emerging economies, and they apply broadly to investors, financial institutions, and policy makers seeking to exploit these factors in global portfolios. The second essay is titled Differential Stock Market Returns and Corporate Credit Risk of Listed Firms. This essay explores the information transfer effect of shocks to sovereign credit risk as captured in the CDS and stock market returns of cross-listed and locally listed firms. Based on changes in sovereign credit ratings and outlooks, we find that widening CDS spreads of firms imply that negative credit events dominate, whereas tightening spreads indicate positive events. Grouping firms into companies with cross-listings and those without, we compare the spillover effects and find strong evidence of contagion across equity and CDS markets in both groups. Our findings suggest that the sensitivity of corporate CDS prices to sovereign credit events is significantly larger for non-cross-listed firms. A possible reason is cross-listed firms' better access to external capital and lower degree of asymmetric information relative to non-cross-listed peers, which have a lower level of investor recognition. Our results provide new evidence relevant to investors and financial institutions in assessing the sovereign credit risk germane to corporate financial risk, in constructing debt and equity portfolios, and in hedging considerations in today's dynamic environment.
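A full VAR-DCC-GARCH estimation is more than a short example can carry, but the time-varying comovement the first essay describes can be approximated with an exponentially weighted correlation of CDS spread changes. The sketch below uses simulated series as placeholders for the sovereign CDS data and a common global factor; it illustrates the idea rather than the essay's estimator.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)

    # Simulated daily CDS spread changes for two sovereigns driven by a common
    # global factor (placeholders for the emerging market CDS data in the essay).
    global_factor = rng.normal(0, 1.0, 1000)
    cds_a = 0.6 * global_factor + rng.normal(0, 1.0, 1000)
    cds_b = 0.5 * global_factor + rng.normal(0, 1.0, 1000)
    df = pd.DataFrame({"cds_a": cds_a, "cds_b": cds_b})

    # Exponentially weighted correlation as a lightweight stand-in for the
    # DCC-GARCH conditional correlation estimated in the essay.
    ew_corr = df["cds_a"].ewm(span=60).corr(df["cds_b"])
    print(ew_corr.tail())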
63

Uma avaliação estatística da análise gráfica no mercado de ações brasileiro à luz da teoria dos mercados eficientes e das finanças comportamentais / A statistical evaluation of technical analysis in the Brazilian stock market in the light of the efficient market hypothesis and behavioral finance

Penteado, Marco Antonio de Barros 27 August 2003 (has links)
Based on the principles established by the Efficient Market Hypothesis (EMH), which argues that technical analysis has no value for predicting future security prices, and considering the criticism of the EMH by advocates of so-called behavioral finance, among others, this work sought to detect a relationship between the graphic signals observed day by day in the Brazilian stock market and the trends that follow them, over a period of 8 years, for a number of securities. The results offer evidence of such a relationship, suggesting the validity of technical analysis as an instrument for predicting security prices in the Brazilian stock market over that period.
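The statistical question posed here, whether returns following a chart signal differ from returns on other days, can be framed as a simple two-sample test. The sketch below is a rough illustration on simulated prices, using a moving-average crossover as a stand-in for the graphical signals studied; the thesis's actual signals and data differ.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Simulated daily returns and prices (placeholders for the Brazilian stocks studied).
    returns = rng.normal(0.0005, 0.02, 2000)
    prices = 100 * np.cumprod(1 + returns)

    # A simple 10/50-day crossover as a stand-in graphical "buy" signal.
    short_ma = np.convolve(prices, np.ones(10) / 10, mode="valid")
    long_ma = np.convolve(prices, np.ones(50) / 50, mode="valid")
    signal = short_ma[-len(long_ma):] > long_ma      # aligned to the last 1951 days

    # Compare next-day returns after signal days with next-day returns after other days.
    next_ret = returns[-len(long_ma) + 1:]           # return on the day after each signal day
    post_signal = next_ret[signal[:-1]]
    post_other = next_ret[~signal[:-1]]
    t_stat, p_val = stats.ttest_ind(post_signal, post_other, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_val:.3f}")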
64

Implementação de algoritmo metaheurístico simulated annealing para problema de seleção de contingência em análise de segurança de redes elétricas / Implementation of a simulated annealing metaheuristic algorithm for the contingency selection problem in power network security analysis

Tomazi, Fausto Stefanello 23 September 2016 (has links)
Power systems play a key role in a nation's economy by providing the population with reliable, uninterrupted electricity. To make this possible, large investments are made in the sector to guarantee supply. However, any piece of equipment is subject to failure, and analyzing how equipment failures affect supply is one of the tasks performed by control centers, known as Security Analysis. Control centers are thus responsible for preparing contingency plans so that, if a piece of equipment goes out of operation, the impact on the network is as small as possible. An important task within Security Analysis is Contingency Selection, which identifies the most critical equipment in the system so that prevention plans can be created in case that equipment goes out of operation.
The large electrical systems that exist today are made up of thousands of pieces of equipment, and a detailed analysis of each one is difficult, which is the scenario in which contingency selection becomes important. Contingency Selection is responsible for searching for and ranking the most important constraints of the network, but for large networks with thousands of items, analyzing the impact of each item can take a long time, preventing the calculation from being performed during system operation. It is therefore necessary to perform Contingency Selection efficiently and effectively. This study proposes the development of a Simulated Annealing metaheuristic algorithm so that contingency selection can be executed within the time constraints imposed by the control centers. The experiments show that, after tuning the parameters for the problem instance addressed, the results meet the control center constraints and are slightly better than results reported in the literature for the same problem addressed with a Genetic Algorithm metaheuristic.
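To illustrate the metaheuristic the study develops, a minimal simulated annealing loop is sketched below on a toy selection objective. The severity scores are random placeholders; in the dissertation the objective is a severity measure computed from power flow results, which is not reproduced here.

    import math
    import random

    random.seed(42)

    # Toy severity scores for 20 hypothetical contingencies (placeholders for a
    # power-flow-based severity index).
    severity = [random.uniform(0, 10) for _ in range(20)]
    K = 5  # number of contingencies to select

    def objective(selection):
        # Negative total severity: annealing minimizes, so the best selection
        # is the K most severe contingencies.
        return -sum(severity[i] for i in selection)

    def neighbor(selection):
        # Swap one selected contingency for an unselected one.
        new = set(selection)
        new.remove(random.choice(list(new)))
        new.add(random.choice([i for i in range(len(severity)) if i not in new]))
        return frozenset(new)

    current = frozenset(random.sample(range(len(severity)), K))
    best, temp = current, 5.0
    while temp > 1e-3:
        cand = neighbor(current)
        delta = objective(cand) - objective(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = cand
            if objective(current) < objective(best):
                best = current
        temp *= 0.995  # geometric cooling schedule

    print("selected contingencies:", sorted(best))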
65

Voluntary Disclosure of Non-Financial Key Performance Indicators during Earnings Releases

Phan, Lan 01 January 2019 (has links)
Almost two decades after the burst of the dot-com bubble, investors are divided on whether a new technology bubble has formed in the equities market. As in the late 1990s and early 2000s, many Internet firms today go through initial public offerings without yet earning a dollar of profit, but tout certain revenue-associated performance metrics to investors as promises of future success. However, investors are known to hold sentiments sensitive to earnings announcements (Seok, Cho & Ryu, 2019) and to reward firms that meet or beat earnings expectations with higher stock returns (Bartov, Givoly & Hayn, 2002). That raises a question about the content of earnings announcements: besides earnings and cash flow, are there other factors that may influence investor decisions to trade some Internet stocks? My primary hypothesis is that the voluntary disclosure of specific non-financial key performance indicators (NFKPI) during earnings announcements by Internet firms influences investors' trading decisions. My motivation for this research is to better understand whether there is a strategic element in the voluntary disclosure of NFKPI by Internet companies and how it may affect investors' decisions. The results could be useful to firms in evaluating whether to release NFKPI or similar information, and to equity research analysts and investors in forming their expectations and valuations of the firms' stocks. The intention of the study is not to generalize the findings to the full market, as comparatively few companies voluntarily disclose NFKPI. Instead, this study examines the effects of NFKPI on the stock returns of those companies that choose to disclose it. I use event study methodology to test the statistical significance of the disclosure of NFKPIs during earnings announcements. By controlling for earnings surprise and other meaningful financial ratios, I also examine how the signaling effect of NFKPI can be distinguished from the signaling effects of other important information released concurrently during earnings announcements. I focus on two types of NFKPI within the Internet industry: Gross Bookings for online booking agency services and Daily Active Users for social media. Because earnings reports and quarterly filings do not necessarily come out on the same date, I hand-collected data to estimate the surprise effect of NFKPI for each earnings announcement, using available broker forecasts of the respective NFKPI as a proxy for investors' NFKPI expectations. The results show that while revenue surprise remains consistently the most influential variable for investors, NFKPI surprise has a positive, statistically significant relationship with the firm's abnormal returns. Additionally, although it is insignificant when expected earnings are met or beaten, NFKPI surprise is statistically significant, with a positive relationship to abnormal returns, when expected earnings are missed. In line with existing research on management's motivation to prevent negative earnings surprises (Matsumoto, 2002), these findings imply that firms could employ the voluntary disclosure of NFKPI to manage investors' impressions and to cushion their stock prices against potential negative market reactions when earnings are missed.
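The event study machinery referred to above can be compressed into a short sketch: estimate a market model over a pre-event window, then cumulate abnormal returns over the announcement window. The returns below are simulated and the window lengths are illustrative; the NFKPI surprise regression itself is omitted.

    import numpy as np

    rng = np.random.default_rng(3)

    # Simulated daily returns: market index and one firm (placeholders for the
    # Internet firms and benchmark used in the thesis).
    market = rng.normal(0.0004, 0.01, 300)
    firm = 1.2 * market + rng.normal(0, 0.015, 300)
    firm[270] += 0.05   # hypothetical jump on the earnings announcement day

    est, event = slice(0, 250), slice(268, 273)   # estimation window and (-2, +2) event window

    # Market model: estimate alpha and beta over the estimation window.
    beta, alpha = np.polyfit(market[est], firm[est], 1)

    # Abnormal returns in the event window and the cumulative abnormal return (CAR).
    ar = firm[event] - (alpha + beta * market[event])
    print("CAR over (-2, +2):", ar.sum())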
66

Volatility Interruptions, idiosyncratic risk, and stock return

Alsunbul, Saad A 23 May 2019 (has links)
The objective of this paper is to examine the impact of implementing the static and dynamic volatility interruption rule on idiosyncratic volatility and stock returns on Nasdaq Stockholm. Using EGARCH and GARCH models to estimate conditional idiosyncratic volatility, we find that conditional idiosyncratic volatility and stock returns increase as stock prices hit the upper static or dynamic volatility interruption limits. Conversely, we find that conditional idiosyncratic volatility and stock returns decrease as stock prices hit the lower static or dynamic volatility interruption limit. We also find that conditional idiosyncratic volatility is higher when stock prices reach the upper dynamic limit than when they reach the upper static limit. Furthermore, we compare conditional idiosyncratic volatility and stock returns on limit hit days with those on the days before and after the limit hit events and find that both are more volatile on limit hit days. To test the volatility spillover hypothesis, we examine a two-day window after limit hit events and find no evidence of volatility spillover one or two days after a limit hit event, indicating that the static and dynamic volatility interruption rule is effective in curbing volatility. Finally, we sort stocks by size and find that small market cap stocks gain higher returns than larger market cap stocks upon reaching the upper limits, both static and dynamic.
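A minimal sketch of the estimation step described above, assuming the Python arch package and simulated data: regress stock returns on the market to obtain the idiosyncratic component, then fit an EGARCH model to the residuals to get conditional idiosyncratic volatility. Parameter choices and data are illustrative, not those of the paper.

    import numpy as np
    from arch import arch_model

    rng = np.random.default_rng(4)

    # Simulated daily percentage returns for the market and one stock
    # (placeholders for the Nasdaq Stockholm data used in the paper).
    market = rng.normal(0.03, 1.0, 1500)
    stock = 0.8 * market + rng.normal(0.0, 1.5, 1500)

    # Step 1: idiosyncratic component = residual from a market-model regression.
    beta, alpha = np.polyfit(market, stock, 1)
    resid = stock - (alpha + beta * market)

    # Step 2: conditional idiosyncratic volatility from an EGARCH(1,1,1) fit.
    model = arch_model(resid, mean="Zero", vol="EGARCH", p=1, o=1, q=1)
    fit = model.fit(disp="off")
    print(fit.conditional_volatility[-5:])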
67

Finding Profitability of Technical Trading Rules in Emerging Market Exchange Traded Funds

Hallett, Austin P. 01 January 2012 (has links)
This thesis further investigates the effectiveness of 15 variable moving average strategies that mimic the trading rules used in the study by Brock, Lakonishok, and LeBaron (1992). Instead of applying these strategies to developed markets, it exploits the unique characteristics of emerging markets, which offer opportunities to investors that warrant further research. Before transaction costs, all 15 variable moving average strategies outperform the naïve benchmark strategy of buying and holding different emerging market ETFs over a volatile period of 858 trading days. However, the variable moving averages perform poorly in the "bubble" market cycle; in fact, sell signals become more unprofitable than buy signals are profitable. Furthermore, variations of 4 of the 5 variable moving average strategies demonstrate significant prospects of returning consistent abnormal returns after adjusting for transaction costs and risk.
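The variable moving average rules re-examined here can be sketched as follows, assuming simulated ETF prices and a (1, 50, 1%) rule: go long when the short average exceeds the long average by more than the band, go short when it falls below by more than the band. Parameters and the transaction-cost treatment in the thesis differ from this toy version.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)

    # Simulated daily ETF closing prices (placeholder for the emerging market ETFs).
    ret = rng.normal(0.0004, 0.018, 858)
    price = pd.Series(100 * np.cumprod(1 + ret))

    short_ma = price.rolling(1).mean()   # 1-day "short" average is the price itself
    long_ma = price.rolling(50).mean()
    band = 0.01                          # 1% band around the long moving average

    # VMA(1, 50, 1%): long above the band, short below it, flat inside it.
    position = pd.Series(0.0, index=price.index)
    position[short_ma > long_ma * (1 + band)] = 1.0
    position[short_ma < long_ma * (1 - band)] = -1.0

    # Strategy return vs. buy-and-hold, ignoring transaction costs.
    strat = (position.shift(1) * pd.Series(ret)).dropna()
    print("VMA mean daily return:", strat.mean(), " buy-and-hold:", ret.mean())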
68

Inter-Area Data Exchange Performance Evaluation and Complete Network Model Improvement

Su, Chun-Lien 20 June 2001 (has links)
A power system is typically one small part of a larger interconnected network and is affected, to a varying degree, by contingencies external to itself as well as by the reaction of the external network to its own contingencies. Thus, the accuracy of the complete interconnected network model affects the results of many transmission-level analyses. In an interconnected power system, real-time network security and power transfer capability analyses require a "real-time" complete network base case solution. In order to accurately assess system security and inter-area transfer capability, it is highly desirable that all available information from all areas be used. With the advent of communications among operations control center computers, real-time telemetered data can be exchanged for complete network modeling. Measurement time skew should be considered in the complete network modeling when combining large-area data received via a data communication network. In this dissertation, several suggestions aimed at improving complete network modeling are offered. A discrete event simulation technique is used to assess the performance of a data exchange scheme that uses an Internet interface to the SCADA system. A performance model of data exchange over the Internet is established and a quantitative analysis of the data exchange delay is presented. With the prediction mechanisms, the effect of time skew in the data interchanged among utilities can be minimized, and consequently, state estimation (SE) can provide accurate real-time complete network models of the interconnected network for security and available transfer capability analyses. In order to accommodate the effects of randomly varying arrival of measurement data and to set up a base case for more accurate analyses of network security and transfer capability, an implementation of a stochastic Extended Kalman Filter (EKF) algorithm is proposed to provide optimal estimates of interconnected network states for systems in which some or all measurements are delayed. To obtain an accurate state estimate of a complete network, it is essential to be able to detect bad data in the model. An efficient information debugging methodology based on the stochastic EKF algorithm is used for the detection, diagnosis, and elimination of bad data.
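The delayed-measurement filtering idea can be illustrated with a toy scalar Kalman filter that simply skips the update step whenever a measurement has not yet arrived; this is a simplified, linear stand-in for the stochastic EKF formulated in the dissertation, whose nonlinear power system measurement model is not reproduced here.

    import numpy as np

    rng = np.random.default_rng(6)

    # Scalar random-walk state with noisy measurements; each measurement is
    # unavailable ("delayed") with 30% probability, standing in for telemetry
    # arriving late over the data exchange network.
    n, q, r = 100, 0.01, 0.25
    x_true = np.cumsum(rng.normal(0, np.sqrt(q), n))
    z = x_true + rng.normal(0, np.sqrt(r), n)
    arrived = rng.random(n) > 0.3

    x_hat, p = 0.0, 1.0
    for k in range(n):
        # Predict step (random-walk model: state carried forward, variance grows).
        p += q
        # Update only when the measurement for this step has actually arrived.
        if arrived[k]:
            gain = p / (p + r)
            x_hat += gain * (z[k] - x_hat)
            p *= 1 - gain

    print("final estimate:", x_hat, " true state:", x_true[-1])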
69

A Current-Based Preventive Security-Constrained Optimal Power Flow by Particle Swarm Optimization

Zhong, Yi-Shun 14 February 2008 (has links)
An Equivalent Current Injection (ECI) based Preventive Security-Constrained Optimal Power Flow (PSCOPF) is presented in this thesis, and a particle swarm optimization (PSO) algorithm is developed for solving non-convex Optimal Power Flow (OPF) problems. The thesis integrates Simulated Annealing Particle Swarm Optimization (SAPSO) and Multiple Particle Swarm Optimization (MPSO), enabling a fast algorithm to find the global optimum. The optimal power flow is solved with an Equivalent Current Injection based OPF (ECIOPF) algorithm. This OPF deals with both continuous and discrete control variables and is therefore a mixed-integer optimal power flow (MIOPF). The continuous control variables modeled are the active power outputs and generator-bus voltage magnitudes, while the discrete ones are the shunt capacitor devices. The feasibility of the proposed method is demonstrated on the standard IEEE 30-bus system, and its solution quality is compared with that of other stochastic methods. Security analysis is also conducted: a ranking method is used to highlight the most severe event caused by a specific fault. A preventive algorithm makes use of the contingency information and keeps the system secure, avoiding violations when a fault occurs. Generators are used to adjust the line flows so that the trip of the most severe line would not cause a major problem.
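The particle swarm core used for the OPF can be sketched on a generic test objective; the real fitness function in the thesis evaluates the ECI-based power flow with security constraints and mixed-integer controls, which is not reproduced here, so the sphere function below is only a placeholder.

    import numpy as np

    rng = np.random.default_rng(7)

    def fitness(x):
        # Placeholder objective (sphere function); the thesis instead minimizes
        # generation cost subject to ECI power flow and security constraints.
        return np.sum(x**2, axis=1)

    n_particles, dims, iters = 30, 5, 200
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration coefficients

    pos = rng.uniform(-10, 10, (n_particles, dims))
    vel = np.zeros((n_particles, dims))
    pbest, pbest_val = pos.copy(), fitness(pos)
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dims)), rng.random((n_particles, dims))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        val = fitness(pos)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()

    print("best fitness:", fitness(gbest[None, :])[0])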
70

A new proposed method of contingency ranking

Gossman, Stephanie Mizzell 18 May 2010 (has links)
Security analysis of a power system requires a process called contingency analysis, which analyzes the results of all possible single contingencies (i.e., outages) in the system. Contingency analysis requires the definition of a parameter, called a performance index, that is used to monitor a certain aspect of the system. The performance index definitions used traditionally have been highly nonlinear, and in some cases their results have not accurately predicted the actual outcome. These incorrect results are referred to as misrankings, since the contingency results are usually placed in order of severity so that the most severe cases are evident. This thesis considers a new definition for contingency ranking using a more linearized performance index. Both the new, proposed definition and the classic definition consider the current loading of circuits in the system as compared to their rated values. Specifically, the proposed definition measures the difference between the two quantities, while the more nonlinear definition uses their ratio, raised to a higher power. A small, four-bus test system is used to demonstrate the benefits of the new, more linearized definition. The average percent error over all single-line contingencies of the system decreased by over 9.5% using the proposed definition compared to the previous one. This decrease in error allows the performance index to monitor a similar parameter (comparing the current loading and the current rating of the lines) while achieving a higher degree of accuracy. Further linearization of the proposed definition reduces the average percent error by an additional 22%, so that, compared to the original, highly nonlinear definition, the average error is reduced by almost 30%. By linearizing the definition of the performance index, the results are more accurate and misrankings are less likely to occur in the security analysis process.
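The contrast between the two kinds of index can be seen on a toy example; the line currents, ratings, and exponent below are illustrative only and do not come from the four-bus system used in the thesis.

    import numpy as np

    # Hypothetical post-contingency line currents and ratings (amps).
    current = np.array([380.0, 510.0, 290.0, 450.0])
    rating = np.array([400.0, 500.0, 300.0, 600.0])

    # Classic, highly nonlinear index: loading ratios raised to a high power,
    # which heavily penalizes any line near or above its rating.
    pi_ratio = np.sum((current / rating) ** 4)

    # A more linear, difference-based index in the spirit of the proposed
    # definition: how far each line's loading sits from its rating.
    pi_diff = np.sum(current - rating)

    print("ratio-based PI:", pi_ratio, " difference-based PI:", pi_diff)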
