551

Non-performing loans: An analysis of the relationship between non-performing loans and profitability among European banks

Nordlinder, Elias, Sundell, Oliver January 2017
During the last decade, many European banks have struggled with low profitability while the amount of non-performing loans (NPLs) has increased. This thesis investigates how the growing stock of NPLs affects banks' profitability and the financial system. Using econometric models on panel data, we examine the relationship between the NPL ratio, bank profitability and the economic cycle (GDP growth); combined with qualitative economic theory, this provides a solid analysis of the relationship. We find strong evidence that the NPL ratio is negatively correlated with both bank profitability and the economic cycle. Given these results, we argue that banks and authorities need to address NPLs soon, and we offer recommendations to that end: improve the secondary markets for non-performing loans so that the loans can be lifted off banks' balance sheets, increase the use of asset management companies, and strengthen NPL management within banks.
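The core exercise the abstract describes — relating the NPL ratio to profitability on a bank-level panel — can be sketched with a within (fixed-effects) estimator. Everything below is illustrative: the data are simulated, and the variable names and magnitudes are not from the thesis.

```python
import random

random.seed(0)

# Simulate a bank-level panel: profitability (ROA) falls as the NPL ratio
# rises, with bank-specific intercepts -- the pattern the thesis reports.
n_banks, n_years = 20, 10
rows = []
for b in range(n_banks):
    bank_effect = random.gauss(0.0, 0.5)
    for t in range(n_years):
        npl = random.uniform(0.01, 0.15)                     # NPL ratio
        roa = 1.0 + bank_effect - 8.0 * npl + random.gauss(0.0, 0.2)
        rows.append((b, npl, roa))

# Within (fixed-effects) estimator: demean each variable by bank,
# then run a univariate OLS slope on the demeaned data.
def demean_by_bank(rows, idx):
    means = {}
    for b, npl, roa in rows:
        means.setdefault(b, []).append((npl, roa)[idx])
    means = {b: sum(v) / len(v) for b, v in means.items()}
    return [(npl, roa)[idx] - means[b] for b, npl, roa in rows]

x = demean_by_bank(rows, 0)
y = demean_by_bank(rows, 1)
beta = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
print(f"estimated NPL effect on ROA: {beta:.2f}")  # recovers a negative slope
```

Demeaning by bank removes the bank-specific intercepts, so the slope reflects only the within-bank covariation of NPLs and profitability.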
552

Three essays on exotic option pricing, multivariate Lévy processes and linear aggregation of panel models

Petkovic, Alexandre 16 March 2009
This thesis is composed of three chapters that form two parts. The first part, comprising two chapters, studies problems related to the exotic option market. In the first chapter we are interested in a numerical problem: we derive closed-form approximations for the price of some exotic options in the Black-Scholes framework. The second chapter discusses the construction of multivariate Lévy processes with and without stochastic volatility. The second part consists of one chapter and deals with a completely different issue: the problem of individual and temporal aggregation in panel data models. / Doctorate in economic sciences, Economics orientation
553

Essays on monetary policy, saving and investment

Lenza, Michèle 04 June 2007
This thesis addresses three relevant macroeconomic issues: (i) why Central Banks behave so cautiously compared to optimal theoretical benchmarks, (ii) whether monetary variables add information about future Euro Area inflation to a large set of non-monetary variables, and (iii) why national saving and investment are so correlated in OECD countries in spite of the high degree of integration of international financial markets.

The process of innovation in economic theory and in the statistical analysis of data witnessed over the last thirty years has greatly enriched the toolbox available to macroeconomists. Two aspects of this process are particularly noteworthy for the issues addressed in this thesis: the development of dynamic stochastic general equilibrium models (see Woodford, 1999b for a historical perspective) and of techniques that make it possible to handle large data sets in a parsimonious and flexible manner (see Reichlin, 2002 for a historical perspective).

Dynamic stochastic general equilibrium (DSGE) models provide the appropriate tools to evaluate the macroeconomic consequences of policy changes. By exploiting modern intertemporal general equilibrium theory, these models aggregate the optimal responses of individuals, such as consumers and firms, and identify aggregate shocks and their propagation mechanisms through the restrictions imposed by optimizing individual behavior. Such a modelling strategy, which uncovers economic relationships invariant to changes in policy regime, provides a framework for analyzing the effects of economic policy that is robust to the Lucas critique (see Lucas, 1976). The early attempts to explain business cycles starting from microeconomic behavior suggested that economic policy should play no role, since business cycles reflected the efficient response of economic agents to exogenous sources of fluctuations (see the seminal paper by Kydland and Prescott, 1982 and, more recently, King and Rebelo, 1999). This view was challenged by several empirical studies showing that the adjustment mechanisms of variables at the heart of macroeconomic propagation, such as prices and wages, are not well represented by efficient responses of individual agents in frictionless economies (see, for example, Kashyap, 1999; Cecchetti, 1986; Bils and Klenow, 2004; and Dhyne et al., 2004). Hence, macroeconomic models currently incorporate sources of nominal and real rigidity in the DSGE framework and allow the study of optimal policy reactions to inefficient fluctuations stemming from frictions in macroeconomic propagation mechanisms.

Against this background, the first chapter of this thesis sets up a DSGE model to analyze optimal monetary policy in an economy with sectorial heterogeneity in the frequency of price adjustments. Price setters are divided into two groups: those subject to Calvo-type nominal rigidities and those able to change their prices each period. Sectorial heterogeneity in price-setting behavior is a relevant feature of real economies (see, for example, Bils and Klenow, 2004 for the US and Dhyne, 2004 for the Euro Area); neglecting it would understate the heterogeneity in the transmission mechanisms of economy-wide shocks. In this framework, Aoki (2001) shows that a Central Bank maximizing social welfare should stabilize only inflation in the sector where prices are sticky (hereafter, core inflation). Since complete stabilization is the only true objective of the policymaker in Aoki (2001) and, hence, is not only desirable but also implementable, the equilibrium real interest rate in the economy equals the natural interest rate irrespective of the degree of heterogeneity assumed. This would lead one to conclude that stabilizing core inflation rather than overall inflation implies no observable difference in the aggressiveness of policy behavior. While maintaining the assumption of sectorial heterogeneity in the frequency of price adjustments, this chapter adds non-negligible transaction frictions to the model economy of Aoki (2001). As a consequence, the welfare-maximizing monetary policymaker faces a trade-off among the stabilization of core inflation, the economy-wide output gap and the nominal interest rate, reflecting the conflicting objectives faced by actual policymakers. The chapter shows that the existence of this trade-off makes the aggressiveness of the monetary policy reaction dependent on the degree of sectorial heterogeneity in the economy: in the presence of sectorial heterogeneity in price adjustments, Central Banks are much more likely to behave less aggressively than in an economy where all firms face nominal rigidities. Hence, the chapter concludes that the excessive caution in the conduct of monetary policy shown by actual Central Banks (see, for example, Rudebusch and Svensson, 1999 and Sack, 2000) might not represent sub-optimal behavior but, on the contrary, might be the optimal monetary policy response in the presence of relevant sectorial dispersion in the frequency of price adjustments.

DSGE models are proving useful also in empirical applications, and efforts have recently been made to incorporate large amounts of information in their framework (see Boivin and Giannoni, 2006). However, the typical DSGE model still relies on a handful of variables. Partly, this reflects the fact that, as the number of variables increases, specifying a plausible set of theoretical restrictions identifying aggregate shocks and their propagation mechanisms becomes cumbersome. On the other hand, several questions in macroeconomics require the study of a large number of variables. Two examples, related to the second and third chapters of this thesis, help explain why. First, policymakers analyze a large quantity of information to assess the current and future stance of their economies and, because of model uncertainty, do not rely on a single modelling framework. Consequently, macroeconomic policy can be better understood if the econometrician relies on a large set of variables without imposing too much a priori structure on the relationships governing their evolution (see, for example, Giannone et al., 2004 and Bernanke et al., 2005). Second, the integration of goods and financial markets implies that the sources of aggregate shocks are increasingly global, requiring, in turn, the study of their propagation through cross-country links (see, among others, Forni and Reichlin, 2001 and Kose et al., 2003). A priori, country-specific behavior cannot be ruled out, and many of the homogeneity assumptions typically embodied in open-economy macroeconomic models to keep them tractable are rejected by the data. Summing up, these issues call for modelling frameworks able to treat a large number of variables in a flexible manner, i.e. without pre-committing to many a priori restrictions likely to be rejected by the data. The large extent of comovement among wide cross-sections of economic variables suggests the existence of a few common sources of fluctuations (Forni et al., 2000 and Stock and Watson, 2002) around which individual variables may display specific features: a shock to the world price of oil, for example, hits oil exporters and importers with different sign and intensity, and global technological advances can affect some countries before others (Giannone and Reichlin, 2004). Factor models rely mainly on the identifying assumption that the dynamics of each variable can be decomposed into two orthogonal components, common and idiosyncratic, and provide a parsimonious tool for analyzing aggregate shocks and their propagation mechanisms in a large cross-section of variables. While the idiosyncratic components are poorly correlated across the cross-section, being driven by shocks specific to a variable or a group of variables or by measurement error, the common components capture the bulk of the cross-sectional correlation and are driven by a few shocks that affect, through variable-specific factor loadings, all items in a panel of economic time series. Focusing on the common components yields useful insights into the identity and propagation mechanisms of the aggregate shocks underlying a large number of variables. The second and third chapters of this thesis exploit this idea.

The second chapter deals with the question whether monetary variables help to forecast inflation in the Euro Area harmonized index of consumer prices (HICP). Policymakers form their views on the economic outlook by drawing on large amounts of potentially relevant information. Indeed, the monetary policy strategy of the European Central Bank acknowledges that many variables and models can be informative about future Euro Area inflation. A peculiarity of this strategy is that it assigns to monetary information the role of providing insights into the medium- to long-term evolution of prices, while a wide range of alternative non-monetary variables and models are employed to form a view on the short term and to cross-check the inference based on monetary information. However, both the academic literature and the practice of leading Central Banks other than the ECB do not assign such a special role to monetary variables (see Gali et al., 2004 and references therein). Hence, the debate on whether money really provides relevant information for the inflation outlook in the Euro Area is still open. Specifically, this chapter addresses the question whether money provides useful information about future inflation beyond what is contained in a large set of non-monetary variables. It shows that a few aggregates of the data explain a large share of the fluctuations in a large cross-section of Euro Area variables. This makes it possible to postulate a factor structure for the large panel at hand and to aggregate it into a few synthetic indexes that still retain the salient features of the large cross-section. The database is split into two big blocks of variables: non-monetary (baseline) and monetary variables. Results show that the baseline variables provide a satisfactory predictive performance, improving on the best univariate benchmarks over the period 1997-2005 at all horizons between 6 and 36 months. Remarkably, monetary variables provide a sensible improvement on the performance of the baseline variables at horizons above two years. However, the analysis of the evolution of the forecast errors reveals that most of the gains relative to univariate benchmarks of non-forecastability, with both baseline and monetary variables, are realized in the first part of the prediction sample, up to the end of 2002, which casts doubt on the current forecastability of inflation in the Euro Area.

The third chapter is based on joint work with Domenico Giannone and gives empirical foundation to the general equilibrium explanation of the Feldstein-Horioka puzzle. Feldstein and Horioka (1980) found that domestic saving and investment in OECD countries strongly comove, contrary to the idea that high capital mobility should allow countries to seek the highest returns in global financial markets and, hence, imply a correlation between national saving and investment closer to zero than to one. Moreover, capital mobility has strongly increased since the publication of Feldstein and Horioka's seminal paper, while the association between saving and investment does not seem to have decreased comparably. Through general equilibrium mechanisms, the presence of global shocks might rationalize the correlation between saving and investment: global shocks, affecting all countries, tend to create imbalances on global capital markets, causing offsetting movements in the global interest rate, and can generate the observed correlation across national saving and investment rates. However, previous empirical studies (see Ventura, 2003) that controlled for the effects of global shocks in the context of saving-investment regressions failed to give empirical foundation to this explanation. We show that previous studies have neglected the fact that global shocks may propagate heterogeneously across countries, and have therefore failed to properly isolate the components of saving and investment that are affected by non-pervasive shocks. We propose a novel factor-augmented panel regression methodology that makes it possible to isolate idiosyncratic sources of fluctuations under the assumption of heterogeneous transmission of global shocks. Remarkably, when our methodology is applied, the association between domestic saving and investment decreases considerably over time, consistently with the observed increase in international capital mobility. In particular, over the last 25 years the correlation between saving and investment disappears. / Doctorate in economic sciences, Economics orientation
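The third chapter's idea — that a common global shock can manufacture a saving-investment correlation which vanishes once the common component is removed — can be sketched with simulated data. As a crude stand-in for the thesis's factor-augmented methodology, the cross-country mean is used here as the common-factor proxy (in the spirit of common-correlated-effects estimators); all numbers are illustrative.

```python
import random

random.seed(1)

# Simulate saving and investment rates for several countries that load
# heterogeneously on one global shock; their idiosyncratic parts are
# independent, so any raw correlation comes from the common factor.
n_countries, n_periods = 30, 40
global_shock = [random.gauss(0, 1) for _ in range(n_periods)]
saving, invest = [], []
for c in range(n_countries):
    ls, li = random.uniform(0.5, 1.5), random.uniform(0.5, 1.5)  # loadings
    saving.append([ls * g + random.gauss(0, 0.5) for g in global_shock])
    invest.append([li * g + random.gauss(0, 0.5) for g in global_shock])

def corr(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Pool the raw series, then the "defactored" series where the period-wise
# cross-country mean (a crude common-factor proxy) has been removed.
raw_s = [v for c in saving for v in c]
raw_i = [v for c in invest for v in c]
mean_s = [sum(c[t] for c in saving) / n_countries for t in range(n_periods)]
mean_i = [sum(c[t] for c in invest) / n_countries for t in range(n_periods)]
def_s = [c[t] - mean_s[t] for c in saving for t in range(n_periods)]
def_i = [c[t] - mean_i[t] for c in invest for t in range(n_periods)]

raw_corr = corr(raw_s, raw_i)
def_corr = corr(def_s, def_i)
print(f"raw correlation:        {raw_corr:.2f}")
print(f"defactored correlation: {def_corr:.2f}")  # much closer to zero
```

The raw pooled correlation is driven entirely by the shared global shock; once the common component is stripped out, the idiosyncratic parts are nearly uncorrelated, mirroring the chapter's finding.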
554

Aplikace spotřební funkce na ČR / Application of consumption function on CR

Poncar, Jaroslav January 2017
The consumption function is a standard instrument of quantitative economic analysis used to examine the relationship between consumer expenditure and income or other influencing factors such as liquid assets, interest rates or various demographic and social factors. This thesis presents the methods most frequently used in the econometric analysis of the consumption function. Attention is paid to the absolute income hypothesis, the relative income hypothesis, the life-cycle hypothesis, the permanent income hypothesis, rational expectations, and a consumption function based on the error correction model. The suitability of the individual models for the current economic situation in the Czech Republic is then assessed, and an empirical model of the consumption function for the Czech Republic is designed and tested. Furthermore, estimates of each consumption function model for the periods before and after the economic crisis of 2008-2009 are performed and compared. Finally, a short-term prediction of the consumption of Czech households is made.
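The error-correction consumption function mentioned in the abstract is typically estimated in two steps (Engle-Granger style): a levels regression pins down the long-run relation, and the lagged deviation from it enters the short-run equation with a negative coefficient. The sketch below uses simulated data, not Czech household data, and omits the short-run income term for brevity.

```python
import random

random.seed(2)

def ols(y, x):
    """Univariate OLS with intercept: returns (intercept, slope)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    b = sum((a - c) * (d - my) for a, c, d in zip(x, [mx] * len(x), y)) \
        / sum((a - mx) ** 2 for a in x)
    return my - b * mx, b

# Simulate income as a random walk and consumption cointegrated with it:
# c_t = 0.9 * y_t + u_t, with u_t a stationary AR(1) deviation.
T = 300
y, u, c = [0.0], [0.0], [0.0]
for t in range(1, T):
    y.append(y[-1] + random.gauss(0.3, 1.0))
    u.append(0.5 * u[-1] + random.gauss(0.0, 0.5))
    c.append(0.9 * y[-1] + u[-1])

# Step 1 (long run): the levels regression recovers the cointegrating
# coefficient, i.e. the long-run propensity to consume out of income.
a, theta = ols(c, y)

# Step 2 (short run): the change in consumption responds negatively to
# last period's deviation from the long-run relation (error correction).
resid = [ci - a - theta * yi for ci, yi in zip(c, y)]
dc = [c[t] - c[t - 1] for t in range(1, T)]
_, gamma = ols(dc, resid[:-1])

print(f"long-run propensity: {theta:.2f}")           # near the true 0.9
print(f"error-correction coefficient: {gamma:.2f}")  # negative
```

The negative error-correction coefficient is what makes the model self-stabilising: consumption drifts back toward its long-run relation with income after any deviation.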
555

An asymmetric econometric model of the South African stock market

Moolman, Helena Cornelia 19 April 2004
In this study a structural model of the South African stock market, the Johannesburg Stock Exchange (JSE), was developed and estimated econometrically. The study has made three important contributions to the literature. Firstly, a structural model of the South African stock market has been developed, which quantifies the relationships between the stock market and macroeconomic variables while analyzing the impact of foreign markets and phenomena such as contagion, policy changes and structural economic changes on the JSE. This will improve the economic agents’ understanding of the functioning of the stock market and potentially assist in forecasting the stock market. Secondly, investors are generally assumed to be risk and/or loss averse. This study explains how this risk and/or loss aversion of investors can cause asymmetry in stock prices and the study evaluates different types of stock market asymmetry with advanced econometric techniques such as the threshold cointegration test of Siklos and Enders (2001) and a Markov switching regime model. The Markov switching regime model is used to model the South African business cycle and to construct an indicator for the state of the business cycle, which is in turn used to introduce cyclical asymmetry in the stock market model. The Markov switching regime model is in itself a substantial contribution to the literature since no Markov switching regime model has been estimated for the South African business cycle yet. Apart from being used to capture cyclical asymmetry in the stock market, the Markov switching regime business cycle model can also be used to identify turning points in the South African economy and to model economic growth. Finally, the forecasting performance of the stock market model developed in this study is compared to other stock market models. According to the results, this model is preferred to the other stock market models in terms of modelling and forecasting the level and direction of the JSE. 
This means that investors and policy makers can use this model to simulate the impact of changes in macroeconomic indicators on the future course of the stock market and to develop profitable trading rules. / Thesis (PhD (Econometrics))--University of Pretoria, 2005. / Economics / unrestricted
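The Markov switching regime model described above is estimated by maximum likelihood in the thesis; as a minimal sketch of what such a model assumes, the snippet below simulates the two-regime business-cycle chain itself, with persistent expansion and recession states. The transition probabilities and regime means are illustrative values, not estimates from the study.

```python
import random

random.seed(3)

# Two business-cycle regimes: 0 = expansion, 1 = recession. Regimes are
# persistent: staying probabilities of 0.95 and 0.80 (illustrative, not
# thesis estimates), with regime-dependent mean growth.
P = {0: 0.95, 1: 0.80}          # probability of staying in current state
MU = {0: 0.8, 1: -0.5}          # regime-dependent mean growth

states, growth = [0], []
for _ in range(400):
    s = states[-1]
    states.append(s if random.random() < P[s] else 1 - s)
    growth.append(random.gauss(MU[states[-1]], 0.5))

# The implied long-run recession share is (1-p00)/((1-p00)+(1-p11)) = 0.2.
share_recession = sum(states) / len(states)
print(f"fraction of periods in recession: {share_recession:.2f}")
```

Estimation runs this logic in reverse: given only the growth series, the filter infers the hidden regime probabilities, which is how the model yields a business-cycle indicator and turning points.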
556

South African money market volatility, asymmetry and retail interest pass-through

Fadiran, Gideon Oluwatobi January 2011
The purpose of this paper is to examine the interest rate transmission mechanism for South Africa, as an emerging economy, under the pre-repo and repo systems. It explains how the money market rate is transmitted to retail interest rates in both the long run and the short run, and tests symmetric and asymmetric interest rate pass-through using the Scholnick (1996) ECM and the Wang and Lee (2009) ECM-EGARCH(1,1)-M methodology. This permits examination of the impact of interest rate volatility, along with the leverage effect. An incomplete pass-through is found in the short run. Over the entire sample period, a symmetric adjustment is found in the deposit rate, which adjusts with upward rigidity, while an asymmetric adjustment is found in the lending rate, which adjusts with downward rigidity. All the adjustments support collusive pricing arrangements. According to the conditional variance estimation of the ECM-EGARCH(1,1), a negative volatility impact and leverage effect are present and influential only in the deposit interest rate adjustment process in South Africa.
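The asymmetric-ECM test in the Scholnick tradition splits the lagged error-correction term into positive and negative deviations and estimates a separate adjustment speed for each. A convenient detail: the two split regressors are never simultaneously nonzero, so no-intercept OLS decouples into two simple ratios. The data and speeds below are simulated illustrations, not estimates from the paper.

```python
import random

random.seed(4)

# Simulate a retail rate that corrects deviations from the money market
# rate asymmetrically: fast when it sits above equilibrium, slowly when
# below (upward rigidity, the pattern the paper reports for deposits).
G_POS, G_NEG = -0.6, -0.2       # illustrative adjustment speeds
T = 500
m, r = [5.0], [5.0]
ect_pos, ect_neg, dr = [], [], []
for _ in range(T):
    m.append(m[-1] + random.gauss(0.0, 0.2))
    ect = r[-1] - m[-2]                       # lagged deviation
    ect_pos.append(max(ect, 0.0))
    ect_neg.append(min(ect, 0.0))
    change = G_POS * ect_pos[-1] + G_NEG * ect_neg[-1] + random.gauss(0.0, 0.05)
    dr.append(change)
    r.append(r[-1] + change)

# ect_pos and ect_neg are elementwise orthogonal, so the no-intercept OLS
# coefficients reduce to two independent ratios.
g_pos = sum(d * x for d, x in zip(dr, ect_pos)) / sum(x * x for x in ect_pos)
g_neg = sum(d * x for d, x in zip(dr, ect_neg)) / sum(x * x for x in ect_neg)
print(f"adjustment speed above equilibrium: {g_pos:.2f}")
print(f"adjustment speed below equilibrium: {g_neg:.2f}")
```

A formal test of asymmetry then checks whether the two recovered speeds differ significantly, which is the hypothesis the ECM-EGARCH framework extends with volatility effects.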
557

Model closure and price formation under switching grain market regimes in South Africa

Meyer, Ferdinand 08 December 2006
This study develops the structure and closure of an econometric regime-switching model within a partial equilibrium framework that has the ability to generate reliable estimates and projections of endogenous variables under market-switching regimes. Models used in policy evaluation usually either ignore the possibility of regime switching, using just a single method of price determination based on average effects, or incorporate highly stylised components that may not reflect the complexities of a particular market. This study proposes an approach that allows the incorporation of features of regime switching in a multisector commodity level model which capture salient features of the South African market and are therefore able to produce more reliable projections of the evolution of the sector under alternative shocks. The following hypothesis is tested in the study: With the correct model structure and closure, a combination of modelling techniques can be applied to develop a simulation model that has the ability to generate reliable estimates and projections of endogenous variables under market-switching regimes. The technique that is used to “close” a simultaneous or recursive simulation model determines the manner in which market equilibrium is achieved in the model. The choice of closure technique will depend on the equilibrium pricing condition in a specific market, specifically which market regime prevails in the market. It is important to note that trade flow and equilibrium pricing conditions under various trade regimes in the SA grain markets do not occur strictly according to these definitions. In the SA white and yellow maize markets some level of trade does occur with neighbouring countries at price levels that suggest that the market is trading under a type of regional autarky isolated from world markets. 
Industry experts argue that trade in the Southern African region is largely driven by regional issues such as staple food demand, adverse weather conditions, location, and quality concerns over genetically modified maize imported from non-African destinations, and to a lesser extent by arbitrage opportunities. This study therefore refers to "near-autarky". Given that markets can fluctuate between different trade regimes (and therefore equilibrium pricing conditions), some type of regime-switching model needs to be utilised to determine model closure. A switching mechanism is introduced that allows the white maize model to switch between model closure under import parity, near-autarky and export parity, the yellow maize model to switch between model closure under import parity and near-autarky, and the wheat model to close under import parity. Various approaches are used to test whether the regime-switching model complies with the hypothesis of this study. The first approach involves the simulation of baseline projections under a combination of different trade regimes in the grain markets. The second approach illustrates the usefulness of the automated switch between the various model closure techniques by comparing ex-post simulation results of the regime-switching model to the results of a previous version of the sector model that does not have the ability to switch between market regimes. The last approach presents a more hands-on application of the regime-switching model to real-life examples by analysing the impact of a combination of market- and policy-related shocks in the form of scenario analysis. This study shows that the regime-switching model is able to capture a richer variety of market behaviour than standard models as a result of the regime-switching innovation outlined, and therefore captures more accurately the likely effects of shocks on the domestic market. It is thus consistent with the hypothesis of this study.
The regime-switching model is, by design, more rigorous than the previous model in that it emphasises price formation and correct model closure under alternative regimes. Although the model is particularly appropriate for the South African grain market as specified here, it provides a template for which models for other countries and commodities may be developed. / Thesis (PhD (Agricultural Economics))--University of Pretoria, 2006. / Agricultural Economics, Extension and Rural Development / unrestricted
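The switching closure described above amounts to a price band: the domestic equilibrium price prevails (near-autarky) only while it lies between export and import parity; outside the band, the binding parity bound sets the price. A minimal sketch of that logic, with hypothetical prices:

```python
def closure_price(domestic_eq, import_parity, export_parity):
    """Return (regime, price) under a regime-switching closure: the market
    clears at the domestic equilibrium price only while it lies inside the
    export/import parity band; otherwise the binding bound sets the price."""
    if domestic_eq >= import_parity:
        return "import parity", import_parity
    if domestic_eq <= export_parity:
        return "export parity", export_parity
    return "near-autarky", domestic_eq

# Illustrative white maize prices (hypothetical rand/ton values):
for eq in (900, 1500, 2400):
    regime, price = closure_price(eq, import_parity=2000, export_parity=1100)
    print(f"equilibrium {eq} -> {regime} at {price}")
```

In the full sector model the same comparison is made each simulation period, which is what lets the white maize model move among all three closures while the wheat model stays at import parity.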
558

The relevance and fairness of the JSE AltX pre-IPO share pricing methodologies

Magliolo, Jacques January 2012
This three-year in-depth study was prompted by a decade of work as a corporate advisor for the corporate advisory and listing divisions of numerous stockbroking firms. The lack of any discernible pricing methodology for IPOs on the JSE's Main Board and its failed Venture Capital and Development Capital Markets was carried over to the new Alternative Exchange (AltX). This prompted lengthy discussions with Noah Greenhill, former head of the JSE's AltX. Those discussions are set out in this dissertation and relate to pricing methodologies and the lack of guidance or legislation in Schedule 21 of the JSE's Listing Requirements. The focus of this dissertation is thus whether the methodologies currently adopted to establish a fair and reasonable pre-IPO share price are effective. To this end, global pricing methodologies were assessed within the framework of the various valuation techniques used by South African Designated Advisors.
559

Profit incentives and technical efficiency in the provision of health care in Zimbabwe: an application of data envelopment analysis and econometric methods

Maredza, Andrew January 2009
This study examines issues surrounding efficiency in the Zimbabwean health sector, with specific emphasis on for-profit hospitals, in order to find out whether they are significantly more efficient than non-profit hospitals; it thus explores the significance of profit incentives for efficiency. The study uses the Data Envelopment Analysis (DEA) methodology to compute efficiency scores for the 100 hospitals in the sample, classified as for-profit, mission and public. Outputs of the study include inpatient days and outpatient visits; the numbers of beds, doctors and nurses were used to capture hospital inputs. The findings indicate a marked deviation of efficiency scores from the best-practice frontier, with for-profit hospitals having the highest mean PTE of 71.1 percent. The mean PTE scores for mission and public hospitals were 64.8 percent and 62.6 percent respectively. About 85 percent, 83 percent and 91 percent of the for-profit, mission and public hospitals, respectively, were found to be operating below their group's average PTE, and more than half of the hospitals are run inefficiently. Most importantly for this study, the hypothesis of for-profit hospital superiority was accepted, implying that for-profit hospitals are significantly more efficient than the non-profit categories. The study indicates that the amount of inputs used could be decreased substantially without decreasing the quantity of outputs achieved: for the inefficient hospitals in the study, the input reductions needed to make them efficient exceed 50 percent. These input savings could go a long way towards addressing other health concerns without mobilizing additional resources in the sector.
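Full DEA solves a linear program per hospital over all inputs and outputs; in the special single-input, single-output constant-returns case it collapses to each unit's output-per-input ratio relative to the best performer, which is enough to illustrate how the scores work. The hospitals and figures below are hypothetical, not from the study.

```python
# Hypothetical hospitals: (beds as the single input, inpatient days as the
# single output). Under constant returns to scale, single-input
# single-output DEA efficiency is each unit's output/input ratio divided
# by the best ratio in the sample, so the frontier unit scores 1.0.
hospitals = {
    "for-profit A": (50, 14000),
    "mission B":    (80, 18000),
    "public C":     (120, 22000),
    "public D":     (60, 9000),
}

ratios = {name: out / inp for name, (inp, out) in hospitals.items()}
best = max(ratios.values())
efficiency = {name: r / best for name, r in ratios.items()}

for name, e in sorted(efficiency.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} efficiency = {e:.2f}")
```

An input-oriented reading of a score of, say, 0.65 is exactly the abstract's point: that hospital could in principle produce the same output with 35 percent fewer inputs.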
560

A Cox proportional hazards model for mid-point imputed interval-censored data

Gwaze, Arnold Rumosa January 2011
There has been increasing interest in survival analysis with interval-censored data, where the event of interest (such as infection with a disease) is not observed exactly but is only known to have happened between two examination times. However, because research has focused heavily on right-censored data, statistical tests and techniques for interval censoring are not as abundant as those for right censoring. In this study, right-censoring methods are used to fit a proportional hazards model to interval-censored data. The interval-censored observations were transformed using mid-point imputation, a method which assumes that an event occurs at the midpoint of its recorded interval. The results gave conservative regression estimates, but a comparison with the conventional methods showed that the estimates were not significantly different. However, the censoring mechanism and the interval lengths should be given serious consideration before deciding to use mid-point imputation on interval-censored data.
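The imputation step itself is simple: each event interval is replaced by its midpoint, turning the data into ordinary right-censored survival times that standard software can fit. As a stand-in for the Cox fit (which needs a partial-likelihood optimiser), the sketch ends with a constant-hazard MLE just to show how the imputed times are then used; the observations are hypothetical.

```python
# Interval-censored observations: the event is only known to occur between
# two examination times (left, right); right is None when right-censored.
observations = [
    ((0, 4), True), ((2, 6), True), ((5, 9), True),
    ((3, None), False), ((1, 7), True), ((6, None), False),
]

def midpoint_impute(interval, observed):
    """Replace an event interval by its midpoint; a right-censored record
    keeps its last examination time as the censoring time."""
    left, right = interval
    return (left + right) / 2 if observed else left

times = [midpoint_impute(iv, obs) for iv, obs in observations]
events = sum(obs for _, obs in observations)

# Stand-in for the Cox fit: with a constant hazard, the maximum-likelihood
# estimate is simply events divided by total exposure time.
hazard = events / sum(times)
print(f"imputed times: {times}")
print(f"constant-hazard MLE: {hazard:.3f}")
```

After this transformation a Cox proportional hazards model is fitted exactly as for right-censored data, which is the shortcut whose accuracy the study evaluates.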
