81. Essays on Currency Crises. Karimi Zarkani, Mohammad. January 2012.
Technical Summary of Thesis:
The topic of my thesis is currency crises. Currency crises have been a recurrent feature of the international economy since the invention of paper money. They are not confined to particular economies or to a specific region: they take place in developed, emerging, and developing countries across the globe. Countries that experience currency crises face economic losses that can be huge and disruptive, and the toll exacted is not only financial and economic but also human, social, and political. The currency crisis is clearly a real threat to financial stability and economic prosperity.
The main objective of this thesis is to analyze the determinants of currency crises for twenty OECD countries and South Africa from 1970 through 1998. It systematically examines the role of economic fundamentals and contagion in the origins of currency crises and attempts empirically to identify the channels through which the crises are transmitted. It also examines the links between the incidence of currency crises and the choice of exchange rate regime, as well as the impact of capital market liberalization policies on the occurrence of currency crises.
The first chapter identifies the episodes of currency crisis in our data set. Determining the true crisis periods is a vital step in empirical studies and has a direct impact on the reliability of their estimations and the relevant policy implications. We define a period as a crisis episode when the Exchange Market Pressure (EMP) index, which consists of changes in exchange rates, reserves, and interest rates, exceeds a threshold. To minimize concerns regarding the accuracy of the identified crisis episodes, we apply extreme value theory, a more objective approach than other methods. In this chapter, we also select the reference country, around which a country's currency pressure index is built, in a systematic way rather than by arbitrary choice or descriptive reasoning.
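As an illustration of the mechanics, the sketch below computes a precision-weighted EMP index and flags threshold exceedances. The equal-precision weighting and the k-sigma crisis rule are common conventions from the literature, not the thesis's exact specification (the thesis sets the threshold via extreme value theory), and all series below are hypothetical.

```python
import numpy as np

def emp_index(d_log_e, d_res, d_i):
    """Exchange Market Pressure index: a weighted sum of exchange-rate
    depreciation, reserve losses, and interest-rate changes, each scaled
    by its standard deviation so that no single component dominates."""
    return (d_log_e / np.std(d_log_e)
            - d_res / np.std(d_res)
            + d_i / np.std(d_i))

def crisis_episodes(emp, k=2.0):
    """Flag a crisis when the EMP index exceeds its mean by k standard
    deviations; a simple stand-in for the EVT-based threshold."""
    return emp > emp.mean() + k * emp.std()

# hypothetical monthly series, measured relative to the reference country
rng = np.random.default_rng(1)
emp = emp_index(rng.normal(0, 0.02, 120),   # log exchange-rate changes
                rng.normal(0, 0.05, 120),   # relative reserve changes
                rng.normal(0, 0.50, 120))   # interest-rate differentials
print(crisis_episodes(emp).sum(), "crisis months flagged")
```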
The second chapter studies the probability of a currency exiting a tranquil state into a crisis state. There is an extensive empirical literature on the roots and causes of currency crises, but despite its interesting results, only very few studies account for the influence of time on the probability of crises. We use duration models, which rigorously incorporate the time factor into the likelihood function and allow us to investigate how the amount of time a currency has already spent in the tranquil state affects its stability. Our findings show that high values of the volatility of unemployment rates, inflation rates, contagion factors (which mostly work through trade channels), unemployment rates, the real effective exchange rate, trade openness, and the size of the economy increase the hazard of a crisis. We employ several robustness checks, including running our models on two different sets of crisis episodes identified from monthly and quarterly spells.
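A minimal sketch of a parametric duration model of this kind, assuming a Weibull baseline hazard with proportional covariate effects; the chapter's actual specification may differ, and all variable names and data below are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t, d, X):
    """Negative log-likelihood of a Weibull proportional-hazards duration
    model: h(t|x) = p * t**(p-1) * exp(x @ beta).  Here t holds spell
    lengths in the tranquil state, d = 1 if the spell ends in a crisis
    (0 if censored), and X is the covariate matrix."""
    log_p, beta = params[0], params[1:]
    p = np.exp(log_p)                            # keep the shape parameter positive
    lin = X @ beta
    log_h = log_p + (p - 1.0) * np.log(t) + lin  # log hazard at the spell end
    H = t ** p * np.exp(lin)                     # cumulative hazard over the spell
    return -(np.sum(d * log_h) - np.sum(H))

# synthetic spells, purely to show the fitting step
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))            # e.g. inflation, trade openness
t = rng.weibull(1.5, size=n) + 0.1     # spell durations
d = rng.integers(0, 2, size=n)         # crisis / censoring indicator
res = minimize(neg_loglik, x0=np.zeros(3), args=(t, d, X), method="BFGS")
p_hat, beta_hat = np.exp(res.x[0]), res.x[1:]
print(p_hat, beta_hat)  # p_hat > 1 would indicate positive duration dependence
```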
The third chapter examines the links between the incidence of currency crises and the choice of exchange rate regime, as well as the impact of capital market liberalization policies on the occurrence of currency crises. As in our previous paper, duration analysis is our methodology for studying the probability of a currency crisis under different exchange rate regimes and capital mobility policies. The third chapter finds a significant link between the choice of exchange rate regime and the incidence of currency crises in our sample. Nevertheless, the results are sensitive to the choice of the de facto exchange rate classification. Moreover, in our sample, capital control policies appear to be helpful in preventing low-duration currency crises. The results are robust to a wide variety of sample and model checks.
82. Statistical Inference for Heavy Tailed Time Series and Vectors. Tong, Zhigang. January 2017.
In this thesis we deal with statistical inference related to extreme value phenomena. Specifically, if X is a random vector with values in d-dimensional space, our goal is to estimate moments of ψ(X) for a suitably chosen function ψ when the magnitude of X is big. We employ the powerful tool of regular variation for random variables, random vectors, and time series to formally define the limiting quantities of interest and construct the estimators. We focus on three statistical estimation problems: (i) multivariate tail estimation for regularly varying random vectors, (ii) extremogram estimation for regularly varying time series, and (iii) estimation of the expected shortfall given an extreme component under a conditional extreme value model. We establish the asymptotic normality of the estimators for each estimation problem. The theoretical findings are supported by simulation studies, and the estimation procedures are applied to financial data.
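In the univariate case, the canonical estimator of the tail index of a regularly varying distribution is the Hill estimator; the sketch below is a standard textbook version, shown only to illustrate the regular-variation framework rather than the thesis's multivariate procedures.

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the tail index alpha for a regularly varying
    sample: average log-spacing of the k largest observations over the
    (k+1)-th largest, inverted."""
    xs = np.sort(x)                    # ascending order
    top = xs[-k:]                      # k largest observations
    threshold = xs[-k - 1]             # (k+1)-th largest as the threshold
    gamma_hat = np.mean(np.log(top) - np.log(threshold))  # estimates 1/alpha
    return 1.0 / gamma_hat

rng = np.random.default_rng(0)
x = rng.pareto(3.0, size=5000) + 1.0   # exact Pareto tail with alpha = 3
print(hill_estimator(x, k=200))        # should be close to 3
```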
83. Analyse et gestion du risque extrême sur le marché du maïs / Analysis and management of extreme risk in the corn market. Elbouazizi, Saïd. 18 December 2014.
Since the early 2000s, the corn market has undergone a profound change. On the one hand, its price has experienced unprecedented extreme volatility; on the other, the market has seen a massive influx of financial investors, since it offers profitable investment opportunities amid recurrent stock market crises. It is valuable for investors (speculators, fundamentalists, arbitrageurs) to understand extreme variations in the corn price: a command of extreme price movements allows better risk management. Studies have already been conducted in this direction using techniques such as Value-at-Risk (VaR). However, the standard VaR risk-management models suffer from certain limitations: they assume normally distributed returns, whereas the distribution of corn returns exhibits extreme values, so they do not allow a proper assessment of risk. To contribute to the analysis of extreme price variations in the corn market, we use GARCH models and extreme value theory. Then, in a multivariate setting, the link between spot and futures returns expresses the degree of dependence and allows us to analyze the effect of speculation. For this, we couple extreme value theory with the dependence measure known as the copula to capture extreme price movements beyond a threshold. Indeed, copula theory offers a range of functions able to measure asymmetric dependence in the tails of the joint distribution of corn spot and futures returns.
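A minimal sketch of the EVT step, fitting a generalized Pareto distribution to losses over a threshold and reading off a tail VaR; the fixed threshold and the omission of the GARCH filtering stage are simplifications of the approach described above.

```python
import numpy as np
from scipy.stats import genpareto

def pot_var(returns, u, q=0.99):
    """Value-at-Risk from a peaks-over-threshold GPD fit to losses above u.
    Inverts the tail estimator P(L > x) = (n_u/n) * GPD survival."""
    losses = -np.asarray(returns)          # work with the loss tail
    exceed = losses[losses > u] - u        # excesses over the threshold
    xi, _, sigma = genpareto.fit(exceed, floc=0)   # fit GPD to the excesses
    n, n_u = len(losses), len(exceed)
    return u + (sigma / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1.0)

# heavy-tailed synthetic daily returns, purely for illustration
rng = np.random.default_rng(2)
rets = rng.standard_t(4, size=2500) * 0.02
print(pot_var(rets, u=np.quantile(-rets, 0.95), q=0.99))
```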
84. Analyse régionale des aléas maritimes extrêmes / Regional frequency analysis of extreme marine hazards. Weiss, Jérôme. 7 November 2014.
Knowing the probability of occurrence of extreme oceano-meteorological events is essential to prevent risks of coastal flooding and to design coastal protections, harbour works, or offshore structures. In particular, the concept of return level is frequently used in coastal engineering to dimension protection structures. These levels, whose return periods of interest generally lie between 100 and 1000 years, are usually estimated by a local statistical analysis from data observed at a single site. However, the observation period is generally limited, which can imply high uncertainties for high return levels.
Regional frequency analysis is a possible solution to reduce the uncertainties inherent in local analyses. The principle is to exploit the information from observation sites belonging to a homogeneous region, where the extremes are assumed to share a similar probabilistic behaviour. Regional frequency analysis can thus estimate return levels more reliably than a local analysis. However, its application to the marine field is relatively limited and recent, and several methodological questions remain unsolved, such as the formation of homogeneous regions or the treatment of dependence between sites. The scientific objective of this thesis is therefore to develop certain methodological aspects of regional frequency analysis in the framework of extreme marine hazards. The following questions are tackled:
• Sampling of extremes for regional analysis, from storms detected through a spatiotemporal declustering procedure.
• Formation of homogeneous regions using a method based on the identification of typical storm footprints.
• Consideration of the dependence between observation sites, through a model allowing, for example, the regional effective duration of observation or the regional return period of a storm to be assessed.
• Specification and estimation of the regional distribution, with the incorporation of influential covariates such as the season of occurrence or, for waves, the direction of origin.
• Comparison between regional and local analyses, especially through the uncertainties on the estimated extremes and the ability to model potential outliers.
These aspects are illustrated on significant wave height and skew surge data from the Northeast Atlantic, the Eastern Channel, and the North Sea. At the same time, the industrial objective of this work is to help guarantee the safety of EDF structures against the risk of coastal flooding, through the exploration of new estimation techniques for extreme marine hazards such as regional frequency analysis, which allows in particular a better representation of outliers.
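For reference, the return level that both local and regional analyses target has a closed form once a GEV distribution has been fitted to annual maxima. A minimal sketch follows; in a regional analysis the parameters would come from a pooled, index-flood-style fit across the homogeneous region rather than from a single site, and the numbers below are illustrative.

```python
import numpy as np

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV fitted to annual maxima: the level
    exceeded on average once every T years."""
    y = -np.log(1.0 - 1.0 / T)          # Gumbel reduced variate
    if abs(xi) < 1e-12:                 # Gumbel limit as xi -> 0
        return mu - sigma * np.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# e.g. a hypothetical 100-year significant wave height, in metres
print(gev_return_level(mu=5.0, sigma=0.6, xi=0.1, T=100))
```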
85. A study on the theoretical predictability of extreme value distributions for natural catastrophic events / Studie teoretické predikovatelnosti extremálních rozdělení pro přírodní katastrofy. Sabolová, Radka. January 2013.
The thesis deals with natural disasters from the statistical point of view and treats them as extremal observations. The basics of classical extreme value theory are summarized, and a new approach based on the maximum entropy principle is proposed. Both methods are used to analyze real discharge data observed on the Vltava river.
86. New statistical models for extreme values. Eljabri, Sumaya Saleh M. January 2013.
Extreme value theory (EVT) has wide applicability in several areas, such as hydrology, engineering, science, and finance. Across the world, we can see the disruptive effects of flooding due to heavy rains or storms, and many countries suffer from natural disasters such as heavy rains, storms, and floods, as well as from higher temperatures leading to desertification. One of the best-known extraordinary natural disasters is the 1931 Huang He flood, a series of floods on the Huang He river between July and November 1931 that led to around 4 million deaths in China.
Several publications focus on how to find the best model for such events and on predicting their behaviour. The normal, log-normal, Gumbel, Weibull, Pearson-type, 4-parameter Kappa, Wakeby, and GEV distributions have all been presented as statistical models for extreme events. The GEV and GP distributions seem to be the most widely used; in spite of that, these models have been misused as models for extreme values in many areas.
The aim of this dissertation is to create new modifications of univariate extreme value models. The modifications developed here are divided into two parts. In the first part, we make generalisations of the GEV and GP, referred to as the Kumaraswamy GEV and Kumaraswamy GP distributions; the major benefit of these models is their ability to fit skewed data better than other models. The other idea in this study comes from a model of Chen, presented in the Proceedings of the International Conference on Computational Intelligence and Software Engineering, pp. 1-4; however, the cumulative distribution and probability density functions of this distribution do not appear to be valid, and a correction is presented in chapter 6.
The major problem in extreme event modelling is the model's ability to fit the tails of the data. In chapter 7, the idea of the corrected Chen model is combined with the GEV distribution to introduce a new model for extreme values, referred to as the new extreme value (NEV) distribution, which appears to be more flexible than the GEV distribution.
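A minimal sketch of the Kumaraswamy generalisation, assuming the standard Kumaraswamy-G construction F(x) = 1 - (1 - G(x)^a)^b with the GEV as the baseline G; the thesis's exact parameterisation may differ.

```python
import numpy as np

def gev_cdf(x, mu, sigma, xi):
    """CDF of the GEV distribution; the xi -> 0 limit is the Gumbel."""
    z = (x - mu) / sigma
    if abs(xi) < 1e-12:
        return np.exp(-np.exp(-z))
    t = np.maximum(1.0 + xi * z, 0.0)   # support constraint: 1 + xi*z > 0
    return np.exp(-t ** (-1.0 / xi))

def kum_gev_cdf(x, mu, sigma, xi, a, b):
    """Kumaraswamy-GEV CDF via the Kw-G construction F = 1 - (1 - G^a)^b;
    the extra shape parameters a, b add skewness and tail flexibility."""
    G = gev_cdf(x, mu, sigma, xi)
    return 1.0 - (1.0 - G ** a) ** b

x = np.linspace(-2, 6, 5)
print(kum_gev_cdf(x, mu=0.0, sigma=1.0, xi=0.2, a=2.0, b=0.5))
```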
87. Values and Health Care: The Relationship Between Uninsured and Insured Status Pursuant to the Rokeach Value Survey. Lamb, Linda Carol. 4 November 2010.
The health care industry is undergoing significant change, particularly with the passage of the Patient Protection and Affordable Care Act in March 2010. Using the Rokeach Value Survey, the value priorities of insured and uninsured respondents were assessed, and were also evaluated across the demographics of gender, ethnicity, education, income, and age group. The terminal value of health was singled out as a moderator variable relative to its influence on the decision to seek health care coverage.
The most significant contribution of this study is an increased understanding of consumer value preferences and demographics and their influence on health care coverage choices. In turn, it advances personal value theory in a health care context and its implications for behavior and decision-making. The results establish the role of health as a significant moderator variable in the decision process.
These findings offer a multitude of insights, not only for academic researchers but for the practitioners and policymakers who are commissioned to execute the new health care reform bill. As health care reform is implemented over the next several years, a combination of legislative and market-based solutions will be necessary to curtail the rising number of the uninsured and to ensure parity, equity, and morality in the distribution of health care for all Americans.
88. IKT i matematikundervisningen : Hur påverkar det elevers syn på sin motivation? / ICT in mathematics teaching: how does it affect pupils' view of their motivation? Lundmark, Nea. January 2018.
No description available.
89. Accelerated testing with application in finance. Oppel, Anel. January 2016.
The event of a default for low-default portfolios, such as sovereign debt or banks, has received much attention as a result of the increasing instability in financial markets. The lack of sufficient default information on low-default portfolios complicates the protection of such portfolios. In the past, default protection has typically relied on extreme value theory and on reporting the value at risk. The focus here is the application of an engineering concept, accelerated test techniques, to the problem of insufficient data on low-default portfolios. In the application, high-default portfolios serve as stressed cases of low-default portfolios. Since high-default portfolios have more data available, viewing them as stressed cases of a low-default portfolio enables us to extrapolate the data to the low-default portfolio environment and to carry out estimation, such as estimating the default probability for a low-default portfolio. The flexible framework through which the above is achieved is provided. Dissertation (MSc), University of Pretoria, 2016.
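A heavily simplified sketch of the extrapolation idea, assuming a log-linear relationship between a stress covariate and the default rate; the stress measure, the link function, and all numbers below are illustrative assumptions, not the thesis's actual model.

```python
import numpy as np

# Treat high-default portfolios as "stressed" versions of a low-default
# portfolio: fit a stress/default-rate relationship where data are
# plentiful, then extrapolate down to the low-stress regime.
stress = np.array([3.0, 4.0, 5.0, 6.0])     # hypothetical stress levels
defaults = np.array([8, 21, 55, 140])       # observed defaults per portfolio
exposures = np.array([10_000] * 4)          # obligors per portfolio

log_rate = np.log(defaults / exposures)
slope, intercept = np.polyfit(stress, log_rate, 1)   # log-linear "life-stress" fit

low_stress = 1.0                            # low-default portfolio's stress level
pd_low = np.exp(intercept + slope * low_stress)      # extrapolated default probability
print(f"extrapolated PD: {pd_low:.5f}")
```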
90. Modeling multi-attribute utility theory with object-oriented programming. Wang, Chen. 12 January 2010.
System complexity has continued to increase with the development and application of new technologies. This increased complexity has created great concern about the potential impact of a system on its ecological environment, considering factors such as plants, wildlife, and clean air. A complete awareness of the potential impact requires a thorough understanding of how a system interacts with its ecological environment, and the results depend on the expertise of the engineer who designs the system and the analyst who evaluates it. Because of the complexity of these interactions and the difficulty of measuring the appropriate cause-and-effect relationships, a system's impact on its ecological environment has not received due attention.
This complexity and difficulty have led to two deficiencies in current research on a system's environmental impact. One is the insufficient evaluation of its qualitative attributes. The other is an unstructured evaluation process in which the analyst has to rely on qualitative attributes as major inputs while his or her expertise cannot be modeled. As a consequence, the current research and evaluation process suffers from bias and a lack of clarity.
This report seeks to instill the necessary clarity into the decision-making process by structuring the decision maker's subjective knowledge. It is concluded that subjective preferences can be quantified and evaluated through utility function assessment; alternatives are ranked and a final choice is made based on their utility. The modeling process described herein is made far more efficient and economical by computer software that integrates the assessment mechanisms into a user-friendly operational environment. After the deficiencies in the current evaluation process are identified, possible solutions are explored: the effectiveness of the Analytic Hierarchy Process (AHP), Multi-Attribute Value Theory (MAVT), and Multi-Attribute Utility Theory (MAUT) is compared, and MAUT is the preferred approach based on the solution requirements. Master of Science.
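A minimal sketch of utility-based ranking in the additive case, the simplest instance of MAUT (valid when attributes are additively independent); the thesis may use a more general form, and the alternatives, attributes, and weights below are hypothetical.

```python
# Hypothetical alternatives scored on three attributes, each already mapped
# to a single-attribute utility u_i(x_i) in [0, 1] via utility assessment.
alternatives = {
    "design_A": {"cost": 0.8, "impact": 0.4, "safety": 0.9},
    "design_B": {"cost": 0.5, "impact": 0.9, "safety": 0.7},
}
weights = {"cost": 0.3, "impact": 0.5, "safety": 0.2}   # scaling constants, sum to 1

def additive_utility(scores, weights):
    """Additive MAUT aggregation: U(x) = sum_i w_i * u_i(x_i)."""
    return sum(weights[a] * u for a, u in scores.items())

# rank alternatives by aggregate utility, best first
ranked = sorted(alternatives,
                key=lambda k: additive_utility(alternatives[k], weights),
                reverse=True)
print(ranked)
```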