31 |
Prudential banking regulation and monetary policy. Li, Lianfa, 19 July 2004 (has links)
No description available.
|
32 |
THE CHANGES AND CONTRIBUTORS TO THE PERCEPTION OF RETIREMENT INCOME ADEQUACY. HUYNH, HELEN T, January 2019 (has links)
This study examines the contributing factors that influence variation in the perception of retirement income adequacy, analyzing U.S. households in the near-retirement stage (ages 55-64) in 2007 along with U.S. households in the first stage of retirement (ages 65-74) in 2016, within the Survey of Consumer Finances. The study performs pair-matching, or one-to-one propensity score matching, of the near-retirement cohort in the 2007 SCF dataset with households in the first stage of retirement in the 2016 SCF dataset that share a similar propensity score value. Statistically, these are the same households nine years later. Results show that 55.80 percent of households in the near-retirement stage perceive their retirement income to be adequate, while 72.20 percent of those in the first stage of retirement perceive their retirement income to be adequate. This study examined the effect of sociodemographic, human capital, financial capital, and financial attitude predictors on the perceived adequacy of retirement income. Probit analysis showed that retired (first stage of retirement) respondents who were married, white, headed by a male, college educated, had sufficient net worth and income, were homeowners, in good health, had saved, took financial risks, had a long-term financial planning horizon, received or expected to receive an inheritance, had a defined benefit pension plan, and/or owned a defined contribution plan were more likely to perceive their retirement income to be adequate than otherwise similar households. / Business Administration/Risk Management and Insurance
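The pipeline just described (propensity-score matching of the 2007 and 2016 cohorts, followed by a probit model of perceived adequacy) can be sketched as follows. This is a minimal illustration using synthetic data and hypothetical variable names; the actual SCF variables, survey weights, and estimation details of the study are not reproduced here.

```python
# Illustrative sketch: one-to-one propensity score matching followed by a probit
# model of perceived retirement income adequacy. All variable names and the
# synthetic data are hypothetical stand-ins for the SCF variables in the study.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

def make_cohort(retired):
    # Synthetic stand-in for one SCF cohort (0 = near retirement, 1 = retired).
    return pd.DataFrame({
        "retired": retired,
        "married": rng.integers(0, 2, n),
        "college": rng.integers(0, 2, n),
        "log_networth": rng.normal(12, 1.5, n),
        "homeowner": rng.integers(0, 2, n),
        "adequate": rng.integers(0, 2, n),   # perceived adequacy (outcome)
    })

df = pd.concat([make_cohort(0), make_cohort(1)], ignore_index=True)

# 1) Propensity of belonging to the retired cohort, given observed covariates.
covars = ["married", "college", "log_networth", "homeowner"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["retired"])
df["pscore"] = ps_model.predict_proba(df[covars])[:, 1]

# 2) One-to-one (pair) matching on the propensity score.
treated = df[df["retired"] == 1]
control = df[df["retired"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]], ignore_index=True)

# 3) Probit of perceived adequacy on cohort membership and covariates.
X = sm.add_constant(matched[["retired"] + covars])
probit_res = sm.Probit(matched["adequate"], X).fit(disp=False)
print(probit_res.summary())
```

In the study itself, the matched 2007 and 2016 cohorts stand in for the same households observed nine years apart; the sketch only illustrates the mechanics of scoring, matching, and probit estimation.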
|
33 |
Abbeville v. the State of South Carolina: A Case Study. Weiler, Spencer C., 24 April 2007 (has links)
Abbeville v. the State of South Carolina (2005) is the latest lawsuit in a long line of cases addressing school finance issues that originated with Brown v. Board of Education (1954), Serrano v. Priest (1971), and San Antonio Independent School District v. Rodriguez (1973). Unlike many of the other school finance cases that have been adjudicated, Abbeville has not been the subject of much academic scrutiny. This case study documented Abbeville's origins in an effort to begin the process of academic examination and understanding. To document the inception of this case, five research questions were developed to guide the effort: 1) What political and economic conditions were present in South Carolina in the early 1990s that led to the decision to file the lawsuit?; 2) How were the eight lead school districts selected to be a part of the plaintiffs' case?; 3) What legal arguments did both the plaintiffs and defendants use in Abbeville?; 4) Why did the state choose to contest the lawsuit?; and 5) What was the 2005 ruling in the Abbeville case, and how did people closely associated with the case react to the decision? The data used to answer these research questions included analysis of primary documents and eighteen qualitative interviews. The primary documents included the state constitution, current legislation in South Carolina affecting public education, previous school finance court cases in South Carolina, and student achievement data. The eighteen participants in this study all shared a high degree of familiarity with Abbeville. Eleven were directly involved in the case (testified, heard, and/or made legal arguments), four were deposed, and the remaining three followed the case closely. The credibility of this study was strengthened through triangulation, the use of multiple data sources related to an issue of uncertainty, which produced the conclusions found at the end of this document. Based on the data collected, conclusions related to Abbeville are presented along with a discussion of the implications of this study and suggestions for future studies. / Ph. D.
|
34 |
Macroeconomic Forecasting: Statistically Adequate, Temporal Principal Components. Dorazio, Brian Arthur, 05 June 2023 (has links)
The main goal of this dissertation is to expand upon the use of Principal Component Analysis (PCA) in macroeconomic forecasting, particularly in cases where traditional principal components fail to account for all of the systematic information making up common macroeconomic and financial indicators. At the outset, PCA is viewed as a statistical model derived from the reparameterization of the Multivariate Normal model in Spanos (1986). To motivate a PCA forecasting framework prioritizing sound model assumptions, it is demonstrated through simulation experiments that model misspecification erodes the reliability of inferences. The Vector Autoregressive (VAR) model at the center of these simulations allows for the Markov (temporal) dependence inherent in macroeconomic data and serves as the basis for extending conventional PCA. Stemming from the relationship between PCA and the VAR model, an operational out-of-sample forecasting methodology is prescribed incorporating statistically adequate, temporal principal components, i.e., principal components which capture not only Markov dependence but all of the other relevant information in the original series. The macroeconomic forecasts produced from applying this framework to several common macroeconomic indicators are shown to outperform standard benchmarks in terms of predictive accuracy over longer forecasting horizons. / Doctor of Philosophy / The landscape of macroeconomic forecasting and nowcasting has shifted drastically with the advent of big data. Armed with significant growth in computational power and data collection resources, economists have augmented their arsenal of statistical tools to include those which can produce reliable results in big data environments. At the forefront of such tools is Principal Component Analysis (PCA), a method which reduces a large number of predictors to a few factors containing the majority of the variation in the original data series. This dissertation expands upon the use of PCA in the forecasting of key macroeconomic indicators, particularly in instances where traditional principal components fail to account for all of the systematic information comprising the data. Ultimately, a forecasting methodology which incorporates temporal principal components, ones capable of capturing both time dependence and the other relevant information in the original series, is established. In the final analysis, the methodology is applied to several common macroeconomic and financial indicators. The forecasts produced using this framework are shown to outperform standard benchmarks in terms of predictive accuracy over longer forecasting horizons.
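One way to illustrate the idea of temporal principal components is to extract components from a lag-augmented data matrix, so that the factors carry the Markov (temporal) dependence as well as the contemporaneous co-movement, and then use them in a direct h-step-ahead forecasting regression. The sketch below assumes synthetic data and a deliberately simplified setup; it is not the dissertation's exact estimator.

```python
# Sketch: "temporal" principal components from a lag-augmented panel, used in a
# direct h-step-ahead forecasting regression. Series names, the lag order p, and
# the number of factors k are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
import statsmodels.api as sm

rng = np.random.default_rng(1)
T, N, p, h, k = 300, 10, 2, 4, 3   # observations, series, lags, horizon, factors

# Synthetic macro panel with persistence (first-order Markov dependence).
X = np.zeros((T, N))
for t in range(1, T):
    X[t] = 0.7 * X[t - 1] + rng.normal(size=N)
panel = pd.DataFrame(X, columns=[f"x{i}" for i in range(N)])
target = panel["x0"]

# Augment the panel with its own lags so the extracted components can capture
# temporal dependence, not only contemporaneous correlation.
lagged = pd.concat(
    [panel.shift(l).add_suffix(f"_lag{l}") for l in range(p + 1)], axis=1
).dropna()

scores = PCA(n_components=k).fit_transform(StandardScaler().fit_transform(lagged))
factors = pd.DataFrame(scores, index=lagged.index, columns=[f"f{i}" for i in range(k)])

# Direct h-step-ahead regression: y_{t+h} on today's temporal factors.
y_fwd = target.shift(-h).loc[factors.index].dropna()
Z = sm.add_constant(factors.loc[y_fwd.index])
ols = sm.OLS(y_fwd, Z).fit()

# Out-of-sample style forecast from the most recent factor observation.
z_last = sm.add_constant(factors).iloc[[-1]]
print("h-step-ahead forecast:", ols.predict(z_last))
```

In the dissertation's framework the components are additionally required to be statistically adequate, i.e., the probabilistic assumptions behind the factor model are tested and respecified where needed; the sketch omits that validation step.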
|
35 |
A Retrospective View of the Phillips Curve and Its Empirical Validity since the 1950s. Do, Hoang-Phuong, 07 May 2021 (has links)
Since the 1960s, the Phillips curve has survived various significant changes (Kuhnian paradigm shifts) in macroeconomic theory and generated endless controversies. This dissertation revisits several important, representative papers throughout the curve's four formative historical periods: Phillips' foundational paper in 1958, the wage determination literature in the 1960s, the expectations-augmented Phillips curve in the 1970s, and the latest New Keynesian iteration. The purpose is to provide a retrospective evaluation of the curve's empirical evidence. In each period, the preeminent role of theoretical considerations over statistical learning from the data is first explored. To further appraise the trustworthiness of the empirical evidence, a few key empirical models are then selected and evaluated for their statistical adequacy, which refers to the validity of the probabilistic assumptions comprising the statistical models. The evaluation results, using historical (vintage) data in the first three periods and modern data in the final one, show that nearly all of the models in the appraisal are misspecified: at least one probabilistic assumption is not valid. The statistically adequate models produced from respecification with the same data suggest new understandings of the main variables' behaviors. The dissertation's findings from the representative papers cast doubt on the traditional narrative of the Phillips curve, which those papers played a crucial role in establishing. / Doctor of Philosophy / The empirical regularity of the Phillips curve, which captures the inverse relationship between the inflation and unemployment rates, has been widely debated in academic economic research and among policymakers over the last 60 years. To shed light on the debate, this dissertation examines a selected list of influential, representative studies from the Phillips curve's empirical history across its four formative periods. The examination of these papers blends a discussion of the methodology of econometrics (the primary quantitative method in economics), the role of theory versus statistical learning from the observed data, and evaluations of the validity of the probabilistic assumptions behind the empirical models. The main contention is that any departure from the probabilistic assumptions produces unreliable statistical inference, rendering the empirical analysis untrustworthy. The evaluation results show that nearly all of the models in the appraisal are untrustworthy: at least one assumption is not valid. An attempt is then made to respecify the models and produce improved empirical models that yield new understandings. Overall, the dissertation's findings cast doubt on the traditional narrative of the Phillips curve, which the representative papers played a crucial role in establishing.
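To make concrete what evaluating statistical adequacy involves, the sketch below estimates a simple Phillips-curve-style regression on synthetic data and tests several of the probabilistic assumptions behind it (normality, no autocorrelation, homoskedasticity, linearity of the functional form). The specific battery of tests and the vintage data used in the dissertation are not reproduced; this is only an illustration of the procedure.

```python
# Illustrative misspecification testing for a simple Phillips-curve regression:
#   inflation_t = b0 + b1 * unemployment_t + u_t
# The point is not the coefficient estimates but the checks on the residuals,
# i.e., the validity of the model's probabilistic assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import jarque_bera
from statsmodels.stats.diagnostic import acorr_ljungbox, het_breuschpagan, linear_reset

rng = np.random.default_rng(2)
T = 200
unemployment = 5 + np.cumsum(rng.normal(0, 0.2, T))                      # persistent regressor
inflation = 8 - 0.8 * unemployment + np.cumsum(rng.normal(0, 0.3, T))    # autocorrelated errors

X = sm.add_constant(pd.Series(unemployment, name="unemployment"))
res = sm.OLS(inflation, X).fit()

jb_stat, jb_pvalue, _, _ = jarque_bera(res.resid)                      # (1) normality
lb_pvalue = acorr_ljungbox(res.resid, lags=[4]).iloc[0]["lb_pvalue"]   # (2) no autocorrelation
_, bp_pvalue, _, _ = het_breuschpagan(res.resid, X)                    # (3) homoskedasticity
reset_pvalue = float(linear_reset(res, power=2, use_f=True).pvalue)    # (4) linearity (RESET)

print(f"Jarque-Bera   p = {jb_pvalue:.3f}")
print(f"Ljung-Box     p = {lb_pvalue:.3f}")
print(f"Breusch-Pagan p = {bp_pvalue:.3f}")
print(f"RESET         p = {reset_pvalue:.3f}")
# Small p-values flag departures from the assumptions; with the autocorrelated
# errors generated above, the Ljung-Box test should reject, i.e., the model is
# statistically misspecified even if the fit looks good.
```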
|
36 |
Towards A Sufficient Set of Mutation Operators for Structured Query Language (SQL). McCormick II, Donald W., 25 May 2010 (has links)
Test suites for database applications depend on adequate test data and real-world test faults for success. An automated tool is available that quantifies test data coverage for database queries written in SQL. Another automated tool mimics real-world faults by mutating SQL; however, tests have revealed that these simulated faults do not completely represent real-world faults. This paper demonstrates that half of the mutation operators used by the SQL mutation tool generated significantly lower detection scores in real-world test suites than in research test suites. Three revised mutation operators are introduced that improve detection scores and contribute toward redefining a sufficient set of mutation operators for SQL. Finally, a procedure is presented that reduces the test burden by automatically comparing SQL mutants with their original queries. / Master of Science
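The comparison step in the last sentence can be illustrated with a small, self-contained sketch: mutate the relational operator in a query's WHERE clause and automatically compare each mutant's result set with the original's. The table, query, and operator set are hypothetical; the actual tool and operator definitions in the thesis are not reproduced here.

```python
# Sketch of a relational-operator mutation for SQL plus automatic comparison of
# each mutant against the original query. Schema, data, and operators are
# illustrative only.
import re
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 50.0), (2, 100.0), (3, 150.0);
""")

original = "SELECT id FROM orders WHERE amount > 100"
operators = ["<", "<=", ">=", "=", "<>"]

def run(query):
    # Sorted result set so mutant and original outputs compare order-independently.
    return sorted(conn.execute(query).fetchall())

baseline = run(original)
for op in operators:
    mutant = re.sub(r">", op, original, count=1)
    if run(mutant) == baseline:
        # Mutant is indistinguishable from the original on this data, so it can
        # be filtered out automatically instead of being inspected by hand.
        print(f"same results as original: {mutant}")
    else:
        print(f"distinguishable mutant:   {mutant}")
```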
|
37 |
Essays on DSGE Models and Bayesian Estimation. Kim, Jae-yoon, 11 June 2018 (has links)
For an empirical analysis, the statistical model implied by the theoretical model is crucial. The statistical model is simply the set of probabilistic assumptions imposed on the data, and invalid probabilistic assumptions undermine the reliability of statistical inference, rendering the empirical analysis untrustworthy. Hence, to secure trustworthy evidence, one should always validate the implicit statistical model before drawing any empirical result from a theoretical model. This perspective is used to shed light on a widely used category of macroeconometric models known as Dynamic Stochastic General Equilibrium (DSGE) models. Using U.S. time-series data, the paper demonstrates that a widely used econometric model for the U.S. economy is severely statistically misspecified; almost all of its probabilistic assumptions are invalid for the data. The paper proceeds to respecify the implicit statistical model behind the theoretical model with a view to securing its statistical adequacy (the validity of its probabilistic assumptions). Using the respecified statistical model, the paper calls into question the literature evaluating the theoretical adequacy of current DSGE models, which ignores the fact that such evaluations are untrustworthy because they are based on statistically unreliable procedures. / Ph. D.
|
38 |
Regelverket Basel: Övergången från Basel II till Basel III utifrån bankernas perspektiv [The Basel framework: the transition from Basel II to Basel III from the banks' perspective]. Karaca, Deniz; Ghaderi, Mohsen, January 2013 (has links)
Research issue: The transition from Basel II to Basel III is financially demanding for banks, but Basel III should be beneficial for the financial market economy. Risks in the financial world are very complex. Is Basel III sufficient to manage risk and future crises? Purpose: The purpose of this paper is to examine the application of Basel II and the transition to Basel III in Sweden, with the banking system in focus. Method: The study uses a qualitative research methodology for the collection of empirical data. It is based on interviews with four large Swedish banks (Swedbank, SEB, Nordea, Handelsbanken) and with Finansinspektionen. We also used previous studies, books, and reports. Conclusions: Basel has no direct connection to the profitability of the banks. The transition to Basel III was an obvious step towards a more stable financial market. Basel III makes operations more expensive for the banks; higher costs mean lower returns and hence lower dividends for shareholders. However, the banks will not bear the costs themselves; customers will be affected. Banks have begun to adapt to Basel III. There are requirements to build up equity immediately, not only in a crisis, which means that returns are less likely to be lowered in bad times.
|
39 |
Regulatory Design of Capacity Remuneration Mechanisms in Regional and Low-Carbon Electric Power Markets. Mastropietro, Paolo, January 2016 (has links)
Capacity remuneration mechanisms (CRMs) are “climbing” regulatory agendas in all liberalised power sectors, especially in the European Union. CRMs are introduced to improve system reliability and to minimise power shortages to an economically efficient extent. These schemes will have a central role in future power systems. This PhD thesis provides an in-depth review of CRM design elements and recommendations to increase their efficiency and effectiveness, particularly in view of the challenges that these mechanisms have to confront in the current power sector environment, characterised by the pursuit of decarbonisation. Attention is focused on the interaction with regional market integration, the need for properly designed performance incentives, and the interaction with renewable technologies. The research is based on empirical evidence collected from international experiences, complemented, where applicable, by a model-based analysis of specific design elements. The outcomes of this PhD thesis can be summarised as follows. The participation of cross-border resources in national CRMs must be guaranteed in order to fully seize the benefits of regional market integration. However, this participation requires a strong commitment from power systems (and governments) in the regional market and the implementation of network codes and market rules that deter system operators from blocking exports when the latter are the outcome of an efficient market clearing. Where short-term markets are coordinated through market coupling, the algorithm must include a conditional nomination rule that ensures that, during regional scarcity conditions, available resources are assigned to those consumers that paid for them in the CRM market. CRMs must rely on robust performance incentives that foster the actual delivery of the committed capacity. High penalty rates may increase the cost of the capacity market, but the overall cost of electricity supply may decrease. Renewable technologies should be allowed to participate in CRMs and should be exposed to the market signals provided by these mechanisms. If renewable and conventional technologies must compete in the same markets, they should do so subject to the same rules. Obviously, this participation must be coordinated with renewable support schemes by discounting CRM revenues.
|
40 |
Bank capitalization and credit rating assessment: Evidence from the EBA stress test. Dimitrova, Evgenia, January 2016 (has links)
Banks face market pressure when determining their capital structures because they are subject to strict regulations. CFOs are willing to adjust their companies' capital structures in order to obtain higher ratings. Credit ratings are highly valuable not only because they assess the creditworthiness of borrowers, but also because rating agencies take advantage of information asymmetry and have access to data that companies might not disclose publicly. The industry also gained much interest after the BIS proposals of 1999 and 2001 that the Basel Committee on Banking Supervision should consider borrowers' credit ratings when examining banks' solvency and capital adequacy. Factors used to determine credit ratings are banks' asset quality, which is a fundamental measure of creditworthiness; banks' capital, which is assessed in relation to asset quality and risk-weighted assets (RWA); banks' profitability; and liquidity measurements. The purpose of this paper is to investigate whether banks that keep excess equity relative to the balance sheet receive better credit ratings, given the predictors capital, bank size, and defaulted-to-total exposures. The European Banking Authority (EBA) stress test results are used as a benchmark for determining banks' capital adequacy and solvency, and the credit ratings are obtained shortly after the publication of the EBA's reports. The sample size is 73 and 95 banks for the years 2011 and 2014, respectively. The results from the multivariate ordinal regression do not show a significant correlation between excess equity to balance sheet and credit ratings, even though the estimated coefficient is negative, i.e., excess equity is associated with lower credit ratings. One explanation can be found in low-quality capital relative to the banks' capital base. Also, banks which plan to implement riskier projects or currently hold riskier assets are subject to higher capital requirements. Moreover, banks currently rated low but with the potential to be upgraded would be more willing to issue equity than debt in order to avoid the corresponding risk and achieve the higher rating. The equity ratio and the defaulted-to-total exposures ratio show a significant correlation with banks' credit ratings. Overall, since the regression results are insignificant, we have no reason to believe that holding excess equity is not beneficial for banks. When banks change their leverage ratios, they carry either the cost of being downgraded or the cost of issuing more equity; therefore, in the end they balance the leverage ratio close to the optimum and keep as much capital as required by regulation.
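A minimal sketch of the multivariate ordinal regression described above is given below, assuming hypothetical predictor names, a four-notch rating scale, and synthetic data; the thesis's actual EBA-derived sample and rating categories are not reproduced.

```python
# Sketch: ordinal (probit) regression of bank credit ratings on capital-related
# predictors (excess equity to balance sheet, bank size, defaulted-to-total
# exposures). All names, scales, and data are illustrative assumptions.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(3)
n = 95  # roughly the 2014 sample size mentioned above

df = pd.DataFrame({
    "excess_equity": rng.normal(0.02, 0.01, n),   # equity above requirement / balance sheet
    "log_total_assets": rng.normal(11, 1.2, n),   # bank size
    "defaulted_share": rng.beta(2, 30, n),        # defaulted / total exposures
})

# Hypothetical 4-notch ordinal rating (0 = lowest, 3 = highest), driven here by
# size and asset quality so the example has some structure to estimate.
latent = 2.0 * df["log_total_assets"] - 15 * df["defaulted_share"] + rng.normal(0, 1, n)
df["rating"] = pd.cut(latent, bins=4, labels=False)

model = OrderedModel(
    df["rating"],
    df[["excess_equity", "log_total_assets", "defaulted_share"]],
    distr="probit",
)
res = model.fit(method="bfgs", disp=False)
print(res.summary())
```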
|