1.
Not about rules, but about good deals: The political economy of securing inclusive capital investment and transformation in South African mining. Nxele, Musawenkosi. 11 September 2023
This PhD studies the imperative of racially transforming South Africa's economy in a way that spurs the growth of capital investment that is socially and locally inclusive. Part I explores the role of bargains among elites ("deals") in facilitating investment. It studies deals as the basis of credible commitment and as the "arena of action" in the context of a relatively robust rule of law. What kinds of deals produce capital investment and transformation, and what kinds produce predation and isomorphism? Using process-tracing methodology, the research traces deals in platinum mining between 1994 and 2018. Part II examines the extent to which this investment is socially inclusive in alleviating local poverty, creating local employment, and reducing local inequality. This part relies on individual-level census data of 20 million observations and geocoded data on over 400 mines to evaluate the local impact of mining investments on income poverty, employment, and inequality between 1996 and 2011. The study finds compelling evidence that "deals are the basis of credible commitment" to securing investment. The rule of law alone is important but insufficient, as it leaves "residual uncertainty" for investors. The evaluation of the impact of mining investments on local communities suggests a qualification, at the local level, of the "resource curse" hypothesis. Mining brings benefits in terms of income-poverty alleviation and employment. However, the high-low cycles of commodity price booms create employment volatility and exacerbate inequality. Mining investments inherently involve trade-offs that can be moved in net-positive directions by good deals between business, the state, and local communities. The research thus contributes to the literature on property rights and investment, state-business relations and development, and natural resource governance for development.
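A minimal sketch of the kind of geocoded impact evaluation Part II describes, with synthetic data standing in for the census and mining datasets and a hypothetical 10 km exposure cutoff (the thesis's actual estimator and variable names are not specified here):

```python
# Illustrative sketch only: a difference-in-differences comparison of
# income poverty around geocoded mines, on synthetic data. Column
# names and the 10 km cutoff are assumptions, not the thesis's own.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 5000

# Hypothetical tract-level panel: poverty rate in 1996 and distance
# (km) from each tract centroid to the nearest mine.
tracts = pd.DataFrame({
    "dist_to_mine_km": rng.uniform(0, 100, n),
    "poverty_1996": rng.uniform(0.2, 0.8, n),
})
near = tracts["dist_to_mine_km"] < 10  # exposure cutoff is an assumption
# Synthetic outcome: mining proximity alleviates poverty slightly.
tracts["poverty_2011"] = (tracts["poverty_1996"]
                          - 0.05 * near
                          + rng.normal(0, 0.05, n)).clip(0, 1)

change = tracts["poverty_2011"] - tracts["poverty_1996"]
did = change[near].mean() - change[~near].mean()
print(f"Poverty change, near vs. far from mines: {did:.3f}")
```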
2.
Critical Events, Commitment, and the Probability of Civil War. Daxecker, Ursula E. 07 August 2008
This dissertation investigates how political instability is related to the probability of civil war. According to the literature in comparative politics, regime breakdown is caused by critical events such as economic decline, defeat in interstate war, death of a leader in office, or changes in the international balance of power. Drawing on Powell (2004, 2006), I conceptualize such critical events as shifts in the domestic distribution of power that can lead to a bargaining breakdown and, in consequence, military conflict. Following a shock to government capabilities, current leaders and the opposition bargain over shares of authority. The government has incentives to grant concessions to other groups within the state, yet such promises are not credible given that the leadership may regain its strength. Similarly, opposition groups lack the ability to make credible commitments, as they expect to be more powerful in the future. Both the government and opposition groups could benefit from striking bargains, but cannot credibly commit because of incentives to renege on agreements in the future. Unable to commit, both actors may use force to achieve their preferred outcome. The dissertation then shifts the focus to solutions to such commitment problems. I expect that (1) the institutional structure of government and opposition groups and (2) the distance between groups have important consequences for the range of feasible agreements during this bargaining process. The arguments are tested in a statistical study of all countries for the 1960-2004 time period and in a small-sample analysis of democratization processes in Algeria and Chile. Findings show that critical events increase the probability of civil war as hypothesized, and the empirical evidence also provides strong support for the proposed solutions to the commitment problem.
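The commitment logic can be stated as a small worked model. The following is an illustrative two-period rendering under my own notation and normalizations, not the dissertation's:

```latex
% Illustrative two-period rendering of the commitment problem
% (notation and normalizations are mine, not the dissertation's).
% A pie of size 1 is divided each period; \delta is the discount
% factor; the opposition wins a war begun in period t with
% probability p_t, where p_2 > p_1 after a shock to government
% capabilities; war costs each side c, is game-ending, and the winner
% takes both periods' pies (interior values of p_2 - c assumed).
% In period 2 the government concedes the opposition its war value
% p_2 - c, so the opposition accepts a period-1 share x whenever
\[
  x + \delta\,(p_2 - c) \;\ge\; p_1(1+\delta) - c .
\]
% Because transfers are bounded by the current pie (0 \le x \le 1),
% the declining government prefers preventive war exactly when
\[
  \delta\,(p_2 - p_1) \;>\; p_1 + (1+\delta)\,c ,
\]
% i.e., when the anticipated shift in power exceeds everything that
% can credibly be conceded today plus the surplus that war destroys.
```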
3.
Credible Compilation. Rinard, Martin C. 01 1900
This paper presents an approach to compiler correctness in which the compiler generates a proof that the transformed program correctly implements the input program. A simple proof checker can then verify that the program was compiled correctly. We call a compiler that produces such proofs a credible compiler, because it produces verifiable evidence that it is operating correctly. / Singapore-MIT Alliance (SMA)
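As a toy illustration of the idea (hypothetical names, not the paper's formalism): an optimizer that logs each rewrite it performs, paired with a small independent checker that re-validates every step, so trust rests on the checker rather than on the compiler.

```python
# Illustrative sketch only: a toy "credible compiler" in the paper's
# spirit. The optimizer constant-folds arithmetic expressions and
# emits a certificate (the rewrite steps it took); the checker
# re-validates every step independently.
Expr = tuple  # ("add", e1, e2) | ("mul", e1, e2) | ("const", n) | ("var", s)

def simplify(e: Expr, cert: list) -> Expr:
    """Constant-fold bottom-up, logging each rewrite in `cert`."""
    if e[0] in ("add", "mul"):
        a, b = simplify(e[1], cert), simplify(e[2], cert)
        if a[0] == b[0] == "const":
            n = a[1] + b[1] if e[0] == "add" else a[1] * b[1]
            new = ("const", n)
            cert.append(((e[0], a, b), new))  # record: old -> new
            return new
        return (e[0], a, b)
    return e

def check_step(old: Expr, new: Expr) -> bool:
    """A logged rewrite is valid iff it folds two constants correctly."""
    op, (ka, va), (kb, vb) = old
    if ka != "const" or kb != "const" or new[0] != "const":
        return False
    return new[1] == (va + vb if op == "add" else va * vb)

expr = ("add", ("const", 2), ("mul", ("const", 3), ("const", 4)))
cert: list = []
out = simplify(expr, cert)
print(out)                                      # ('const', 14)
print(all(check_step(o, n) for o, n in cert))   # True: compilation checked
```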
4.
The Effects of Third Party Observation on Credible and Non-credible Cognitive Performance: A Simulation Study. Reese, Caitlin S. January 2011
No description available.
5.
Sensitivity Analyses for Tumor Growth Models. Mendis, Ruchini Dilinika. 01 April 2019
This study presents a sensitivity analysis for two previously developed tumor growth models: the Gompertz model and the quotient model. The two models are considered in both continuous and discrete time. In continuous time, model parameters are estimated using the least-squares method, while in discrete time the partial-sum method is used. Moreover, frequentist and Bayesian methods are used to construct confidence intervals and credible intervals for the model parameters. We apply Markov Chain Monte Carlo (MCMC) techniques, using the Random Walk Metropolis algorithm with a non-informative prior and the Delayed Rejection Adaptive Metropolis (DRAM) algorithm, to construct the parameters' posterior distributions and then obtain credible intervals.
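A minimal sketch of the Bayesian half of this pipeline, assuming one common parameterization of the Gompertz curve and synthetic data (the thesis's data, priors, and the DRAM refinement are not reproduced):

```python
# Illustrative sketch only: random-walk Metropolis with a flat
# (non-informative) prior for Gompertz-model parameters, then 95%
# credible intervals from posterior quantiles.
import numpy as np

rng = np.random.default_rng(1)

def gompertz(t, v0, k, b):
    """V(t) = K * exp(log(V0/K) * exp(-b t)), one common Gompertz form."""
    return k * np.exp(np.log(v0 / k) * np.exp(-b * t))

# Synthetic tumor-volume observations around a known truth (V0 fixed
# at 1.0 for simplicity; only K and b are sampled).
t = np.linspace(0, 10, 25)
y = gompertz(t, 1.0, 50.0, 0.4) + rng.normal(0, 1.0, t.size)

def log_post(theta, sigma=1.0):
    k, b = theta
    if k <= 1.0 or b <= 0:          # flat prior on a plausible region
        return -np.inf
    resid = y - gompertz(t, 1.0, k, b)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis.
chain = np.empty((20000, 2))
theta, lp = np.array([30.0, 0.2]), -np.inf
for i in range(chain.shape[0]):
    prop = theta + rng.normal(0, [0.5, 0.01])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain[i] = theta

post = chain[5000:]                  # discard burn-in
for name, col in zip(["K", "b"], post.T):
    lo, hi = np.quantile(col, [0.025, 0.975])
    print(f"{name}: 95% credible interval [{lo:.2f}, {hi:.2f}]")
```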
6.
Protecting economic reform by seeking membership in liberal international organizations. Steen-Sprang, Louise Marie. 16 October 2003
No description available.
7.
Using historical data in the Regional Analysis of extreme coastal events: the FAB method. Frau, Roberto. 13 November 2018
The protection of coastal areas against the risk of flooding is necessary to safeguard all types of waterside structures and, in particular, nuclear power plants. Flood prevention is ensured by coastal protections that are commonly designed and verified using the concept of the return level of a particular extreme event. Return levels linked to very high return periods (1,000 years or more) are estimated through statistical methods based on Extreme Value Theory (EVT). These statistical approaches are applied to time series of an observed extreme variable and enable the computation of its occurrence probability. In the past, return levels of extreme coastal events were frequently estimated by applying statistical methods to series of local observations. Local series of sea levels, however, typically cover too short a period (about 50 years) to yield reliable estimates for high return periods. For this reason, several approaches are used to enlarge extreme data samples and to reduce the uncertainty of the estimates. Currently, one of the most widely used methods in coastal engineering is Regional Analysis, identified by Weiss (2014) as an effective means of reducing uncertainty in the estimation of extreme events. The main idea of this method is to take advantage of the wide spatial availability of observed data at different locations in order to form homogeneous regions. This enables statistical distributions to be fitted to enlarged regional samples that pool all the extreme events that struck one or more sites in the region. Recent investigations have highlighted the importance of using past events when estimating extremes: when historical data are available, they cannot be neglected. Historical data are collected from different sources and are identified as data that do not come from time series; in most cases, no information is available about other extreme events occurring before and after a historical observation. This, together with the particular nature of each historical record, prevents their use in a classical Regional Analysis. A statistical methodology enabling the use of historical data in a regional context is therefore needed to estimate reliable return levels and to reduce the associated uncertainties. This manuscript develops such a method, called FAB, which performs a Regional Analysis using historical data. The method is formulated for POT (Peaks Over Threshold) data. It is based on a new definition of the local and regional observation period (the "credible duration") and can take into account the three typical kinds of historical data (exact values, ranges, and values above a lower bound). In addition, an approach for identifying an optimal sampling threshold is defined, which yields better estimates by selecting the optimal extreme data sample for the FAB method. FAB is a flexible approach that enables the estimation of return levels in both frequentist and Bayesian frameworks. The method is applied to a database of recorded skew surges (systematic data) and to 14 historical skew surges recovered from sites located on the French, British, Belgian, and Spanish coasts of the Atlantic Ocean, the English Channel, and the North Sea. Frequentist and Bayesian estimates of skew surges are computed for each homogeneous region and for every site. Finally, the manuscript examines the issues surrounding the discovery and validation of historical data.
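For orientation, a generic regional POT computation is sketched below. It is not the FAB method itself, but it shows the role of the pooled observation duration, the quantity that FAB's credible duration generalizes when historical data enter the sample. All numbers are synthetic:

```python
# Illustrative sketch only: pooled peaks-over-threshold (POT) fit of a
# Generalized Pareto distribution across three sites, then a return
# level from the pooled exceedance rate. A historical record would
# extend `total_years`, which is what a credible duration formalizes.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)

threshold = 0.8                       # sampling threshold, in metres
durations = [48.0, 52.0, 45.0]        # years observed at each site
# Synthetic skew-surge exceedances above the threshold at three sites.
sites = [threshold + genpareto.rvs(0.1, scale=0.15,
                                   size=rng.poisson(2 * d),
                                   random_state=rng) for d in durations]

pooled = np.concatenate(sites)
total_years = sum(durations)
rate = pooled.size / total_years      # exceedances per year

# Fit the GPD to the pooled excesses over the threshold.
shape, _, scale = genpareto.fit(pooled - threshold, floc=0)

T = 1000.0                            # target return period, years
level = threshold + scale / shape * ((rate * T) ** shape - 1)
print(f"Estimated {T:.0f}-year skew-surge level: {level:.2f} m")
```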
8.
Learning from Incredible Commitments: Evolution and Impact of Bilateral Investment Treaties. Minhas, Shahryar Farooq. January 2016
Ostensibly, BITs are the ideal international treaty. First, until just recently, they almost uniformly came with explicit dispute resolution mechanisms through which countries could face real costs for violation (Montt 2009). Second, the signing, ratification, and violation of them are easily accessible public knowledge. Thus countries presumably would face reputational costs for violating these agreements. Yet, these compliance devices have not dissuaded states from violating these agreements. Even more interestingly, in recent years, both developed and developing countries have moved towards modifying the investor-friendly provisions of these agreements. These deviations from the expectations of the credible commitment argument raise important questions about the field's assumptions regarding the ability of international treaties with commitment devices to effectively constrain state behavior. / Dissertation
9.
Two Essays in Economics. Shevyakhova, Elizaveta. January 2009
Thesis advisor: Arthur Lewbel / The thesis includes two essays. The first essay, Inequality Moments in Estimation of Discrete Games with Incomplete Information and Multiple Equilibria, develops a method for estimation of static discrete games with incomplete information that delivers consistent parameter estimates even when games have multiple equilibria. Every Bayes-Nash equilibrium in a discrete game of incomplete information is associated with a set of choice probabilities. I use maximum and minimum equilibrium choice probabilities as upper and lower bounds on empirical choice probabilities to construct moment inequalities. In general, estimation with moment inequalities results in partial identification. I show that point identification is achievable if the payoffs are functions of a sufficient number of explanatory variables with real-line domains and outcome-specific coefficients associated with them. The second essay, Tenancy Rent Control and Credible Commitment in Maintenance, co-authored with Richard Arnott, investigates the effect of tenancy rent control on maintenance and welfare. Under tenancy rent control, rents are regulated within a tenancy but not between tenancies. The essay analyzes the effects of tenancy rent control on housing quality, maintenance, and rehabilitation. Since the discounted revenue received over a fixed-duration tenancy depends only on the starting rent, the landlord intuitively has an incentive to spruce up the unit between tenancies in order to show it well, but little incentive to maintain the unit well during the tenancy. The essay formalizes this intuition and presents numerical examples illustrating the efficiency loss from this effect. / Thesis (PhD) — Boston College, 2009. / Submitted to: Boston College. Graduate School of Arts and Sciences. / Discipline: Economics.
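In reduced form, the bounding idea of the first essay can be written as follows (notation mine, not the essay's):

```latex
% Reduced-form statement of the bounding idea (notation illustrative,
% not the essay's own). With E(x) the set of Bayes-Nash equilibria at
% covariates x and \sigma_e(x;\theta) the equilibrium choice
% probability of a given action under equilibrium e, the observed
% choice probability \hat{P}(x) must satisfy
\[
  \min_{e \in E(x)} \sigma_e(x;\theta)
  \;\le\; \hat{P}(x)
  \;\le\; \max_{e \in E(x)} \sigma_e(x;\theta),
\]
% and estimation retains the parameter values \theta for which these
% moment inequalities hold, which in general yields a set (partial
% identification) rather than a point.
```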
10.
Bayesian modeling of neuropsychological test scores. Du, Mengtian. 06 October 2021
In this dissertation we propose novel Bayesian methods for analyzing patterns of neuropsychological testing. We first focus on situations in which the goal of the analysis is to discover risk factors of cognitive decline using longitudinal assessments of test scores. Variable selection in the Bayesian setting is still challenging, particularly for the analysis of longitudinal data. We propose a novel approach to selection of the fixed effects in mixed-effects models that combines a backward selection algorithm with a metric based on the posterior credible intervals of the model parameters. The heuristic of this approach is to search for those parameters that are most likely to be different from zero according to their posterior credible intervals, without requiring ad hoc approximations of model parameters or informative prior distributions. We show via a simulation study that this approach produces more parsimonious models than other popular criteria such as the Bayesian deviance information criterion. We then apply this approach to test the hypothesis that genotypes of the APOE gene have different effects on the rate of cognitive decline of participants in the Long Life Family Study. In the second part of the dissertation we shift focus to the analysis of neuropsychological tests administered using emerging digital technologies. The challenge in analyzing these data is that, for each study participant, the test is a data stream recording the time and spatial coordinates of the digitally executed test, and the goal is to extract useful and informative univariate summary variables for analysis. Toward this goal, we propose a novel application of Bayesian Hidden Markov Models to analyze digitally recorded Trail Making Tests. Applying the Hidden Markov Model enables us to perform automatic segmentation of the digital data stream and to extract meaningful metrics that relate Trail Making Test performance to other cognitive and physical function test scores. We show that the extracted metrics provide information beyond the traditionally used scores.
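A minimal sketch of the credible-interval-based backward selection described above, with a plain Bayesian linear model standing in for the mixed-effects model and hypothetical variable names:

```python
# Illustrative sketch only: backward elimination of fixed effects
# using 95% posterior credible intervals. The mixed-effects model is
# replaced by a conjugate Bayesian linear regression on synthetic
# data; predictor names and thresholds are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n, names = 200, ["age", "apoe_e4", "educ", "noise1", "noise2"]
X = rng.normal(size=(n, len(names)))
beta_true = np.array([0.5, -0.8, 0.3, 0.0, 0.0])
y = X @ beta_true + rng.normal(0, 1.0, n)

def posterior_samples(X, y, n_draws=4000, tau=10.0):
    """Conjugate normal posterior for beta (known unit noise variance)."""
    prec = X.T @ X + np.eye(X.shape[1]) / tau**2
    cov = np.linalg.inv(prec)
    mean = cov @ X.T @ y
    return rng.multivariate_normal(mean, cov, size=n_draws)

active = list(range(len(names)))
while True:
    draws = posterior_samples(X[:, active], y)
    lo, hi = np.quantile(draws, [0.025, 0.975], axis=0)
    # Drop the predictor whose credible interval most comfortably
    # straddles zero, if any interval contains zero at all.
    contains0 = (lo < 0) & (hi > 0)
    if not contains0.any():
        break
    worst = np.argmax(np.where(contains0, np.minimum(hi, -lo), -np.inf))
    active.pop(worst)
print("Selected predictors:", [names[i] for i in active])
```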