51. Bivariate extreme value analysis of commodity prices (Joyce, Matthew, 21 April 2017)
The crude oil, natural gas, and electricity markets are among the most widely traded and discussed commodity markets in the world. Over the past two decades each commodity has seen price volatility driven by political, economic, social, and technological factors. With that volatility comes significant risk that both corporations and governments must account for to ensure expected cash flows and to minimize losses. This thesis analyzes the portfolio risk of the major US commodity hubs for crude oil, natural gas and electricity by applying Extreme Value Theory to historical daily price returns between 2003 and 2013. The risk measures used are Value-at-Risk and Expected Shortfall, estimated by fitting the Generalized Pareto Distribution to the data using the peaks-over-threshold method. We consider both the univariate and bivariate cases in order to determine the effects that price shocks within and across commodities have in a mixed portfolio. The results show that electricity is the most volatile, and therefore the riskiest, of the three commodities considered, for both positive and negative returns. In addition, we find that the univariate and bivariate results are statistically indistinguishable, leading to the conclusion that, for the three markets analyzed during this period, price shocks in one commodity do not directly impact the volatility of another commodity's price.
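As a rough illustration of the peaks-over-threshold machinery described in this abstract, the sketch below fits a Generalized Pareto Distribution to loss exceedances over a high threshold and derives Value-at-Risk and Expected Shortfall from the fitted tail. It is a minimal sketch on simulated returns, not the thesis's code or data; the 95th-percentile threshold and the 99% confidence level are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=2500) * 0.02   # simulated daily returns (placeholder for real price data)
losses = -returns                                   # work with the loss tail

# Peaks-over-threshold: keep exceedances above a high empirical quantile
u = np.quantile(losses, 0.95)                       # threshold choice is an illustrative assumption
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)    # fit GPD with location fixed at 0

# Tail-based VaR and ES at confidence level q (standard POT formulas)
n, n_u, q = len(losses), len(exceedances), 0.99
var_q = u + (beta / xi) * ((n / n_u * (1 - q)) ** (-xi) - 1)
es_q = (var_q + beta - xi * u) / (1 - xi)           # valid for xi < 1

print(f"VaR(99%): {var_q:.4f}, ES(99%): {es_q:.4f}")
```

The same fit would be run separately on the positive and negative return tails, and on each commodity, to reproduce the kind of univariate comparison the abstract describes.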
52. How Low Can You Go? : Quantitative Risk Measures in Commodity Markets (Forsgren, Johan, January 2016)
The volatility model approach to forecasting Value at Risk is complemented with modelling of Expected Shortfall using an extreme value approach. Using three models from the GARCH family (GARCH, EGARCH and GJR-GARCH) and assuming two conditional distributions, the normal (Gaussian) and Student's t distribution, to make predictions of VaR, the forecasts are used as a threshold for assigning losses to the distribution tail. The Expected Shortfalls are estimated assuming that the violations of VaR follow the Generalized Pareto distribution, and the estimates are evaluated. The results indicate that the most efficient model for making predictions of VaR is the asymmetric GJR-GARCH, and that assuming the t distribution generates conservative forecasts. In conclusion, there is evidence that the commodities are characterized by asymmetry and conditional normality. Since no comparison is made, the EVT approach cannot be deemed either superior or inferior to standard approaches to Expected Shortfall modeling, although the data intensity of the method suggests that a standard approach may be preferable.
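To make the volatility-model step concrete, the following sketch fits a GJR-GARCH(1,1) model with a Student's t conditional distribution and converts the one-step-ahead forecast into a 99% VaR, which could then serve as the threshold for the extreme-value stage. The Python arch package, the simulated data and all parameter settings are assumptions for illustration; the thesis does not name its software.

```python
import numpy as np
from scipy.stats import t
from arch import arch_model  # assumed package choice

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=2000)          # placeholder daily returns, in percent

# GJR-GARCH(1,1) with a Student's t conditional distribution (o=1 adds the asymmetry term)
am = arch_model(returns, vol='GARCH', p=1, o=1, q=1, dist='t')
res = am.fit(disp='off')

# One-step-ahead volatility forecast and a 99% VaR from the standardized t quantile
fc = res.forecast(horizon=1)
sigma = np.sqrt(fc.variance.values[-1, 0])
mu = fc.mean.values[-1, 0]
nu = res.params['nu']
q = t.ppf(0.01, nu) * np.sqrt((nu - 2) / nu)       # quantile of the unit-variance Student's t
var_99 = -(mu + sigma * q)                         # loss expressed as a positive number

print(f"1-day 99% VaR: {var_99:.3f}% of portfolio value")
```

The o=1 term is what distinguishes GJR-GARCH from plain GARCH: it lets negative shocks raise the conditional variance more than positive ones, which is the asymmetry the abstract refers to.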
53. Order-statistics-based inferences for censored lifetime data and financial risk analysis (Sheng, Zhuo, January 2013)
This thesis focuses on applying order-statistics-based inferences to lifetime analysis and financial risk measurement. The first problem arises from fitting the Weibull distribution to progressively censored and accelerated life-test data. A new order-statistics-based inference is proposed for both parameter and confidence interval estimation. The second problem can be summarised as adopting the inference used in the first problem for fitting the generalised Pareto distribution, especially when the sample size is small. With some modifications, the proposed inference is compared with classical methods and with several relatively new methods that have emerged from the recent literature. The third problem studies a distribution-free approach for forecasting financial volatility, which is essentially the standard deviation of financial returns. Classical models of this approach use the interval between two symmetric extreme quantiles of the return distribution as a proxy for volatility. Two new models are proposed, which use intervals of expected shortfalls and of expectiles instead of intervals of quantiles. The different models are compared on empirical stock index data. Finally, attention is drawn to heteroskedastic quantile regression. The proposed joint modelling approach, which makes use of the parametric link between quantile regression and the asymmetric Laplace distribution, can provide estimates of the regression quantile and of the log-linear heteroskedastic scale simultaneously. Furthermore, the use of the expectation of the check function as a measure of quantile deviation is discussed.
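The distribution-free volatility proxies described for the third problem can be illustrated with a short sketch: the interval between two symmetric quantiles, between the two tail expected shortfalls, and between two symmetric expectiles, each computed on a simulated return series. The tail level, the simulated data and the fixed-point expectile routine are assumptions made for illustration only, not the models proposed in the thesis.

```python
import numpy as np

def expectile(x, tau, iters=100):
    """Sample expectile via the fixed-point form of the asymmetric least squares condition."""
    m = x.mean()
    for _ in range(iters):
        w = np.where(x > m, tau, 1.0 - tau)   # asymmetric weights around the current estimate
        m = np.sum(w * x) / np.sum(w)
    return m

def tail_es(x, alpha):
    """Empirical expected shortfall of the lower and upper alpha tails."""
    lo, hi = np.quantile(x, alpha), np.quantile(x, 1 - alpha)
    return x[x <= lo].mean(), x[x >= hi].mean()

rng = np.random.default_rng(2)
returns = rng.standard_t(df=5, size=3000) * 0.01   # placeholder daily returns

alpha = 0.025  # illustrative tail level
q_interval = np.quantile(returns, 1 - alpha) - np.quantile(returns, alpha)
es_lo, es_hi = tail_es(returns, alpha)
es_interval = es_hi - es_lo
e_interval = expectile(returns, 1 - alpha) - expectile(returns, alpha)

# Each interval is a distribution-free proxy of the return standard deviation up to a scaling constant.
print(q_interval, es_interval, e_interval)
```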
54. What to do when there isn't enough : the fair distribution of scarce goods (Vong, Gerard, January 2012)
My DPhil submission consists of a series of papers on related topics in the moral philosophy of scarce benefit distribution. It focuses on two types of scarce benefit distribution case. The first type occurs when all potential beneficiaries of a good each have an equally strong moral claim to an equal benefit from the resource, but scarcity or indivisibility prevents us from benefiting them all. Call these cases equal conflict cases. In 'Anti-Majoritarianism', I argue against the view, defended by both utilitarians and non-utilitarians, that in equal conflict cases you always ought to give the benefit to as many people as possible. I argue that doing so is neither morally right nor fair. In 'Weighing Up Weighted Lotteries', I argue that the philosophical debate between unweighted and weighted lottery benefit distribution procedures has been misconceived and that fairness requires us to use a new kind of weighted lottery that I call the exclusive composition-sensitive lottery. In 'Can't Get No Satisfaction', I defend a new view, which I call the dual-structure view, about how lotteries satisfy potential beneficiaries' claims in equal conflict cases, and highlight the implications of that view for the distribution of donor corneas to those who have suffered corneal degeneration. The second type of distributional problem occurs when we can either benefit a very large number of potential beneficiaries with a very small benefit (call these the many) or a very small number of potential beneficiaries with a very large benefit (call these the few). In 'Valuing the Few Over the Many' I argue that there are cases where not only ought we to benefit the few over the many no matter how numerous the many are, but it is also better to do so. This conclusion, however, can be shown to conflict with a number of widely held tenets of value theory. I evaluate different ways of accommodating these intuitions and argue that in some contexts benefits are not of finite value. The view I defend in 'Valuing the Few Over the Many', combined with some intuitively plausible axiological claims, is inconsistent with the transitivity of the 'better than' relation. In 'Making Betterness Behave' I argue for what I call the conditional non-coextensive thesis: if 'better than' is not transitive, one ought to take the position that 'more reason to bring about rather than' is transitive. I argue that one can generate a transitive 'more reason to bring about rather than' deontic ordering from a non-transitive axiological ordering in a principled way. This deontic ordering avoids the major practical objections (money pumps, moral dilemmas and threats to practical reasoning) to non-transitivity of the 'better than' relation.
55. DEFINING VALUE BASED INFORMATION SECURITY GOVERNANCE OBJECTIVES (Mishra, Sushma, 09 December 2008)
This research argues that information security governance objectives should be grounded in the values of organizational members. The research literature in the decision sciences suggests that individual values play an important role in developing decision objectives. Information security governance objectives based on the values of stakeholders are essential for a comprehensive security control program. The study uses Value Theory as a theoretical basis and value-focused thinking as a methodology to develop 23 objectives for information security governance (ISG). A case study was conducted to reexamine and interpret the significance of the proposed objectives in an organizational context. The results suggest three emergent dimensions of information security governance for an effective control structure in organizations: resource allocation, user involvement and process integrity. The synthesis of the data suggests eight principles of information security governance that guide organizations in achieving a comprehensive security environment. We also present a means-end model of ISG which proposes the interrelationships of the developed objectives. Contributions are noted and future research directions suggested.
56. Measuring Extremes: Empirical Application on European Markets (Öztürk, Durmuş, January 2015)
This study employs Extreme Value Theory and several univariate methods to compare their Value-at-Risk and Expected Shortfall predictive performance. We conduct several out-of-sample backtesting procedures, such as the unconditional coverage, independence and conditional coverage tests. The dataset includes five different stock markets: PX50 (Prague, Czech Republic), BIST100 (Istanbul, Turkey), ATHEX (Athens, Greece), PSI20 (Lisbon, Portugal) and IBEX35 (Madrid, Spain). These markets have different financial histories, and the data span over twenty years. We analyze the global financial crisis period separately to inspect the performance of these methods during the high-volatility period. Our results support the common finding that Extreme Value Theory is one of the most appropriate risk measurement tools. In addition, we find that the GARCH family of methods, after accounting for asymmetry and the fat-tail phenomenon, can be equally useful and sometimes even better than the Extreme Value Theory based method in terms of risk estimation. Keywords: Extreme Value Theory, Value-at-Risk, Expected Shortfall, Out-of-Sample Backtesting. Author's e-mail: ozturkdurmus@windowslive.com. Supervisor's e-mail: ies.avdulaj@gmail.com.
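As one concrete piece of the backtesting machinery mentioned above, the sketch below implements the Kupiec proportion-of-failures test for unconditional coverage on a hypothetical sequence of VaR violations; the independence and conditional coverage tests are built analogously from the same violation sequence. This is an illustrative implementation under assumed numbers, not the code or results of the thesis.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, p):
    """Kupiec proportion-of-failures (unconditional coverage) test.
    violations: boolean array, True where the realized loss exceeded the VaR forecast.
    p: the VaR tail probability (e.g. 0.01 for a 99% VaR)."""
    n = len(violations)
    x = int(np.sum(violations))
    pi_hat = x / n
    # Log-likelihood under the null (violation rate = p) and under the observed rate
    ll_null = (n - x) * np.log(1 - p) + x * np.log(p)
    ll_alt = (n - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)
    lr = -2 * (ll_null - ll_alt)
    return lr, chi2.sf(lr, df=1)   # p-value from a chi-squared(1) distribution

# Hypothetical backtest: 1000 out-of-sample days with 14 violations of a 99% VaR
rng = np.random.default_rng(3)
hits = np.zeros(1000, dtype=bool)
hits[rng.choice(1000, size=14, replace=False)] = True
lr, pval = kupiec_pof(hits, p=0.01)
print(f"LR_uc = {lr:.3f}, p-value = {pval:.3f}")
```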
57. Motivation i matematik ur ett lärarperspektiv åk 5-6 [Motivation in mathematics from a teacher perspective, grades 5-6] (Ehrling, Fredrik, January 2019)
Mathematics is one of the core subjects of compulsory school, and mathematical knowledge is essential for managing everyday life in modern society. Pupils differ in how motivated they are to learn mathematics, and teachers have different ways of working to motivate them. The study aims to shed light on how teachers view motivation in the subject of mathematics and what teachers do to motivate their pupils and thereby get them to deepen their subject knowledge. Expectancy-value theory divides motivation into intrinsic value, utility value, personal value and cost. With that theory as a foundation, I interviewed teachers using a qualitative method. The analysis of the material shows that it is important for the teacher to have confidence in the pupils and a realistic expectation of what the pupils already know in mathematics. Utility value is emphasized the most and is often linked to the curriculum's purpose for the mathematics subject.
58. What motivates A-level students to achieve? : the role of expectations and values (Brown, Carol, January 2016)
Eccles' expectancy-value model of achievement motivation suggests that beliefs about ability and expectations for success are a strong predictor of grades, and that differences in task value underlie differences in motivation and achievement. This model has not previously been investigated in the context of high-stakes examinations in the UK, and this study therefore explores the relationships between expectations, values and A-level achievement in 930 students. This is important given the significance of these qualifications for future life pathways. Furthermore, studies examining subjective task value (STV) patterns across school subjects, rather than domain-specific ones, are rare, highlighting the additional importance of this work. A mixed-methods design was used. A questionnaire collected information on students' backgrounds (SES, gender, ethnicity), the expectations and STV attached to A-levels, and their future and general life expectations and values. Some of these relationships were also explored using 20 semi-structured interviews. The qualitative data illustrated that studying A-levels confirmed aspects of students' identity but also facilitated changes to their goals and academic skills, having positive effects, contrary to the argument that high-stakes assessment has a negative impact on individuals. Unsurprisingly, parents and teachers were perceived to be influential. As predicted, expectations and values were related to A-level achievement. As there is a lack of research into the effects of these variables on A-level outcomes, these findings are valuable. Eccles' original three-factor model of STV could not, however, be supported; in this research the utility construct was removed. Further exploration of the STV construct is warranted. Socio-economic status was positively related to both achievement and expectations about achievement. Girls had lower expectations but placed higher value on their A-levels; there were, however, no gender differences in achievement. Employing the expectancy-value model in this UK context has been useful in explaining the motivational patterns underlying A-level qualifications, and the findings have implications for enhancing outcomes and narrowing educational gaps in this student population.
59. FACTORS INFLUENCING MUSIC THERAPY CAREER CHOICE IN THE UNITED STATES: A STUDY OF STUDENT VALUES AND EXPECTANCIES (Scheppmann, Margaret R., 01 January 2019)
Understanding why students decide to become music therapists is valuable information for music therapy educators and policy makers, yet published information is lacking. Expectancy-value theory provides a framework for understanding student choices: researchers can better understand why students pursue a career in music therapy by examining students' abilities, beliefs, expectancies, and values related to the choice. The purpose of this study was to examine why current undergraduate and equivalency students want to be music therapists. Music therapy students (N = 129) throughout the United States provided insight into their decision to become music therapists by completing a survey with questions about their expectancies and values. Results indicated that both undergraduate and equivalency students tended to choose to be music therapists because they expected music therapy to be a career that requires hard work and expert knowledge while maintaining their interest and morale. Results of a correlational analysis suggest there are several choice-making variables that may influence each other, indicating that the decision to become a music therapist is a complex process for students. Finally, the results of a Mann-Whitney U test suggested that there was no significant difference between the expectancies and values that influenced undergraduate and equivalency students' choices to become music therapists. Music therapists in many capacities may use this information to improve the recruitment and engagement of music therapy students.
60. Remanufacturing business model experimentation in fashion and textiles : Learnings from a pilot project (Hoehn, Caroline; Herzog, Laetitia Muriel, January 2019)
Adopting a circular system through business model experimentation can generate profit and sustainable growth for fashion firms. Business model experimentation explores novel opportunities to be at the forefront of transforming existing markets. Remanufacturing is one circular strategy; it entails recovering both raw material and value from end-of-life products for the production of new items. Remanufacturing in the context of business model experimentation is a promising solution in the fashion and textile industry to drive the transition into a circular economy. Through case study research, the phenomenon of remanufacturing business model experimentation within the fashion and textile industry is investigated. The case is the Re:workwear project, in which the brand Cheap Monday uses discarded workwear for a remanufactured collection alongside its regular collection. The focus of the study is on remanufacturing business model experimentation alongside the brand's business-as-usual and on the decisive factors of this phenomenon. The processes and steps within the experimentation were analysed through semi-structured interviews with various involved parties in the supply chain. A framework combining the Business Model Canvas and the stepwise approach to business model experimentation by Bocken et al. (2017) is developed and applied throughout the research. It is found that (1) motivation and scope, (2) input material, (3) flexibility, (4) stakeholder collaboration and (5) system development are decisive factors for remanufacturing business model experimentation. Further research is necessary to investigate the phenomenon in other settings and within a variety of other firms in the industry in order to test the findings and validate their generalisability.