251 |
Rewarding Corporate Social Responsibility (CSR) Through CSR Communication: Exploring Spillover Effects in Retailer Private Brands and Loyalty Programs. Hwang, Jiyoung, 17 December 2010.
No description available.
|
252 |
The Performance of Market Risk Models for Value at Risk and Expected Shortfall Backtesting : In the Light of the Fundamental Review of the Trading Book / Bakåttest av VaR och ES i marknadsriskmodeller. Dalne, Katja, January 2017.
The global financial crisis that took off in 2007 gave rise to several adjustments of the risk regulation for banks. An extensive adjustment, to be implemented in 2019, is the Fundamental Review of the Trading Book (FRTB). It proposes to use Expected Shortfall (ES) as the risk measure instead of the currently used Value at Risk (VaR), as well as to apply varying liquidity horizons based on the risk levels of the assets involved. A major difficulty of implementing the FRTB lies in the backtesting of ES. Righi and Ceretta propose a robust ES backtest based on Monte Carlo simulation. It is flexible since it does not assume any probability distribution and can be performed without waiting for an entire backtesting period. Implementing some commonly used VaR backtests as well as the ES backtest by Righi and Ceretta yields a perception of which risk models are the most accurate from both a VaR and an ES backtesting perspective. It can be concluded that a model that is satisfactory from a VaR backtesting perspective does not necessarily remain so from an ES backtesting perspective, and vice versa. Overall, the models that are satisfactory from a VaR backtesting perspective are probably too conservative from an ES backtesting perspective. Considering the confidence levels proposed by the FRTB, from a VaR backtesting perspective a risk measure model with a normal copula and a hybrid distribution, with the generalized Pareto distribution in the tails and the empirical distribution in the center, along with GARCH filtration, is the most accurate one, whereas from an ES backtesting perspective a risk measure model with a univariate Student's t distribution with ν ≈ 7 together with GARCH filtration is the most accurate one for implementation. Thus, when implementing the FRTB, the bank will need to compromise between obtaining a good VaR model, potentially resulting in conservative ES estimates, and obtaining a less satisfactory VaR model, possibly resulting in more accurate ES estimates. The thesis was performed at SAS Institute, an American IT company that develops software for, among other things, risk management. Targeted customers are banks and other financial institutions. Investigating the FRTB acts as a potential advantage for the company when approaching customers that are to implement the regulation framework in the near future. / Risk management, financial time series, Value at Risk, Expected Shortfall, Monte Carlo simulation, GARCH modelling, copulas, hybrid distributions, generalized Pareto distribution, extreme value theory, backtesting, liquidity horizons, Basel regulations
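For concreteness, the ES-preferred specification above (a univariate Student's t on GARCH-filtered returns) admits closed-form one-day VaR and ES at the FRTB's 97.5% confidence level. The Python sketch below is illustrative only: the function name, the ν = 7 default and the volatility forecast are assumptions, not code from the thesis.

```python
import numpy as np
from scipy import stats

def t_var_es(sigma_forecast, nu=7.0, alpha=0.975):
    """One-day VaR and ES (as positive loss numbers) for a zero-mean
    Student-t return scaled by a GARCH volatility forecast.

    Uses the standardized t (unit variance), so quantiles are rescaled
    by sqrt((nu - 2) / nu)."""
    scale = np.sqrt((nu - 2.0) / nu)          # unit-variance standardization
    t_q = stats.t.ppf(1.0 - alpha, df=nu)     # lower-tail quantile (negative)
    var = -sigma_forecast * scale * t_q
    # Closed-form ES for the Student-t tail
    pdf_q = stats.t.pdf(t_q, df=nu)
    es_std = pdf_q * (nu + t_q**2) / ((nu - 1.0) * (1.0 - alpha))
    es = sigma_forecast * scale * es_std
    return var, es

var, es = t_var_es(sigma_forecast=0.012)  # e.g. a 1.2% GARCH vol forecast
print(f"97.5% VaR: {var:.4f}, 97.5% ES: {es:.4f}")
```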
|
253 |
Éthique des populations : une étude des fondements axiologiques propres aux grandes familles utilitaristes. Arveiller, Octave, 07 1900.
In the utilitarian context of population ethics, two positions are naturally opposed about what has value and what we ought to do to benefit the greatest number. Where some intuitions invite us to seek the maximization of total well-being, others stress the importance of doing our best to achieve the highest possible average. This paper aims to address the theoretical debate between these two avenues. It describes and clarifies the functioning and consequences of these positions, in order to provide elements of an answer about the solidity and robustness of the two axiologies. To this end, we assess and respond to the objections presented to them, and shed light on their respective shortcomings. More precisely, this paper focuses on the problems of the Repugnant Conclusion and the Sadistic Conclusion, which have caused much ink to flow, but also on other objections commonly discussed in the literature. We conclude that, contrary to what they may suggest, these are not decisive arguments against these positions. In addition, the paper questions and explains the intuitions that underlie these positions, the biases that threaten them, and also the counter-intuitive consequences that follow.
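The structure of the disagreement can be made concrete with a toy computation (the population sizes and welfare levels below are arbitrary illustrations, not figures from the thesis): a large population of lives barely worth living can dominate on total welfare while losing badly on average welfare, which is the pattern behind the Repugnant Conclusion.

```python
def total_welfare(pop):
    return sum(pop)

def average_welfare(pop):
    return sum(pop) / len(pop)

a = [100] * 10      # 10 people with very high welfare
z = [1] * 2000      # 2000 people with lives barely worth living

print(total_welfare(a), total_welfare(z))      # 1000 vs 2000: totalism prefers z
print(average_welfare(a), average_welfare(z))  # 100.0 vs 1.0: averagism prefers a
```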
|
254 |
Motivation till läsning av skönlitteratur : En kvalitativ studie om hur lärare vill främja motivation till läsning av skönlitteratur samt vilka utmaningar de möter. Kristherzon, Sophia; Nilsson, Johannes, January 2024.
The purpose of this study was to explore which working methods teachers consider to contribute to students' motivation to read fiction, and what challenges teachers face in this work. The study was conducted using a qualitative method with semi-structured interviews with ten informants. The previous research that forms the foundation for the study concerns how fiction and reading are connected, how reading and motivation are connected, and how teachers work with reading in the classroom. The study's theoretical basis rests on the sociocultural theory of Lev Vygotsky and on expectancy-value theory, founded by Jacquelynne Eccles and her colleagues. The results showed that the participating teachers used multiple methods to try to motivate students to read fiction, among them seating placement in the classroom, care in choosing books, the use of external actors, and reading aloud. The challenges teachers face are described by the informants as collaboration with the home, the student's social situation, group size, and the dynamics of the group.
|
255 |
Dynamic portfolio construction and portfolio risk measurement. Mazibas, Murat, January 2011.
The research presented in this thesis addresses different aspects of dynamic portfolio construction and portfolio risk measurement, bringing together research on dynamic portfolio optimization, replicating portfolio construction, dynamic portfolio risk measurement and volatility forecasting. The overall aim of this research is threefold. First, it examines the portfolio construction and risk measurement performance of a broad set of volatility forecast and portfolio optimization models. Second, in an effort to improve their forecast accuracy and portfolio construction performance, it proposes new models or new formulations of the available models. Third, in order to enhance the replication performance of hedge fund returns, it introduces a replication approach that has the potential to be used in numerous applications in investment management.

Chapter 2 addresses risk measurement in dynamic portfolio construction. Further evidence on the use of multivariate conditional volatility models in hedge fund risk measurement and portfolio allocation is provided by using monthly returns of hedge fund strategy indices for the period 1990 to 2009. Building on Giamouridis and Vrontos (2007), a broad set of multivariate GARCH models, as well as the simpler exponentially weighted moving average (EWMA) estimator of RiskMetrics (1996), is considered. It is found that, while multivariate GARCH models provide some improvements in portfolio performance over static models, they are generally dominated by the EWMA model. In particular, in addition to providing a better risk-adjusted performance, the EWMA model leads to dynamic allocation strategies that have a substantially lower turnover and could therefore be expected to involve lower transaction costs. Moreover, these results are shown to be robust across the low-volatility and high-volatility sub-periods.

Chapter 3 addresses optimization in dynamic portfolio construction, examining the advantages of introducing alternative optimization frameworks over the mean-variance framework in constructing hedge fund portfolios for a fund of funds. Using monthly return data of hedge fund strategy indices for the period 1990 to 2011, the standard mean-variance approach is compared with approaches based on CVaR, CDaR and Omega, for both conservative and aggressive hedge fund investors. In order to estimate portfolio CVaR, CDaR and Omega, a semi-parametric approach is proposed, in which the marginal density of each hedge fund index is first modelled using extreme value theory and the joint density of hedge fund index returns is constructed using a copula-based approach; hedge fund returns are then simulated from this joint density in order to compute CVaR, CDaR and Omega. The semi-parametric approach is compared with the standard non-parametric approach, in which the quantiles of the marginal density of portfolio returns are estimated empirically and used to compute CVaR, CDaR and Omega. Two main findings are reported. The first is that CVaR-, CDaR- and Omega-based optimization offers a significant improvement in terms of risk-adjusted portfolio performance over mean-variance optimization. The second is that, for all three risk measures, semi-parametric estimation of the optimal portfolio offers a very significant improvement over non-parametric estimation. The results are robust to the choice of target return and the estimation period.
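For reference, the non-parametric benchmark of Chapter 3 amounts to reading the risk measures off the empirical distribution of portfolio returns. A minimal Python sketch of empirical CVaR and the Omega ratio follows; the function names and the confidence level are illustrative assumptions, not the thesis' code.

```python
import numpy as np

def empirical_cvar(returns, alpha=0.95):
    """Non-parametric CVaR: the average loss beyond the empirical
    alpha-quantile of the loss distribution."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def omega_ratio(returns, threshold=0.0):
    """Omega ratio: expected gains over expected losses relative to a threshold."""
    r = np.asarray(returns) - threshold
    return np.clip(r, 0.0, None).mean() / np.clip(-r, 0.0, None).mean()
```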
Chapter 4 searches for improvements in portfolio risk measurement by addressing volatility forecasting. Two new univariate Markov regime switching models based on intraday range are introduced: a regime switching conditional volatility model is combined with a robust measure of volatility based on intraday range in a framework for volatility forecasting. The chapter proposes a one-factor and a two-factor model that combine useful properties of range, regime switching, nonlinear filtration and GARCH frameworks. Incremental improvement in volatility forecasting performance is sought by employing regime switching in a conditional volatility setting with enhanced information content on true volatility. Weekly S&P 500 index data for 1982-2010 are used. Models are evaluated using a number of volatility proxies that approximate true integrated volatility. The forecast performance of the proposed models is compared to renowned return-based and range-based models, namely the EWMA of RiskMetrics, the hybrid EWMA of Harris and Yilmaz (2009), the GARCH of Bollerslev (1988), the CARR of Chou (2005), the FIGARCH of Baillie et al. (1996) and the MRSGARCH of Klaassen (2002). It is found that the proposed models produce more accurate out-of-sample forecasts, contain more information about true volatility and exhibit similar or better performance when used for value at risk comparison.

Chapter 5 searches for improvements in risk measurement for better dynamic portfolio construction. It proposes multivariate versions of the one- and two-factor MRSACR models introduced in the fourth chapter. In these models, useful properties of regime switching models, nonlinear filtration and the range-based estimator are combined in a multivariate setting, based on static and dynamic correlation estimates. In comparing the out-of-sample forecast performance of these models, eminent return- and range-based volatility models are employed as benchmark models. A hedge fund portfolio construction is conducted in order to investigate the out-of-sample portfolio performance of the proposed models, and the out-of-sample performance of each model is tested using a number of statistical tests. In particular, a broad range of statistical tests and loss functions are utilized in evaluating the forecast performance of the variance-covariance matrix of each portfolio. It is found that, in terms of statistical test results, the proposed models offer significant improvements in forecasting the true volatility process and, in terms of the risk and return criteria employed, perform better than the benchmark models. The proposed models construct hedge fund portfolios with higher risk-adjusted returns and lower tail risks, and offer superior risk-return tradeoffs and better active management ratios. However, in most cases these improvements come at the expense of higher portfolio turnover and rebalancing expenses.
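As background to the range-based models of Chapters 4 and 5, the intraperiod range yields a simple, robust volatility proxy. A hedged sketch of the classical Parkinson (1980) estimator follows; it is only the basic ingredient of the far richer MRSACR models, not the thesis' specification.

```python
import numpy as np

def parkinson_variance(high, low):
    """Parkinson (1980) range-based variance proxy per period:
    sigma_t^2 = (ln(H_t / L_t))^2 / (4 ln 2)."""
    log_range = np.log(np.asarray(high) / np.asarray(low))
    return log_range**2 / (4.0 * np.log(2.0))
```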
Chapter 6 addresses dynamic portfolio construction for better hedge fund return replication and proposes a new approach: a method for hedge fund replication that uses a factor-based model supplemented with a series of risk and return constraints that implicitly target all the moments of the hedge fund return distribution. The approach is used to replicate the monthly returns of ten broad hedge fund strategy indices, using long-only positions in ten equity, bond, foreign exchange and commodity indices, all of which can be traded using liquid, investible instruments such as futures, options and exchange traded funds. In out-of-sample tests, the proposed approach provides an improvement over the pure factor-based model, offering a closer match to both the return performance and the risk characteristics of the hedge fund strategy indices.
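A minimal sketch of the long-only factor replication at the core of Chapter 6's approach is a least-squares fit of hedge fund returns on factor returns under non-negativity constraints; the moment-targeting risk and return constraints of the chapter are omitted, so this is a simplified stand-in rather than the thesis' model.

```python
import numpy as np
from scipy.optimize import nnls

def long_only_replication(fund_returns, factor_returns):
    """Non-negative least-squares weights replicating a hedge fund
    return series with long-only factor positions.

    fund_returns: (T,) array; factor_returns: (T, K) array."""
    weights, residual_norm = nnls(factor_returns, fund_returns)
    return weights, residual_norm
```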
|
256 |
Risks in Commodity and Currency Markets. Bozovic, Milos, 17 April 2009.
This thesis analyzes market risk factors in commodity and currency markets. It focuses on the impact of extreme events on the prices of financial products traded in these markets, and on the overall market risk faced by investors. The first chapter develops a simple two-factor jump-diffusion model for the valuation of contingent claims on commodities in order to investigate the pricing implications of shocks that are exogenous to this market. The second chapter analyzes the nature and pricing implications of abrupt changes in exchange rates, as well as the ability of these changes to explain the shapes of option-implied volatility "smiles". Finally, the third chapter employs the notion that key results of univariate extreme value theory can be applied separately to the principal components of ARMA-GARCH residuals of a multivariate return series. The proposed approach yields more precise Value at Risk forecasts than conventional multivariate methods, while maintaining the same efficiency.
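As background to the first chapter, a basic one-factor jump-diffusion (Merton-style) can be simulated in a few lines; the sketch below and all parameter values are illustrative assumptions, not the thesis' two-factor commodity model.

```python
import numpy as np

def simulate_jump_diffusion(s0=100.0, mu=0.05, sigma=0.2, jump_rate=0.5,
                            jump_mu=-0.05, jump_sigma=0.1,
                            T=1.0, n_steps=252, seed=0):
    """Simulate one price path of a log-price jump-diffusion:
    d ln S = (mu - sigma^2 / 2) dt + sigma dW + J dN,
    with N a Poisson process of intensity jump_rate and
    J ~ Normal(jump_mu, jump_sigma^2)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    log_s = np.empty(n_steps + 1)
    log_s[0] = np.log(s0)
    for t in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        n_jumps = rng.poisson(jump_rate * dt)
        jumps = rng.normal(jump_mu, jump_sigma, size=n_jumps).sum()
        log_s[t + 1] = log_s[t] + (mu - 0.5 * sigma**2) * dt + sigma * dw + jumps
    return np.exp(log_s)
```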
|
257 |
Modelling of extremes. Hitz, Adrien, January 2016.
This work focuses on statistical methods to understand how frequently rare events occur and what the magnitude of extreme values, such as large losses, is. It lies in a field called extreme value analysis, whose scope is to provide support for scientific decision making when extreme observations are of particular importance, such as in environmental applications, insurance and finance. In the univariate case, I propose new techniques to model tails of discrete distributions and illustrate them in an application on word frequency and multiple birth data. Suitably rescaled, the limiting tails of some discrete distributions are shown to converge to a discrete generalized Pareto distribution and a generalized Zipf distribution respectively. In the multivariate high-dimensional case, I suggest modeling tail dependence between random variables by a graph, such that its nodes correspond to the variables and shocks propagate through the edges. Relying on the ideas of graphical models, I prove that if the variables satisfy a new notion called asymptotic conditional independence, then the density of the joint distribution can be simplified and expressed in terms of lower-dimensional functions. This generalizes the Hammersley-Clifford theorem and enables us to infer tail distributions from observations in reduced dimension. As an illustration, extreme river flows are modeled by a tree graphical model whose structure appears to recover almost exactly the actual river network. A fundamental concept when studying limiting tail distributions is regular variation. I propose a new notion in the multivariate case called one-component regular variation, and generalize Karamata's theorem and the representation theorem, two important results in the univariate case. Finally, I turn my attention to website visit data and fit a censored copula Gaussian graphical model, allowing the visualization of users' behavior by a graph.
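For reference, the continuous generalized Pareto limit and the discrete analogue alluded to above can be written as follows (a standard formulation; the exact parametrization used in the thesis may differ):

```latex
% Continuous GPD survival function, shape $\xi$ and scale $\sigma$:
\bar{H}(y) = \Bigl(1 + \xi \tfrac{y}{\sigma}\Bigr)^{-1/\xi}, \qquad y \ge 0.
% A discrete analogue supported on the non-negative integers is obtained
% by evaluating the survival function at integer points:
\Pr(X \ge k) = \Bigl(1 + \xi \tfrac{k}{\sigma}\Bigr)^{-1/\xi}, \qquad k = 0, 1, 2, \dots
```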
|
258 |
Värdering av förvaltningsfastigheter – En kvalitativ studie ur tre perspektiv / Valuation of investment properties – A qualitative study from three perspectives. Lindgren, Carl-Johan; Ivarsson, Jesper, January 2017.
For a long time, it has been recognized that investment properties play a significant role in the world economy, something that became evident during crises such as the global financial crisis of 2008. The value of an investment property is therefore of interest not only for the real estate company in question, but also for its stakeholders. Since 2005, listed companies in the EU must draw up consolidated accounts in accordance with IFRS, which means that valuation is carried out at fair value. In 2013, IFRS 13 was adopted, which by means of a hierarchical valuation model is meant to define the valuation process more clearly. The valuation work takes place either through internal or external valuation, and the review of this process is carried out by auditors. There are no formal requirements for external valuation to be used; it is only encouraged.

The purpose of the study is to highlight the valuation process of investment properties from three perspectives: internal evaluators, external evaluators and auditors. This is meant to provide as representative a picture as possible of the actual valuation process. The study is based on an abductive research approach for which semi-structured interviews were carried out; thus the method of this study is qualitative. Seven interviews were conducted: two with internal evaluators, two with authorized external evaluators and three with authorized auditors. Emphasis has been placed on auditors, as they are believed to possess the best knowledge of IFRS, in order to gain as deep an understanding as possible of the valuation process of investment properties.

The study shows that the translation of the English concept "fair value" into the Swedish concept "verkligt värde" (literally, "true value") is unfortunate and sometimes directly misleading. The reason is that the fair value does not necessarily have to be exact, as the Swedish concept suggests; rather, it is an estimate of the value. For this reason, the concept should rather be called "rimligt värde" (reasonable value) or market value, the term used in the field.

The study also shows that a formal knowledge requirement pervades the industry because the valuation process is complex and difficult to comprehend. Financial reports are rarely satisfactorily exhaustive, resulting in an information asymmetry vis-à-vis stakeholders who do not have, or cannot obtain, the same knowledge.

The valuation of investment properties is made exclusively with Level 3 inputs, the lowest level in the valuation hierarchy of IFRS 13, which is explained by the fact that each property is unique of its kind. Hence, according to the respondents, it is not possible to apply the comparative method, in which the property is valued at a price based on actual transactions in an active market; that method is defined under IFRS 13 as Level 1 inputs and is to be considered the main method. The study finds that fair value measurement involves considerable difficulties and that IFRS 13 is too general for the valuation of investment properties. Despite the negative aspects highlighted in the study, a change of the prevailing regulations is not advocated by the respondents.

This thesis is written in Swedish.
|
259 |
Modeling sea-level rise uncertainties for coastal defence adaptation using belief functions / Utilisation des fonctions de croyance pour la modélisation des incertitudes dans les projections de l'élévation du niveau marin pour l'adaptation côtière. Ben Abdallah, Nadia, 12 March 2014.
Coastal adaptation is an imperative to deal with the elevation of the global sea level caused by ongoing global warming. However, when defining adaptation actions, coastal engineers encounter substantial uncertainties in the assessment of future hazards and risks. These uncertainties may stem from limited knowledge (e.g., about the magnitude of future sea-level rise) or from the natural variability of some quantities (e.g., extreme sea conditions). A proper consideration of these uncertainties is of principal concern for efficient design and adaptation. The objective of this work is to propose a methodology for uncertainty analysis based on the theory of belief functions, an uncertainty formalism that offers greater flexibility for handling both aleatory and epistemic uncertainties than probabilities. In particular, it allows us to represent experts' incomplete knowledge (quantiles, intervals, etc.) more faithfully and to combine multi-source evidence taking into account its dependences and reliabilities. Statistical evidence can be modeled by likelihood-based belief functions, which are simply the translation of some inference principles into evidential terms. By exploiting the mathematical equivalence between belief functions and random intervals, uncertainty can be propagated through models by Monte Carlo simulation. We use this method to quantify uncertainty in projections of the elevation of the global sea level by 2100, derived from physical modelling, expert elicitation and historical sea-level measurements, and evaluate its impact on some coastal risk indicators used in coastal design. Then, within a methodologically oriented case study, we assess the impact of climate change on extreme sea conditions and evaluate the reinforcement of a typical coastal defence asset required to maintain its functional performance.
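The random-interval view of belief functions lends itself to a direct Monte Carlo sketch: sample focal intervals for the uncertain quantity and estimate the belief (resp. plausibility) of an event as the fraction of intervals contained in (resp. intersecting) it. Everything below, the interval-sampling model in particular, is an illustrative assumption, not the thesis' actual sea-level model.

```python
import numpy as np

def belief_plausibility(interval_sampler, event_lo, event_hi, n=100_000, seed=0):
    """Monte Carlo belief and plausibility of the event [event_lo, event_hi]
    under a random-interval (belief function) model.

    interval_sampler(rng) -> (lo, hi): one focal interval."""
    rng = np.random.default_rng(seed)
    contained = intersects = 0
    for _ in range(n):
        lo, hi = interval_sampler(rng)
        if event_lo <= lo and hi <= event_hi:
            contained += 1              # interval implies the event
        if hi >= event_lo and lo <= event_hi:
            intersects += 1             # interval is consistent with the event
    return contained / n, intersects / n

# Illustrative: sea-level rise by 2100 (metres) as a noisy interval
sampler = lambda rng: tuple(np.sort(rng.normal([0.3, 0.9], 0.1)))
bel, pl = belief_plausibility(sampler, 0.5, 2.0)
print(f"Bel = {bel:.3f}, Pl = {pl:.3f}")
```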
|
260 |
Stochastic Modelling of Daily Peak Electricity Demand Using Extreme Value Theory. Boano-Danquah, Jerry, 21 September 2018.
MSc (Statistics) / Department of Statistics / Daily peak electricity data from ESKOM, the South African power utility company, for the period January 1997 to December 2013, consisting of 6209 observations, were used in this dissertation. Since 1994, increased electricity demand has led to sustainability issues in South Africa, and demand continues to rise every day due to a variety of driving factors. Considering this, if the electricity generating capacity in South Africa does not show potential signs of meeting the country's demand in the subsequent years, this may have a significant impact on the national grid, causing it to operate in a risky and vulnerable state and leading to disturbances such as the load shedding experienced during the past few years. In particular, it is of great interest to have sufficient information about the extreme values of the stochastic load process in time, for proper planning, for designing the generation and distribution system, and for storage devices, as these would ensure efficiency in electrical energy supply and maintain discipline in the grid system.

More importantly, electricity is an important commodity used mainly as a source of energy in the industrial, residential and commercial sectors. Effective monitoring of electricity demand is of great importance because demand that exceeds the maximum power generated will lead to power outages and load shedding. It is in the light of this that the study seeks to assess the frequency of occurrence of extreme peak electricity demand, in order to come up with a full electricity demand distribution capable of managing uncertainties in the grid system.

In order to achieve stationarity in the daily peak electricity demand (DPED), we apply a penalized regression cubic smoothing spline to ensure the data is non-linearly detrended. The R package "evmix" is used to estimate the thresholds using the boundary-corrected kernel density plot. The non-linearly detrended datasets were divided into summer, spring, winter and autumn according to the calendar dates in the Southern Hemisphere for frequency analysis. The data is declustered using Ferro and Segers' automatic declustering method, and the cluster maxima are extracted using the R package "evd". We fit a Poisson GPD and a stationary point process to the cluster maxima, and the intensity function of the point process, which measures the frequency of occurrence of the daily peak electricity demand per year, is calculated for each dataset.

The formal goodness-of-fit tests based on the Cramér-von Mises statistic and the Anderson-Darling statistic supported the null hypothesis that each dataset follows a Poisson GPD (σ, ξ) at the 5 percent level of significance. The modelling framework, which is easily extensible to other peak load parameters, is based on the assumption that peak power follows a Poisson process. The parameters of the developed models were estimated using maximum likelihood, and the usual asymptotic properties underlying the Poisson GPD were satisfied by the models. / NRF
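The thesis fits the Poisson GPD in R (packages "evmix" and "evd"); a hedged Python sketch of the same two ingredients, a GPD fit to threshold exceedances and a Poisson rate of exceedances per year, follows. The data, threshold and observation span below are placeholders, not the ESKOM series.

```python
import numpy as np
from scipy.stats import genpareto

def fit_poisson_gpd(cluster_maxima, threshold, years):
    """Fit a GPD to exceedances of `threshold` among declustered cluster
    maxima, and estimate the Poisson exceedance rate per year."""
    exceedances = cluster_maxima[cluster_maxima > threshold] - threshold
    # floc=0: excesses over the threshold start at zero
    shape, loc, scale = genpareto.fit(exceedances, floc=0)
    rate_per_year = len(exceedances) / years
    return shape, scale, rate_per_year

# Illustrative placeholder data (MW), not the actual DPED series
rng = np.random.default_rng(1)
maxima = rng.gumbel(30_000, 1_500, size=400)
xi, sigma, lam = fit_poisson_gpd(maxima, threshold=33_000, years=17.0)
print(f"xi = {xi:.3f}, sigma = {sigma:.1f}, rate = {lam:.2f} exceedances/year")
```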
|