  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

Modelling flood heights of the Limpopo River at Beitbridge Border Post using extreme value distributions

Kajambeu, Robert January 2016 (has links)
MSc (Statistics) / Department of Statistics / Haulage trucks and cross-border traders pass through Beitbridge border post from landlocked countries such as Zimbabwe and Zambia for the sake of trade. Because of global warming, South Africa has lately been experiencing extreme weather patterns in the form of very high temperatures and heavy rainfall. Notably, in 2013 traffic could not cross the Limpopo River because water was flowing above the bridge. For planning, it is important to predict the likelihood of such events occurring in future. Extreme value models offer one way in which this can be achieved. This study identifies suitable distributions to model the annual maximum heights of the Limpopo River at Beitbridge border post. The maximum likelihood method and the Bayesian approach are used for parameter estimation. The r-largest order statistics model was also used in this dissertation. For goodness of fit, probability and quantile-quantile plots are used. Finally, return levels are calculated from these distributions. The dissertation reveals that the 100-year return level is 6.759 metres using both the maximum likelihood and Bayesian approaches to estimate parameters. Empirical results show that the Fréchet class of distributions fits the flood height data at Beitbridge border post well. The dissertation contributes positively by informing stakeholders about the socio-economic impacts brought by extreme flood heights of the Limpopo River at Beitbridge border post.
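The workflow this abstract describes — fit an extreme value distribution to annual maxima, then read off a T-year return level — can be sketched as follows. This is a minimal illustration with synthetic data, not the thesis's code or the Beitbridge gauge records; the sample heights and parameters are invented.

```python
# Sketch: fit a GEV distribution to annual maximum flood heights and compute
# a 100-year return level. Data below are synthetic, for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical annual maxima in metres; real data would come from gauge records.
annual_maxima = rng.gumbel(loc=3.0, scale=0.8, size=60)

# scipy parameterises the GEV as genextreme(c) with c = -xi
# (c < 0 corresponds to the Frechet class mentioned in the abstract).
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# The T-year return level is the quantile exceeded on average once every T years.
T = 100
return_level = stats.genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
print(f"estimated {T}-year return level: {return_level:.3f} m")
```

The same quantile could also be obtained from a Bayesian posterior over the GEV parameters, as the dissertation does; the frequentist fit above is just the shorter sketch.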
242

Modelování kybernetického rizika pomocí kopula funkcí / Cyber risk modelling using copulas

Spišiak, Michal January 2020 (has links)
Cyber risk, or data breach risk, can be estimated similarly to other types of operational risk. First, we identify problems with cyber risk models in the existing literature. A large dataset consisting of 5,713 loss events enables us to apply extreme value theory. We adopt goodness-of-fit tests adjusted for distribution functions with estimated parameters. These tests are often overlooked in the literature even though they are essential for correct results. We model aggregate losses in three different industries separately and then combine them using a copula. A t-test reveals that potential one-year global losses due to data breach risk are larger than the GDP of the Czech Republic. Moreover, one-year global cyber risk measured with a 99% CVaR amounts to 2.5% of global GDP. Unlike other studies, we compare risk measures with other quantities, which allows a wider audience to understand the magnitude of cyber risk. An estimate of global data breach risk is a useful indicator not only for insurers, but also for any organization processing sensitive data.
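The combine-by-copula step described here can be sketched in miniature. The margins and correlation below are assumptions for illustration (the thesis fits its own severity models and copula to the 5,713-event dataset); the sketch only shows the mechanics of joining industry losses through a Gaussian copula and reading off VaR and CVaR.

```python
# Sketch: combine three industries' aggregate losses via a Gaussian copula,
# then compute 99% VaR and CVaR of the total. All parameters are assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000

# Assumed correlation between the three industries' aggregate losses.
corr = np.array([[1.0, 0.3, 0.2],
                 [0.3, 1.0, 0.4],
                 [0.2, 0.4, 1.0]])

# Gaussian copula: correlated normals -> uniforms -> marginal quantiles.
z = rng.multivariate_normal(np.zeros(3), corr, size=n)
u = stats.norm.cdf(z)
margins = [stats.lognorm(s=1.2, scale=50),
           stats.lognorm(s=1.0, scale=30),
           stats.lognorm(s=1.5, scale=80)]
losses = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(margins)])

total = losses.sum(axis=1)
var_99 = np.quantile(total, 0.99)
cvar_99 = total[total >= var_99].mean()   # 99% CVaR (expected shortfall)
print(f"99% VaR: {var_99:.1f}, 99% CVaR: {cvar_99:.1f}")
```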
243

Mortality and Death

Parmer, Berit 19 April 2022 (has links)
The death of a person (an event) and her mortality (a property) are not the same but nevertheless clearly related. Assuming that a person’s death is bad for her, one may ask what this would mean for the evaluation of her mortality. To answer this question, one must distinguish between two different kinds of mortality: Contingent mortality (“being able to die”) is the dispositional property of a living being to die under certain circumstances. The death of the being is the manifestation of this disposition. Necessary mortality on the other hand (“having to die”) is the (meta-)property of a living being to have a limited life-expectancy, that is, to have a disposition to die that is necessarily manifested eventually. The evaluative connection between death and mortality can plausibly be derived from the roles that a person’s contingent and necessary mortality play in the occurrence of her death: It turns out that the disposition to die is an enabling condition for the person’s death. This means that it makes a causally relevant difference in the occurrence of an event that is bad for the person and thereby genuinely and negatively affects her wellbeing. Therefore, her contingent mortality is bad for the person – and this badness is derived from the badness of the event of her death. If this event is not yet settled, the contingent mortality inherits a part of the expected disvalue of the person’s death instead. Her necessary mortality on the other hand inherits part of the general disvalue of the person’s death (understood as a type of event) because it anticipates the occurrence of such an event by guaranteeing that an event of this type must occur. At the same time, the presence of a person’s necessary mortality seems to affect the evaluation of her death (and thereby also her contingent mortality): A person’s death appears less tragic if she is necessarily mortal.
244

Farmers' collective action and agricultural transformation in Ethiopia

Etenesh Bekele Asfaw 08 1900 (has links)
In 2007, rural Ethiopia rolled out a program for the establishment of farmers’ collective action groups known as ‘Farmers’ Development Groups’ (FDGs), based on the presumed common interest of smallholder farmers. Although the government trusts that FDGs bring fast and wide-scale agricultural transformation as part of the participatory agricultural extension system, systematic study and evidence on what motivates smallholder farmers to act collectively, on group dynamics, and on the long-term impact and transformative potential of these agricultural extension groups are scarce. Using expectancy-value theory from social psychology, this study explores what drives smallholders to act collectively; their level of participation and benefits in groups, particularly for women and the youth; and the extent to which farmers’ groups attain the intended agricultural transformation goals of productivity and commercialization. The study collected a mix of qualitative and quantitative data in 2016 through 46 key informant interviews, 8 focus group discussions with farmers, and a survey of 120 randomly selected smallholder farmers (30 percent women) in four sample woredas (districts) of Ethiopia. The findings are drawn through content analysis of the qualitative data and descriptive and correlation analysis of the quantitative data. They show that social identity, not ‘common interest’, motivates smallholder farmers to join and participate in FDGs. The study provides evidence that participation in FDGs enhances smallholder farmers’ adoption and use of agricultural technologies: 96 and 84 percent of the farmers who received extension messages in the group on crop and livestock production, respectively, applied the message. Consequently, by 2015 more than 85 percent of the surveyed farmers reported an increase of over 10 percent in crop and livestock productivity.
Nevertheless, the incremental changes brought by the collective actions are neither transformative nor sustainable. Extension groups contribute little to the commercialization of smallholders: only 20 percent of FDG members participate in output marketing. Moreover, FDGs offer limited collective opportunity to landless youth and married female farmers in a rural society where differences in power, status and privilege prevail, and they limit diversity of thought among the rural community. Limited access to inputs and technology, large family sizes, limited access to farmland, over-dependence of the extension system on ‘model’ farmers and public extension agents, and poorly designed sustainability features constrain the transformative potential of FDGs. The study puts forward five recommendations to unleash the potential of FDGs: reconsider the group design to be identity-congruent; ensure inclusiveness for young and female farmers; empower and motivate voluntary group leaders; encourage collective marketing; and invest in the sustainability features of the group. / Development Studies / Ph. D. (Development Studies)
245

Development and application of a multi-criteria decision-support framework for planning rural energy supply interventions in low-income households in South Africa

Dzenga, Bruce 25 August 2022 (has links) (PDF)
Problems in public policy decision-making environments are typically complex and continuously evolving. In a resource-constrained environment, several alternatives, criteria, and conflicting objectives must be considered. As a result, solutions to these types of problems cannot be modelled solely using single-criteria techniques. It has been observed that most techniques used to shape energy policy and planning either produce sub-optimal solutions or rest on strong assumptions about the preferences of the decision-maker(s). This difficulty creates a compelling need for novel techniques that can handle several alternatives, multiple criteria and conflicting objectives to support public sector decision-making processes. First, the study presents a novel scenario-based multi-objective optimisation framework based on the augmented Chebyshev goal programming (GP) technique linked to a value function for analysing the decision environment underlying energy choice among low-income households in isolated rural areas and informal urban settlements in South Africa. The framework includes a multi-objective optimisation technique that produces an approximation of a Pareto front linked to an a priori aggregation function and a value function to select the best alternatives. Second, the study uses this model to demonstrate the benefits of applying the framework to a previously unexplored subject in public policy: a dynamic multi-technology decision problem under uncertainty involving multiple stakeholders and conflicting objectives. The results obtained suggest that while it is cost-optimal to pursue electrification in conjunction with other short-term augmentation solutions to meet South Africa's universal electrification target, sustainable energy access rates among low-income households can be achieved by increasing the share of clean energy generation technologies in the energy mix.
This study, therefore, challenges the South African government's position on pro-poor energy policies and an emphasis on grid-based electrification to increase energy access. Instead, the study calls for a portfolio-based intervention. The study advances interventions based on micro-grid electrification made up of solar photovoltaics (PV), solar with storage, combined cycle gas turbine (CCGT) and wind technologies combined with either bioethanol fuel or liquid petroleum gas (LPG). The study has demonstrated that the framework developed can benefit public sector decision-makers in providing a balanced regime of technical, financial, social, environmental, public health, political and economic aspects in the decision-making process for planning energy supply interventions for low-income households. The framework can be adapted to a wide range of energy access combinatorial problems and in countries grappling with similar energy access challenges.
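The augmented Chebyshev goal programming technique at the core of this framework can be sketched on a toy problem. Everything below is invented for illustration (two hypothetical technologies, cost and emissions goals, unit weights); the point is only the mechanics: minimise the worst weighted goal deviation, plus a small augmentation term that breaks ties among optima.

```python
# Sketch of augmented Chebyshev goal programming as a linear program.
# Variables: [x1, x2, d1, d2, lam] = technology capacities, goal deviations,
# and the Chebyshev bound on the weighted deviations. All numbers are toy data.
from scipy.optimize import linprog

rho = 1e-3                       # small augmentation weight (tie-breaker)
c = [0.0, 0.0, rho, rho, 1.0]    # minimise lam + rho * (d1 + d2)

A_ub = [
    [5.0, 8.0, -1.0, 0.0, 0.0],   # cost 5*x1 + 8*x2 - d1 <= cost goal 600
    [10.0, 2.0, 0.0, -1.0, 0.0],  # emissions 10*x1 + 2*x2 - d2 <= goal 500
    [0.0, 0.0, 1.0, 0.0, -1.0],   # w1 * d1 <= lam  (w1 = 1)
    [0.0, 0.0, 0.0, 1.0, -1.0],   # w2 * d2 <= lam  (w2 = 1)
]
b_ub = [600.0, 500.0, 0.0, 0.0]
A_eq = [[1.0, 1.0, 0.0, 0.0, 0.0]]   # total capacity requirement
b_eq = [100.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 5)
x1, x2, d1, d2, lam = res.x
print(f"x1={x1:.2f}, x2={x2:.2f}, worst weighted deviation={lam:.2f}")
```

In the study's framework this LP core would be solved across scenarios and linked to a value function; the sketch shows only a single deterministic instance.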
246

URBAN HIGH SCHOOL STUDENTS’ MOTIVATION IN FOOD SYSTEMS STEM PROJECTS

Sarah Lynne Joy Thies (15460442) 15 May 2023 (has links)
Food system STEM projects have the capacity to motivate high school students in urban schools. This study explored food as a context to engage students, because everyone interacts with food on a daily basis and has cultural experiences related to food. An integrated STEM approach combined with a systems thinking approach challenged students to make transdisciplinary connections, view problems from different perspectives, analyze complex relationships, and develop 21st-century and career skills (Hilimire et al., 2014; Nanayakkara et al., 2017). The purpose of this study was to describe and explain the relevance students perceive in Ag+STEM content by measuring high school students' self-efficacy, intrinsic value, attainment value, cost value, and utility value after participating in a food system STEM project. The study was informed by Eccles and Wigfield’s (2020) Situated Expectancy Value Theory. The convenience sample comprised high school students from metropolitan-area schools, who completed a STEM project with a food system context. Quantitative data were collected using the Food System Motivation questionnaire developed for the study, through a retrospective pre-test and a post-test. Descriptive statistics, including means and standard deviations, were used to analyze the data, and relationships were explored by calculating correlations.
There were four conclusions from this study. First, high school students were somewhat interested, felt it was important to do well, and agreed there were costs to participating in the food system STEM project. Second, high school students reported higher personal and local utility value motivation after completing the project. Third, high school students were somewhat self-efficacious in completing the project tasks, including tasks informed by their cultural identity and experiences. Fourth, intrinsic value and attainment value motivation (independent variables) were related to personal and local utility value motivation and to project and cultural self-efficacy motivation (dependent variables). Implications for practice and recommendations for future research were discussed.
247

Generating Extreme Value Distributions in Finance using Generative Adversarial Networks / Generering av Extremvärdesfördelningar inom Finans med hjälp av Generativa Motstridande Nätverk

Nord-Nilsson, William January 2023 (has links)
This thesis aims to develop a new model for stress-testing financial portfolios using Extreme Value Theory (EVT) and Generative Adversarial Networks (GANs). The current practice of risk management relies on mathematical or historical models, such as Value-at-Risk and expected shortfall. The problem with historical models is that the data available for very extreme events is limited, and therefore we need a method to interpolate and extrapolate beyond the available range. EVT is a statistical framework that analyzes extreme events in a distribution and allows such interpolation and extrapolation, and GANs are machine-learning techniques that generate synthetic data. The combination of these two areas can generate more realistic stress-testing scenarios to help financial institutions manage potential risks better. The goal of this thesis is to develop a new model that can handle complex dependencies and high-dimensional inputs with different kinds of assets, such as stocks, indices, currencies, and commodities, and that can be used in parallel with traditional risk measurements. The evtGAN algorithm shows promising results: it is able to mimic actual distributions and to extrapolate beyond the available data range.
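The EVT half of an evtGAN-style pipeline can be sketched as follows; the GAN itself is omitted. The idea shown is the preprocessing step: fit a generalized Pareto tail above a high threshold, build a semi-parametric CDF (empirical body, GPD tail), and map observations to approximately uniform margins on which a generative model can then learn the dependence structure. Data and threshold choice are assumptions for illustration.

```python
# Sketch of EVT preprocessing for a GAN-based generator (GAN stage omitted):
# GPD tail fit plus probability-integral transform to uniform margins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
losses = rng.standard_t(df=3, size=5000)   # heavy-tailed synthetic returns

tail_p = 0.05                              # assumed tail fraction
u = np.quantile(losses, 1 - tail_p)        # POT threshold
exceedances = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(exceedances, floc=0.0)

# Semi-parametric CDF: empirical in the body, GPD above the threshold.
def cdf(x):
    if x <= u:
        return np.mean(losses <= x)
    return 1.0 - tail_p * stats.genpareto.sf(x - u, xi, loc=0.0, scale=beta)

uniforms = np.array([cdf(x) for x in losses])   # margins for the GAN stage
print(f"fitted tail shape xi = {xi:.3f}")
```

After the GAN is trained on `uniforms`, sampling runs the transform in reverse: generated uniforms are pushed through the inverse semi-parametric CDF, which is what allows extrapolation beyond the observed range.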
248

Pricing and Modeling Heavy Tailed Reinsurance Treaties - A Pricing Application to Risk XL Contracts / Prissättning och modellering av långsvansade återförsäkringsavtal - En prissättningstillämpning på Risk XL kontrakt

Abdullah Mohamad, Ormia, Westin, Anna January 2023 (has links)
Estimating the risk of a loss occurring for insurance takers is a difficult task in the insurance industry. It is an even more difficult task to price that risk for reinsurance companies, which insure the primary insurers. Insurance that is bought by an insurance company, the cedent, from another insurance company, the reinsurer, is called treaty reinsurance. This type of reinsurance is the main focus of this thesis. A very common risk to insure is the risk of fire in municipal and commercial properties, which is the risk priced in this thesis. The thesis evaluates Länsförsäkringar AB's current pricing model, which calculates the risk premium for Risk XL contracts, with the goal of finding areas of improvement for tail-risk pricing. The risk premium is commonly calculated using one of three types of pricing model: experience rating, exposure rating and frequency-severity rating. This thesis focuses on frequency-severity rating, a model that assumes independence between the frequency and the severity of losses and therefore splits the two into separate models. It is a very common model for pricing Risk XL contracts. The risk premium is calculated with the help of loss data from two insurance companies, one Norwegian and one Finnish. The main focus of the thesis is to price the risk with the help of extreme value theory, mainly using the method of moments to model the frequency of losses and the peaks-over-threshold model to model their severity. To model the estimated frequency of losses with the method of moments, two distributions are compared: the Poisson and the negative binomial. Several distributions can be used to model the severity of losses; to evaluate which is optimal, two goodness-of-fit tests are applied, the Kolmogorov-Smirnov and the Anderson-Darling test. The peaks-over-threshold model can be used with the Pareto distribution. With the help of the Hill estimator, a threshold u is calculated which regulates the tail of the Pareto curve. To estimate the remaining parameters of the generalized Pareto distribution, maximum likelihood and the least squares method are used. Lastly, the bootstrap method is used to estimate the uncertainty in the price calculated from the estimated parameters. From this, empirical percentiles are calculated and set as guidelines between which the risk premium should lie in order for both data sets to be considered fairly priced.
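Two of the building blocks this abstract names — the Hill estimator for the tail, and a frequency-severity premium for an excess-of-loss layer — can be sketched as follows. The loss data, layer (priority and limit), and observation period are invented for illustration; the thesis works with confidential treaty data.

```python
# Sketch: Hill tail estimate plus a frequency-severity layer premium.
# Synthetic Pareto losses stand in for the fire-loss data.
import numpy as np

rng = np.random.default_rng(1)
losses = (rng.pareto(a=2.0, size=400) + 1) * 1e6   # synthetic fire losses

# Hill estimator of 1/alpha from the k largest order statistics.
def hill(data, k):
    x = np.sort(data)[::-1]                        # descending order
    return np.mean(np.log(x[:k])) - np.log(x[k])

alpha_inv = hill(losses, k=50)                     # approx 1/alpha
print(f"Hill estimate of 1/alpha: {alpha_inv:.2f}")

# Frequency-severity premium for an assumed Risk XL layer:
# E[annual claim count above priority] * E[loss to the layer per claim].
priority, limit = 5e6, 10e6
n_years = 10                                       # assumed observation period
hits = losses[losses > priority]
freq = hits.size / n_years                         # Poisson intensity estimate
layer = np.clip(hits - priority, 0.0, limit)       # loss ceded to the layer
risk_premium = freq * layer.mean()
print(f"estimated annual risk premium: {risk_premium:,.0f}")
```

In the thesis the severity above the threshold would be a fitted generalized Pareto distribution rather than the raw exceedances, and the bootstrap would then quantify the uncertainty around this premium.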
249

Understanding the Public Value of Four-Year Colleges and Universities in Ohio

Kuhr, Brittanie E. January 2022 (has links)
No description available.
250

Applying Peaks-Over-Threshold for Increasing the Speed of Convergence of a Monte Carlo Simulation / Peaks-Over-Threshold tillämpat på en Monte Carlo simulering för ökad konvergenshastighet

Jakobsson, Eric, Åhlgren, Thor January 2022 (has links)
This thesis investigates applying the semi-parametric peaks-over-threshold method to data generated from a Monte Carlo simulation when estimating the financial risk measures Value-at-Risk and Expected Shortfall. The goal is to achieve faster convergence than a plain Monte Carlo simulation when assessing the extreme events that represent the worst outcomes of a financial portfolio. Faster convergence allows the number of iterations in the Monte Carlo simulation to be reduced, giving the portfolio manager a more efficient way of estimating risk measures.  The financial portfolio consists of US life insurance policies offered on the secondary market, gathered by our partner RessCapital. The method is evaluated on three portfolios with different defining characteristics.  In Part I, an analysis of selecting an optimal threshold is made. The accuracy and precision of peaks-over-threshold are compared to a Monte Carlo simulation with 10,000 iterations, using a simulation of 100,000 iterations as the reference value. Depending on the risk measure and the percentile of interest, different optimal thresholds are selected.  Part II presents the results with the optimal thresholds from Part I. Peaks-over-threshold performed significantly better than a 10,000-iteration Monte Carlo simulation for Value-at-Risk. The results for Expected Shortfall did not show a clear improvement in precision, but did show improvement in accuracy.  Value-at-Risk and Expected Shortfall at the 99.5th percentile achieved a greater error reduction than at the 99th. The results therefore align well with theory: the rarer the event considered, the better the peaks-over-threshold method performed.  In conclusion, applying peaks-over-threshold can prove useful when looking to reduce the number of iterations, since it does increase the convergence of a Monte Carlo simulation. The result is, however, dependent on the rarity of the event of interest and on the level of precision and accuracy required.
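The core move of this thesis — fit a GPD tail to a modest Monte Carlo sample and read risk measures from the fitted tail rather than from raw empirical quantiles — can be sketched as follows. The loss distribution, threshold choice, and iteration count are assumptions for illustration, not RessCapital's portfolio model.

```python
# Sketch: semi-parametric VaR and Expected Shortfall from a POT fit on
# 10,000 Monte Carlo losses. Synthetic lognormal losses stand in for the
# portfolio simulation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mc_losses = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # 10k iterations

p = 0.995                                  # 99.5th percentile of interest
u = np.quantile(mc_losses, 0.95)           # assumed POT threshold
exc = mc_losses[mc_losses > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0.0)
zeta = exc.size / mc_losses.size           # estimated exceedance probability

# Standard POT formulas for VaR and ES (valid for xi < 1).
var_p = u + (beta / xi) * (((1 - p) / zeta) ** (-xi) - 1)
es_p = (var_p + beta - xi * u) / (1 - xi)
print(f"POT VaR_99.5: {var_p:.2f}, ES_99.5: {es_p:.2f}")
```

Because the fitted tail smooths and extends the empirical one, these estimates stabilise at fewer iterations than the raw sample quantile, which is precisely the convergence gain the thesis measures.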
