51 |
Computational Simulation and Machine Learning for Quality Improvement in Composites Assembly. Lutz, Oliver Tim, 22 August 2023 (has links)
In applications spanning the aerospace, marine, automotive, energy, and space travel domains, composite materials have become ubiquitous because of their superior stiffness-to-weight ratios as well as their corrosion and fatigue resistance. From a manufacturing perspective, however, these advanced materials have introduced new challenges that demand the development of new tools. Due to their complex anisotropic and nonlinear material properties, composite materials are more difficult to model than conventional materials such as metals and plastics. Furthermore, safety-critical applications impose ultra-high precision requirements that are yet to be reliably met in production. Toward developing new tools that address these challenges, this dissertation aims to (i) build high-fidelity numerical simulations of composite assembly processes, (ii) bridge these simulations to machine learning tools, and (iii) apply data-driven solutions to process control problems while identifying and overcoming their shortcomings. This is accomplished in case studies that model the fixturing, shape control, and fastening of composite fuselage components. Therein, simulation environments are created that interact with novel implementations of modified proximal policy optimization, based on a newly developed reinforcement learning algorithm. The resulting reinforcement learning agents successfully address the optimization problems that underpin the process and quality requirements. / Doctor of Philosophy / Within the manufacturing domain, there has been a concerted effort to transition towards Industry 4.0. To a large degree, this term refers to Klaus Schwab's vision presented at the World Economic Forum in 2015, in which he outlined fundamental systemic changes that would incorporate ubiquitous computing, artificial intelligence (AI), big data, and the internet-of-things (IoT) into all aspects of productive activities within the economy.
Schwab argues that rapid change will be driven by fusing these new technologies in existing and emerging applications. However, this process has only just begun and there still exist many challenges to realize the promise of Industry 4.0. One such challenge is to create computer models that are not only useful during early design stages of a product, but that are connected to its manufacturing processes, thereby guiding and informing decisions in real-time. This dissertation explores such scenarios in the context of composite structure assembly in aerospace manufacturing. It aims to link computer simulations that characterize the assembly of product components with their physical counterparts, and provides data-driven solutions to control problems that cannot typically be solved without tedious trial-and-error approaches or expert knowledge.
|
52 |
Wireless Network Dimensioning and Provisioning for Ultra-reliable Communication: Modeling and Analysis. Gomes Santos Goncalves, Andre Vinicius, 28 November 2023 (has links)
A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services such as ultra-reliable low-latency communication (URLLC) and hyper-reliable low-latency communication (HRLLC), the staple mission-critical services in IMT-2020 (5G) and IMT-2030 (6G), for which reliable and resilient communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. A natural way of increasing reliability and reducing latency is to provision additional network resources to compensate for uncertainty in wireless networks caused by fading, interference, mobility, and time-varying network load, among other factors. Thus, an important step toward enabling mission-critical services is to identify and quantify what it takes to support ultra-reliable communication in mobile networks -- a process often referred to as dimensioning. This dissertation focuses on resource dimensioning, notably spectrum, for ultra-reliable wireless communication. It proposes a set of methods for spectrum dimensioning based on concepts from risk analysis, extreme value theory, and meta distributions. These methods reveal that each "nine" of reliability (e.g., five nines in 99.999%) roughly translates into an order-of-magnitude increase in the required bandwidth. In ultra-reliability regimes, the required bandwidth can be on the order of tens of gigahertz, far beyond what is typically available in today's networks, making it challenging to provision resources for ultra-reliable communication. Accordingly, this dissertation also investigates alternative approaches to provide resources for ultra-reliable communication services in mobile networks.
Particularly, this dissertation considers multi-operator network sharing and multi-connectivity as alternatives to make additional network resources available to enhance network reliability and proposes multi-operator connectivity sharing, which combines multi-operator network sharing with multi-connectivity. Our studies, based on simulations, real-world data analysis, and mathematical models, suggest that multi-operator connectivity sharing -- in which mobiles multi-connect to base stations of operators in a sharing arrangement -- can reduce the required bandwidth significantly because underlying operators tend to exhibit characteristics attractive to reliability, such as complementary coverage during periods of impaired connectivity, facilitating the support for ultra-reliable communication in future mobile networks. / Doctor of Philosophy / A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services in 5G and 6G, for which ultra-reliable communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. Reliability often comes at the cost of additional network resources to compensate for uncertainty in wireless networks. Thus, an important step to enable ultra-reliable communication is to identify and quantify what it takes to support mission-critical services in mobile networks -- a process often denoted as dimensioning. This dissertation focuses on spectrum dimensioning and proposes a set of methods to identify suitable spectrum bands and required bandwidth for ultra-reliable communication.
These methods reveal that the spectrum needs for ultra-reliable communication can be beyond what is typically available in today's networks, making it challenging to provide adequate resources to support ultra-reliable communication services in mobile networks. Alternatively, we propose multi-operator connectivity sharing: mobiles simultaneously connect to multiple base stations of different operators. Our studies suggest that multi-operator connectivity sharing can reduce the spectrum needs in ultra-reliability regimes significantly, being an attractive alternative to enable ultra-reliable communication in future mobile networks.
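The "order of magnitude per nine" finding can be illustrated with a toy model (not taken from the dissertation): for a single Rayleigh-fading link, the bandwidth needed to deliver a fixed rate except with outage probability ε has a closed form, and shrinking ε tenfold costs roughly tenfold bandwidth. The rate, SNR, and function names below are illustrative assumptions.

```python
import math

def required_bandwidth(rate_bps, mean_snr, outage_eps):
    """Bandwidth (Hz) needed so a Rayleigh-faded link with the given mean
    SNR sustains rate_bps except with outage probability outage_eps.
    Outage occurs when B*log2(1 + mean_snr*h) < rate for fading gain
    h ~ Exp(1), so P_out = 1 - exp(-(2**(rate/B) - 1)/mean_snr);
    setting P_out = outage_eps and solving for B gives a closed form."""
    fade_floor = -math.log(1.0 - outage_eps) * mean_snr
    return rate_bps / math.log2(1.0 + fade_floor)

RATE = 1e6       # 1 Mb/s target rate (illustrative)
MEAN_SNR = 10.0  # mean SNR in linear scale (illustrative)
for nines in range(3, 7):
    bw = required_bandwidth(RATE, MEAN_SNR, 10.0 ** (-nines))
    print(f"{nines} nines -> {bw / 1e9:.3f} GHz")
```

Under these toy numbers, six nines already demands tens of gigahertz, in line with the dimensioning results summarized above.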
|
53 |
ESTIMATING PEAKING FACTORS WITH POISSON RECTANGULAR PULSE MODEL AND EXTREME VALUE THEORY. ZHANG, XIAOYI, 27 September 2005 (has links)
No description available.
|
54 |
Actuarial modelling of extremal events using transformed generalized extreme value distributions and generalized Pareto distributions. Han, Zhongxian, 14 October 2003 (has links)
No description available.
|
55 |
Development of value at risk measures: towards an extreme value approach. Ganief, Moegamad Shahiem, 12 1900 (has links)
Thesis (MBA)--Stellenbosch University, 2001. / ENGLISH ABSTRACT: Commercial banks, investment banks, insurance companies, non-financial firms, and
pension funds hold portfolios of assets that may include stocks, bonds, currencies,
and derivatives. Each institution needs to quantify the amount of risk its portfolio is
exposed to in the course of a day, week, month, or year. Extreme events in financial
markets, such as the stock market crash of October 1987, are central issues in finance
and particularly in risk management and financial regulation.
A method called value at risk (VaR) can be used to estimate market risk. Value at risk
is a powerful measure of risk that is gaining wide acceptance amongst institutions for
the management of market risk. Value at Risk is an estimate of the largest loss that
a portfolio is likely to suffer during all but truly exceptional periods. More precisely,
the VaR is the maximum loss that an institution can be confident it will not exceed
more than a certain fraction of the time over a particular period.
The power of the concept is its generality. VaR measures are applicable to entire
portfolios - encompassing many asset categories and multiple sources of risk. As with
its power, the challenge of calculating VaR also stems from its generality. In order to
measure risk in a portfolio using VaR, some means must be found for determining a
return distribution for the portfolio.
There exists a wide range of literature on different methods of implementing VaR.
But, when one attempts to apply the results, several questions remain open. For
example, given a VaR measure, how can the risk manager test that the particular
measure at hand is appropriately specified? And secondly, given two different VaR
measures, how can the risk manager pick the best measure?
Despite the popularity of VaR for measuring market risk, no consensus has yet been reached as to the best method to implement this risk measure. The absence of consensus
is in part derived from the realization that each method currently in use has some
significant drawbacks.
The aim of this project is threefold: to introduce the reader to the concept of VaR;
present the theoretical basis for the general approaches to VaR computations; and to
introduce and apply Extreme Value Theory to VaR calculations.
The general approaches to VaR computation fall into three categories, namely the
Analytic (Parametric) Approach, the Historical Simulation Approach, and the Monte
Carlo Simulation Approach. Each of these approaches has its strengths and
weaknesses, which we will study more closely.
The extreme value approach to VaR calculation is a relatively new approach. Since
most observed returns are central ones, traditional VaR methods tend to ignore extreme
events and focus on risk measures that accommodate the whole empirical distribution
of central returns. The danger of this approach is that these models are prone
to fail just when they are needed most - in large market moves, when institutions can
suffer very large losses.
The extreme value approach is a tool that attempts to provide the user with the best
possible estimate of the tail area of the distribution. Even in the absence of useful
historical data, extreme value theory provides guidance on the kind of distribution
that should be selected so that extreme risks are handled conservatively. As an
illustration, the extreme value method will be applied to a foreign exchange futures
contract. The validity of EVT to VaR calculations will be tested by examining the
data of the Rand/Dollar One Year Futures Contracts. An extended worked example
will be provided that attempts to highlight the considerable strengths of the
methods as well as their pitfalls and limitations. These results will be compared to
VaR measures calculated using a GARCH(1,1) model. / AFRIKAANSE OPSOMMING: Commercial banks, merchant banks, insurance companies, non-financial institutions and pension funds hold portfolios of financial assets such as shares, bonds, currencies and derivatives. Each institution must be able to determine the extent of the risk to which its portfolio is exposed in the course of a day, week, month or year. Exceptional events in financial markets, such as the stock market crash of October 1987, are of particular importance to finance and especially to risk management and financial regulation.
A method called Value at Risk (VaR) can be used to measure market risk. VaR is a powerful measure of risk and is used by many institutions for the management of market risk. Value at Risk is an estimate of the largest loss that a portfolio could suffer during any period, excluding truly exceptional periods. More precisely, VaR is the maximum loss that an institution can expect to suffer within a given period, a certain fraction of the time.
The value of the concept lies in its generality. VaR measures are applicable to portfolios as a whole, encompassing many categories of assets and multiple sources of risk. As with its power, the challenge of calculating VaR also stems from this generality. In order to determine the risk in a portfolio using VaR, methods must be found by which a return distribution for the portfolio can be established.
A large body of literature exists on the different methods of implementing VaR. When it comes to applying the results, however, several questions remain unanswered. For example, how can the risk manager use a given VaR measure to test whether that specific measure is correctly specified? Secondly, how can the risk manager choose the best of two different VaR measures?
Despite the fact that VaR is widely used for measuring market risk, no consensus has yet been reached on the best method of implementing this approach to risk measurement. The lack of consensus can partly be attributed to the fact that each of the methods currently in use has serious shortcomings.
The aim of this project is to introduce the concept of VaR, to lay the theoretical foundation for the general approaches to computing VaR, and to introduce Extreme Value Theory and apply it to VaR calculations.
The general approaches to computing VaR fall into three categories, namely the Analytic (Parametric) approach, the Historical Simulation approach and the Monte Carlo Simulation approach. Each of these approaches has strengths and weaknesses that will be examined more closely.
The Extreme Value approach to VaR is a relatively new approach. Since most observed returns are central ones, traditional VaR methods tend to ignore extreme events and focus on risk measures that accommodate the whole empirical distribution of central returns. The danger is then that these models tend to fail precisely when they are needed most, for example in the case of large market movements during which organisations can suffer very large losses.
The Extreme Value approach aims to provide the user with the best possible estimate of the tail area of the distribution. Even in the absence of useful historical data, Extreme Value Theory provides guidelines on the kind of distribution that should be selected so that extreme risks can be handled conservatively. To illustrate this method, it is applied in this study to a futures contract on foreign exchange rates. The validity of Extreme Value Theory for VaR calculations is tested by studying the data of the Rand/Dollar One-Year Futures Contract. A fully worked example is provided in which the pitfalls and limitations as well as the many strengths of the model are pointed out. These results will be compared with a VaR measure calculated using the GARCH(1,1) model.
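The peaks-over-threshold procedure described in the English abstract can be sketched in a few lines: fit a Generalized Pareto Distribution to losses above a high threshold, then evaluate the standard POT quantile formula. The sketch below uses a simple method-of-moments fit and synthetic data; the threshold choice, estimator, and figures are illustrative assumptions, not the thesis's Rand/Dollar analysis.

```python
import random
import statistics

def fit_gpd_mom(excesses):
    """Method-of-moments estimates (xi, beta) of the Generalized
    Pareto Distribution fitted to threshold excesses."""
    m = statistics.mean(excesses)
    s2 = statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / s2)
    beta = 0.5 * m * (m * m / s2 + 1.0)
    return xi, beta

def pot_var(losses, threshold_quantile, p):
    """Peaks-over-threshold Value at Risk at confidence level p:
    VaR_p = u + (beta/xi) * (((n/n_u) * (1 - p)) ** (-xi) - 1)."""
    ordered = sorted(losses)
    u = ordered[int(threshold_quantile * len(ordered))]
    excesses = [x - u for x in ordered if x > u]
    xi, beta = fit_gpd_mom(excesses)
    ratio = len(ordered) / len(excesses) * (1.0 - p)
    return u + (beta / xi) * (ratio ** (-xi) - 1.0)

random.seed(42)
# Synthetic heavy-tailed daily losses; purely illustrative data
losses = [random.paretovariate(3.0) - 1.0 for _ in range(5000)]
var_99 = pot_var(losses, 0.95, 0.99)    # 99% VaR
var_999 = pot_var(losses, 0.95, 0.999)  # 99.9% VaR
```

Because only exceedances above the threshold are modeled, the estimate extrapolates into the tail region where ordinary parametric models based on normality tend to fail.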
|
56 |
Stability of the Financial System: Systemic Dependencies between Bank and Insurance Sectors. Procházková, Jana, January 2014 (has links)
The central issue of this thesis is investigating the eventuality of systemic breakdowns in the international financial system through examining systemic dependence between bank and insurance sectors. Standard models of systemic risk often use correlation of stock returns to evaluate the magnitude of interconnectedness between financial institutions. One of the main drawbacks of this approach is that it is oriented towards observations occurring along the central part of the distribution and it does not capture the dependence structure of outlying observations. To account for that, we use methodology which builds on the Extreme Value Theory and is solely focused on capturing dependence in extremes. The analysis is performed using the data on stock prices of the EU largest banks and insurance companies. We study dependencies in the pre-crisis and post-crisis period. The objective is to discover which sector poses a higher systemic threat to the international financial stability. Also, we try to find empirical evidence about an increase in interconnections in recent post-crisis years. We find that in both examined periods systemic dependence in the banking sector is higher than in the insurance sector. Our results also indicate that extremal interconnections in the respective sectors increased,...
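The contrast the abstract draws between correlation and dependence in extremes can be made concrete with the empirical tail-dependence coefficient chi(q) = P(Y > u_y | X > u_x). The sketch below uses synthetic factor-driven returns, not the EU bank and insurer data; all names and parameters are illustrative assumptions.

```python
import random

def empirical_chi(x, y, q):
    """Empirical tail-dependence coefficient chi(q) = P(Y > u_y | X > u_x),
    where u_x and u_y are the marginal q-quantiles. Values near 0 suggest
    asymptotic independence; values near 1, strong extremal dependence."""
    n = len(x)
    u_x = sorted(x)[int(q * n)]
    u_y = sorted(y)[int(q * n)]
    joint = sum(1 for a, b in zip(x, y) if a > u_x and b > u_y)
    exceed = sum(1 for a in x if a > u_x)
    return joint / exceed if exceed else 0.0

random.seed(1)
n = 20000
market = [random.gauss(0.0, 1.0) for _ in range(n)]
# Hypothetical "bank" and "insurer" returns sharing a common market factor
bank = [0.8 * m + 0.6 * random.gauss(0.0, 1.0) for m in market]
insurer = [0.8 * m + 0.6 * random.gauss(0.0, 1.0) for m in market]
unrelated = [random.gauss(0.0, 1.0) for _ in range(n)]

chi_sectors = empirical_chi(bank, insurer, 0.95)
chi_baseline = empirical_chi(bank, unrelated, 0.95)
```

Two series can have identical correlation yet very different chi; estimating chi at increasingly high quantiles is the step that correlation-based systemic-risk models omit.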
|
57 |
[en] EXTREME VALUE THEORY: VALUE AT RISK FOR VARIABLE-INCOME ASSETS / [pt] TEORIA DOS VALORES EXTREMOS: VALOR EM RISCO PARA ATIVOS DE RENDA VARIÁVEL. GUSTAVO LOURENÇO GOMES PIRES, 26 June 2008 (has links)
[pt] Since the 1990s, the Value at Risk (VaR) methodology has spread around the world, among both financial and non-financial institutions, as a good risk-measurement practice. One of the most pronounced stylized facts about the distributions of financial returns concerns the presence of heavy tails. This makes traditional parametric models for computing Value at Risk (VaR) inadequate for estimating low-probability VaR, since they rest on the assumption of normality for the return distributions. The objective of the present work is therefore to investigate the performance of models based on Extreme Value Theory for computing VaR. The results indicate that models based on Extreme Value Theory are suitable for modeling the tails, and consequently for estimating Value at Risk when the probability levels of interest are low. / [en] Since the 1990s, the use of Value at Risk (VaR)
methodology has been disseminated among both financial and
non-financial institutions around the world, as a good
practice in terms of risks management. The existence of fat
tails is one of the striking stylized facts of financial
returns distributions. This fact makes the use of
traditional parametric models for Value at Risk (VaR)
estimation unsuitable for the estimation of low probability
events. This is because traditional models are based on the
conditional normality assumption for financial returns
distributions. The main purpose of this dissertation is to
investigate the performance of VaR models based on Extreme
Value Theory. The results indicate that models based on Extreme Value Theory are suitable for low-probability VaR estimation.
|
58 |
Local Likelihood Approach for High-Dimensional Peaks-Over-Threshold Inference. Baki, Zhuldyzay, 14 May 2018 (has links)
Global warming is affecting the Earth climate year by year, the biggest difference being observable in increasing temperatures in the World Ocean. Following the long-term global ocean warming trend, average sea surface temperatures across the global tropics and subtropics have increased by 0.4–1°C in the last 40 years. These rates become even higher in semi-enclosed southern seas, such as the Red Sea, threatening the survival of thermal-sensitive species. As average sea surface temperatures are projected to continue to rise, careful study of future developments of extreme temperatures is paramount for the sustainability of the marine ecosystem and biodiversity. In this thesis, we use Extreme-Value Theory to study sea surface temperature extremes from a gridded dataset comprising 16703 locations over the Red Sea. The data were provided by the Operational SST and Sea Ice Analysis (OSTIA), a satellite-based data system designed for numerical weather prediction. After pre-processing the data to account for seasonality and global trends, we analyze the marginal distribution of extremes, defined as observations exceeding a high spatially varying threshold, using the Generalized Pareto distribution. This model allows us to extrapolate beyond the observed data to compute the 100-year return levels over the entire Red Sea, confirming the increasing trend of extreme temperatures. To understand the dynamics governing the dependence of extreme temperatures in the Red Sea, we propose a flexible local approach based on R-Pareto processes, which extend the univariate Generalized Pareto distribution to the spatial setting. Assuming that the sea surface temperature varies smoothly over space, we perform inference based on the gradient score method over small regional neighborhoods, in which the data are assumed to be stationary in space. This approach allows us to capture spatial non-stationarity, and to reduce the overall computational cost by taking advantage of distributed computing resources. Our results reveal an interesting extremal spatial dependence structure: in particular, from our estimated model, we conclude that significant extremal dependence prevails for distances up to about 2500 km, which roughly corresponds to the length of the Red Sea.
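The 100-year return level mentioned above follows from the standard peaks-over-threshold formula once the Generalized Pareto parameters and the threshold exceedance rate are in hand. The sketch below evaluates that formula with made-up parameter values, not the OSTIA/Red Sea fit; every number is an illustrative assumption.

```python
import math

def gpd_return_level(u, sigma, xi, zeta_u, obs_per_year, years):
    """Return level exceeded on average once every `years` years under a
    peaks-over-threshold model: u = threshold, sigma and xi = GPD scale
    and shape, zeta_u = P(X > u), obs_per_year = observations per year."""
    m = years * obs_per_year  # observations per return period
    if abs(xi) < 1e-12:       # xi -> 0 limit: exponential tail
        return u + sigma * math.log(m * zeta_u)
    return u + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)

# Illustrative daily SST-anomaly parameters (not the fitted Red Sea values)
level_100yr = gpd_return_level(u=1.5, sigma=0.4, xi=-0.1,
                               zeta_u=0.05, obs_per_year=365, years=100)
```

A negative shape parameter, as in this toy call, implies a bounded upper tail, so return levels grow but saturate as the return period lengthens.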
|
59 |
Extreme behavior and VaR of short-term interest rate of Taiwan. Chiang, Ming-Chu, 21 July 2008 (has links)
This study empirically analyzes the extreme behavior of changes in Taiwan's short-term interest rate, and the impact of deregulation policies and financial turmoil on that behavior. A better knowledge of short-term interest rate properties, such as heavy tails, asymmetry, and uneven tail fatness between the right and left tails, provides insight into the extreme behavior of the short-term interest rate as well as a more accurate estimation of interest rate risk. The predictive performance of filtered and unfiltered VaR (Value at Risk) models is also examined to suggest proper models for the management of interest rate risk. Applying Extreme Value Theory (EVT), tail behavior is analyzed and tested, and VaR based on parametric and non-parametric EVT models is calculated. The empirical findings show that, first, the distribution of rate changes is heavy-tailed, indicating that the actual risk would be underestimated under a normality assumption. Second, the unconditional distribution is consistent with heavier-tailed distributions such as an ARCH process or Student's t. Third, the right tail of the distribution of rate changes is significantly heavier than the left one, indicating that the probabilities and magnitudes of rises in the rate could be higher than those of drops. Fourth, the tail fatness of the distribution of rate changes increased after 1999, and the vital factors causing the structural break in the tail index were the interest rate policies of the central bank of Taiwan rather than the deregulation policies in the money market. Fifth, based on the two break points found in the tail indices of the right and left tails, a long sample of CP rates should not be treated as a sample from a single distribution. Sixth, the dependence and heteroscedasticity of the data series should be accounted for when applying EVT, to improve the accuracy of VaR forecasts.
Finally, EVT models predict VaR accurately before 2001, while after 2001 the benchmark models, HS and GARCH, are generally superior to the EVT models. Among the EVT models, MRE and CHE are relatively consistent and reliable in VaR prediction.
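Tail fatness and its asymmetry between right and left tails are typically quantified with a tail-index estimator such as Hill's. A minimal sketch on synthetic heavy-tailed data follows; the Taiwan CP-rate series is not reproduced here, and the sample and the choice k = 1000 are illustrative assumptions.

```python
import math
import random

def hill_estimator(sample, k):
    """Hill estimator of the extreme-value index gamma = 1/alpha from the
    k largest order statistics of a positive, heavy-tailed sample."""
    top = sorted(sample, reverse=True)[: k + 1]
    x_k1 = top[-1]  # the (k+1)-th largest observation
    return sum(math.log(x / x_k1) for x in top[:-1]) / k

random.seed(7)
# Synthetic positive "rate changes" with Pareto tail index alpha = 2,
# so the true gamma is 0.5 (illustrative data, not the CP-rate series)
sample = [random.paretovariate(2.0) for _ in range(50000)]
gamma_right = hill_estimator(sample, k=1000)
```

Applied separately to the positive changes and to the absolute values of the negative changes, two such estimates quantify the uneven tail fatness reported above; a structural break shows up as a shift in the estimate across subsamples.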
|
60 |
An assessment of uncertainties and limitations in simulating tropical cyclone climatology and future changes. Suzuki-Parker, Asuka, 04 May 2011 (has links)
The recent elevated North Atlantic hurricane activity has generated considerable interest in the interaction between tropical cyclones (TCs) and climate change. A possible connection between TCs and the changing climate has been suggested by observational studies based on historical TC records, which indicate emerging trends in TC frequency and intensity in some TC basins; however, the detection of such trends has been hotly debated due to TC track data issues. Dynamical climate modeling has also been applied to the problem, but brings its own set of limitations owing to limited model resolution and uncertainties.
The final goal of this study is to project the future changes of North Atlantic TC behavior with global warming for the next 50 years using the Nested Regional Climate Model (NRCM). Throughout the course of reaching this goal, various uncertainties and limitations in simulating TCs by the NRCM are identified and explored.
First we examine the TC tracking algorithm used to detect and track simulated TCs from model output. The criteria and thresholds used in the tracking algorithm control the simulated TC climatology, making it difficult to objectively assess the model's ability to simulate TC climatology. Existing tracking algorithms used in previous studies are surveyed, and their criteria and thresholds are found to be very diverse. The simulated TC climatology is highly sensitive to the choice of criteria and thresholds in the tracking algorithm, especially the intensity and duration thresholds. It is found that the commonly used criteria may not be strict enough to filter out intense extratropical systems and hybrid systems. We propose that a better distinction between TCs and other low-pressure systems can be achieved by adding the Cyclone Phase technique.
Two sets of NRCM simulations are presented in this dissertation: One in the hindcasting mode, and the other with forcing from the Community Climate System Model (CCSM) to project into the future with global warming. Both of these simulations are assessed using the tracking algorithm with cyclone phase technique.
The NRCM is run in hindcasting mode for the global tropics in order to assess its ability to simulate the current observed TC climatology. It is found that the NRCM captures the general spatial and temporal distributions of TCs, but tends to overproduce TCs, particularly in the Northwest Pacific. The overprediction of TCs is associated with the model's overall convective tendency, together with a prominent theory of wave energy accumulation leading to TC genesis. On the other hand, TC frequency in the tropical North Atlantic is underpredicted due to the lack of moist African Easterly Waves. The importance of high resolution is shown with an additional simulation using two-way nesting.
The NRCM is then forced by the CCSM to project future changes in North Atlantic TCs. An El Niño-like SST bias in the CCSM induced high vertical wind shear in the tropical North Atlantic, preventing TCs from forming in this region. A simple bias correction method is applied to remove this bias. The model projects increases in both TC frequency and intensity owing to enhanced TC genesis in the main development region, where the model projects an increased favorability of the large-scale environment for TC genesis. However, the model is not capable of explicitly simulating intense (Category 3-5) storms due to the limited model resolution. To extrapolate the prediction to intense storms, we propose a hybrid approach that combines the model results with statistical modeling based on extreme value theory. Specifically, the currently observed TC intensity is statistically modeled with the Generalized Pareto distribution, and the simulated intensity changes from the NRCM are applied to the statistical model to project the changes in intense storms. The results suggest that the occurrence of Category 5 storms may increase by approximately 50% by 2055.
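The hybrid extrapolation step can be sketched as follows: model observed intensities above a hurricane-strength threshold with a Generalized Pareto distribution, perturb the fitted scale by the simulated change, and compare Category 5 exceedance probabilities. Every number below (threshold, GPD parameters, size of the shift) is an illustrative assumption, not the dissertation's fitted model.

```python
def gpd_exceed_prob(x, u, sigma, xi, zeta_u):
    """P(X > x) for x above threshold u under a peaks-over-threshold
    model: zeta_u * (1 + xi*(x - u)/sigma) ** (-1/xi)."""
    t = 1.0 + xi * (x - u) / sigma
    if t <= 0.0:
        return 0.0  # beyond the finite upper endpoint when xi < 0
    return zeta_u * t ** (-1.0 / xi)

CAT5_WIND = 70.0  # approximate Category 5 wind threshold, m/s
# Hypothetical GPD for observed storm lifetime-max winds above 33 m/s
p_current = gpd_exceed_prob(CAT5_WIND, u=33.0, sigma=12.0,
                            xi=-0.05, zeta_u=0.5)
# Apply a simulated future intensity change as a simple scale increase
p_future = gpd_exceed_prob(CAT5_WIND, u=33.0, sigma=13.5,
                           xi=-0.05, zeta_u=0.5)
increase = p_future / p_current
```

With these toy numbers, a modest scale increase raises the Category 5 exceedance probability by roughly half, the same order as the abstract's projected 50% increase.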
|