61 |
Mnohorozměrné modely extrémních hodnot a jejich aplikace v hydrologii / Multivariate extreme value models and their application in hydrology. Drápal, Lukáš. January 2014.
This thesis deals with multivariate extreme value theory. First, the concepts of modelling block maxima and threshold excesses in the univariate case are reviewed. In the multivariate setting, the point process approach is chosen to model dependence. The dependence structure of multivariate extremes is described by a spectral measure or an exponent function. Models for asymptotically dependent variables are presented. A construction principle from Ballani and Schlather (2011) is discussed, and based on this discussion the pairwise beta model introduced by Cooley et al. (2010) is modified to provide greater flexibility. The models are applied to data from nine hydrological stations in northern Moravia previously analysed by Jarušková (2009). Use of the new pairwise beta model is justified by a substantial improvement in log-likelihood. The models are also compared using the Bayesian model selection approach of Sabourin et al. (2013).
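As a rough illustration of the univariate block-maxima step reviewed above, the following minimal sketch fits a GEV distribution to annual maxima of synthetic daily flows using scipy; the data, parameter values, and the 100-year return level computed at the end are purely illustrative and are not taken from the thesis or the Moravian stations.

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
daily = rng.gumbel(loc=50.0, scale=15.0, size=(30, 365))  # 30 synthetic "years" of daily flows
annual_maxima = daily.max(axis=1)

# scipy's genextreme uses shape c = -xi (the negated extreme value index)
c, loc, scale = genextreme.fit(annual_maxima)
print(f"xi = {-c:.3f}, mu = {loc:.3f}, sigma = {scale:.3f}")

# 100-year return level: the level exceeded on average once per 100 blocks
return_level_100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(f"100-year return level: {return_level_100:.1f}")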
|
62 |
Statistics of Multivariate Extremes with Applications in Risk Management. Herrera, Rodrigo. 06 July 2009.
The contributions of this thesis serve a dual purpose: introducing several multivariate statistical methodologies in which, in the majority of cases, only stationarity of the random variables is assumed, and highlighting some of the applied problems in risk management where extreme value theory may play a role. Each chapter is largely self-contained, with its own detailed introduction and short conclusion.
|
63 |
Estimating expected shortfall using an unconditional peaks-over-threshold method under an extreme value approach. Wahlström, Rikard. January 2021.
Value-at-Risk (VaR) has long been the standard risk measure in financial risk management. However, VaR suffers from critical shortcomings as a risk measure when it comes to quantifying the most severe risks, as was made especially apparent during the financial crisis of 2007–2008. An alternative risk measure addressing the shortcomings of VaR, known as expected shortfall (ES), is gaining popularity and is set to replace VaR as the standard measure of financial risk. This thesis shows how extreme value theory can be applied to estimate ES using an unconditional peaks-over-threshold method, including an introduction to the theoretical foundations of the method. The method is then applied to five different assets, chosen to serve as proxies for the broader asset classes of equities, fixed income, currencies, commodities and cryptocurrencies. In terms of ES, we find that cryptocurrencies are the riskiest asset class and fixed income the safest.
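The following is a minimal sketch of the unconditional peaks-over-threshold approach described above, using the standard GPD-based VaR and ES formulas of McNeil and Frey; the synthetic heavy-tailed losses, the threshold choice, and the confidence level are illustrative assumptions, not the thesis's data or settings.

import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
losses = rng.standard_t(df=3, size=5000)      # heavy-tailed pseudo-losses (positive = loss)
u = np.quantile(losses, 0.95)                 # threshold at the 95th percentile
exceedances = losses[losses > u] - u
n, n_u = losses.size, exceedances.size

# Fit a GPD to the exceedances, with location fixed at 0
xi, _, beta = genpareto.fit(exceedances, floc=0)

q = 0.99
# Standard POT formulas, valid for xi < 1
var_q = u + (beta / xi) * (((1 - q) / (n_u / n)) ** (-xi) - 1)
es_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)
print(f"VaR at {q}: {var_q:.3f}, ES at {q}: {es_q:.3f}")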
|
64 |
Apprentissage de structures dans les valeurs extrêmes en grande dimension / Discovering patterns in high-dimensional extremes. Chiapino, Maël. 28 June 2018.
We present and study unsupervised learning methods for multivariate extreme phenomena in high dimension. For a random vector whose marginal distributions are all heavy-tailed, the study of its behavior in extreme regions (i.e. far from the origin) is no longer possible via the usual methods, which assume finite means and variances. Multivariate extreme value theory provides a framework adapted to this study; in particular, it gives a theoretical basis for dimension reduction through the angular measure. The thesis is organized around two main steps: (i) reduce the dimension of the problem by finding a summary of the dependence structure in extreme regions, in particular by recovering the subgroups of components that are likely to exceed a high threshold simultaneously; and (ii) model the angular measure with a mixture density that follows a predefined dependence structure. These steps lead to new clustering methods for extreme points in high dimension, built on a similarity matrix for the extreme observations.
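A much-simplified sketch of the first step, counting which pairs of components exceed a high marginal threshold simultaneously on synthetic data; this empirical co-exceedance count only illustrates the idea and is not the algorithm developed in the thesis.

import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
d, n = 5, 10000
# Synthetic heavy-tailed data with shared extremes planted in components 0 and 1
common = rng.pareto(2.0, size=n)
X = rng.pareto(2.0, size=(n, d))
X[:, 0] += common
X[:, 1] += common

thresholds = np.quantile(X, 0.99, axis=0)   # high marginal thresholds
exceed = X > thresholds                     # boolean exceedance indicators

for i, j in combinations(range(d), 2):
    joint = np.mean(exceed[:, i] & exceed[:, j])
    print(f"components ({i},{j}): joint exceedance frequency = {joint:.4f}")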
|
65 |
Optimization under Uncertainty with Applications in Data-driven Stochastic Simulation and Rare-event Estimation. Zhang, Xinyu. January 2022.
For many real-world problems, optimization can only be formulated with partial information or subject to uncertainty, owing to reasons such as data measurement error, model misspecification, or dependence of the formulation on a non-stationary future. One must thus often make decisions without knowing the problem's full picture. This dissertation considers the robust optimization framework—a worst-case perspective—to characterize uncertainty as feasible regions and optimize over the worst possible scenarios. Two applications of this worst-case perspective are discussed: stochastic estimation and rare-event simulation.
Chapters 2 and 3 discuss a min-max framework to enhance existing estimators for simulation problems that involve a bias-variance tradeoff. Biased stochastic estimators, such as finite-differences for noisy gradient estimation, often contain parameters that need to be properly chosen to balance impacts from the bias and the variance. While the optimal order of these parameters in terms of the simulation budget can be readily established, the precise best values depend on model characteristics that are typically unknown in advance. We introduce a framework to construct new classes of estimators, based on judicious combinations of simulation runs on sequences of tuning parameter values, such that the estimators consistently outperform a given tuning parameter choice in the conventional approach, regardless of the unknown model characteristics. We argue the outperformance via what we call the asymptotic minimax risk ratio, obtained by minimizing the worst-case asymptotic ratio between the mean square errors of our estimators and the conventional one, where the worst case is over any possible values of the model unknowns. In particular, when the minimax ratio is less than 1, the calibrated estimator is guaranteed to perform better asymptotically. We identify this minimax ratio for general classes of weighted estimators and the regimes where this ratio is less than 1. Moreover, we show that the best weighting scheme is characterized by a sum of two components with distinct decay rates. We explain how this arises from bias-variance balancing that combats the adversarial selection of the model constants, which can be analyzed via a tractable reformulation of a non-convex optimization problem.
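A toy sketch of the tuning-parameter tradeoff described above, using central finite differences of a noisy function, where the step size h trades bias (order h squared) against noise-driven variance (order 1 over h squared); the function, noise level, and step sizes are illustrative, and the weighted-combination estimators proposed in these chapters are not reproduced here.

import numpy as np

rng = np.random.default_rng(3)

def noisy_f(x, sigma=0.01):
    # Noisy evaluation of sin(x), standing in for a simulation output
    return np.sin(x) + rng.normal(scale=sigma)

def fd_gradient(x, h, reps=200):
    # Central finite-difference gradient estimates, replicated to measure variance
    est = [(noisy_f(x + h) - noisy_f(x - h)) / (2 * h) for _ in range(reps)]
    return np.mean(est), np.var(est)

true_grad = np.cos(1.0)
for h in (1e-1, 1e-2, 1e-3):
    mean, var = fd_gradient(1.0, h)
    bias = mean - true_grad
    print(f"h={h:.0e}: bias^2={bias**2:.2e}, variance={var:.2e}, MSE={bias**2 + var:.2e}")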
Chapters 4 and 5 discuss extreme event estimation using a distributionally robust optimization framework. Conventional methods for extreme event estimation rely on well-chosen parametric models asymptotically justified by extreme value theory (EVT). These methods, while powerful and theoretically grounded, can encounter difficult bias-variance tradeoffs that are exacerbated when the data size is small, deteriorating the reliability of the tail estimation. These chapters study a framework based on the recently surging literature on distributionally robust optimization. This approach can be viewed as a nonparametric alternative to conventional EVT: it imposes general shape beliefs on the tail instead of parametric assumptions and uses worst-case optimization to handle the resulting nonparametric uncertainty. We explain how this approach bypasses the bias-variance tradeoff in EVT. In exchange, we face a conservativeness-variance tradeoff, and we describe how to tackle it. We also demonstrate computational tools for the involved optimization problems and compare our performance with conventional EVT across a range of numerical examples.
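As a toy illustration of the worst-case idea (not the chapters' actual formulation), Cantelli's inequality gives the tight worst-case tail probability over all distributions sharing a given mean and variance; the lognormal sample and tail level below are assumptions for demonstration only.

import numpy as np

rng = np.random.default_rng(4)
data = rng.lognormal(mean=0.0, sigma=1.0, size=200)   # small, heavy-ish tailed sample

mu, sigma2 = data.mean(), data.var(ddof=1)
t = 10.0  # tail level of interest (must exceed the mean for the bound below)

empirical = np.mean(data > t)
# Cantelli bound: tight worst case over all distributions with this mean and variance
worst_case = sigma2 / (sigma2 + (t - mu) ** 2)
print(f"empirical tail estimate: {empirical:.4f}")
print(f"worst-case moment-based bound: {worst_case:.4f}")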
|
66 |
Modeling and Inference for Multivariate Time Series, with Applications to Integer-Valued Processes and Nonstationary Extreme Data. Guerrero, Matheus B. 04 1900.
This dissertation proposes new statistical methods for modeling and inference for two specific types of time series: integer-valued data and multivariate nonstationary extreme data. We rely on the class of integer-valued autoregressive (INAR) processes for the former, proposing a novel, flexible and elegant way of modeling count phenomena. As for the latter, we are interested in the human brain and its multi-channel electroencephalogram (EEG) recordings, a natural source of extreme events. Thus, we develop new extreme value theory methods for analyzing such data, whether in modeling the conditional extremal dependence for brain connectivity or clustering extreme brain communities of EEG channels. Regarding integer-valued time series, INAR processes are generally defined by specifying the thinning operator and either the innovations or the marginal distributions. The major limitations of such processes include difficulties deriving the marginal properties and justifying the choice of the thinning operator. To overcome these drawbacks, this dissertation proposes a novel approach for building an INAR model that offers the flexibility to prespecify both marginal and innovation distributions. Thus, the thinning operator is no longer subjectively selected but is rather a direct consequence of the marginal and innovation distributions specified by the modeler. Novel INAR processes are introduced following this perspective; these processes include a model with geometric marginal and innovation distributions (Geo-INAR) and models with bounded innovations. We explore the Geo-INAR model, which is a natural alternative to the classical Poisson INAR model. The Geo-INAR process has interesting stochastic properties, such as MA($\infty$) representation, time reversibility, and closed forms for the $h$-th-order transition probabilities, which enable a natural framework to perform coherent forecasting. On the front of multivariate nonstationary extreme data, the focus lies on multi-channel epilepsy data. Epilepsy is a chronic neurological disorder affecting more than 50 million people globally. An epileptic seizure acts like a temporary shock to the neuronal system, disrupting normal electrical activity in the brain. Epilepsy is frequently diagnosed with EEGs. Current statistical approaches for analyzing EEGs use spectral and coherence analysis, which do not focus on extreme behavior in EEGs (such as bursts in amplitude), neglecting that neuronal oscillations exhibit non-Gaussian heavy-tailed probability distributions. To overcome this limitation, this dissertation proposes new approaches to characterize brain connectivity based on extremal features of EEG signals. Two extreme-value methods to study alterations in the brain network are proposed. One method is Conex-Connect, a pioneering approach linking the extreme amplitudes of a reference EEG channel with the other channels in the brain network. The other method is Club Exco, which clusters multi-channel EEG data based on a spherical $k$-means procedure applied to the "pseudo-angles," derived from extreme amplitudes of EEG signals. Both methods provide new insights into how the brain network organizes itself during an extreme event, such as an epileptic seizure, in contrast to a baseline state.
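For context, a minimal sketch of the classical Poisson INAR(1) baseline mentioned above, built from binomial thinning plus Poisson innovations so that the marginal is Poisson; the Geo-INAR and bounded-innovation models proposed in the dissertation are not reproduced here, and the parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(5)

def simulate_poisson_inar1(n, alpha, lam):
    """X_t = alpha o X_{t-1} + eps_t, with eps_t ~ Poisson(lam * (1 - alpha))."""
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam)
    for t in range(1, n):
        survivors = rng.binomial(x[t - 1], alpha)          # binomial thinning
        x[t] = survivors + rng.poisson(lam * (1 - alpha))  # Poisson innovation
    return x

series = simulate_poisson_inar1(n=1000, alpha=0.6, lam=5.0)
print("sample mean:", series.mean(), "(theoretical marginal mean: 5.0)")
print("lag-1 autocorrelation:", np.corrcoef(series[:-1], series[1:])[0, 1])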
|
67 |
Velocity Fluctuations and Extreme Events in Microscopic Traffic Data. Piepel, Moritz. 06 December 2022.
Vehicle velocity distributions are of utmost relevance for the efficiency, safety, and sustainability of road traffic. Yet, due to technical limitations, they are often empirically analyzed using spatiotemporal averages. Here, we instead study a novel set of microscopic traffic data from Dresden comprising 346 million data points with single-vehicle resolution from 145 detector sites, with a particular focus on extreme events and distribution tails. By fitting q-exponential and Generalized Extreme Value distributions to the right flank of the empirical velocity distributions, we establish that their tails universally exhibit power-law behavior with similar decay exponents. We also find that q-exponentials are best suited to model the vast extent of speed limit violations in the data. Furthermore, combining velocity and time headway distributions, we obtain estimates for free-flow velocities that always exceed average velocities and sometimes even significantly exceed speed limits. Likewise, congestion effects are found to play a very minor, almost negligible role in traffic flow at the detector sites. These results provide insights into the current state of traffic in Dresden, hinting toward potentially necessary policy amendments regarding road design, speed limits, and speeding prosecution. They also reveal the potential and limitations of the data set at hand and thereby lay the groundwork for further, more detailed traffic analyses.
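A small sketch of the q-exponential density referred to above and its power-law tail, whose decay exponent is 1/(q-1) for q greater than 1; the parameter values are illustrative and are not fits to the Dresden data.

import numpy as np

def q_exponential_pdf(x, q, lam):
    """Tsallis q-exponential density for q in (1, 2), x >= 0."""
    return (2 - q) * lam * (1 + (q - 1) * lam * x) ** (-1 / (q - 1))

q, lam = 1.3, 0.1
x = np.array([1e3, 1e4, 1e5])
pdf = q_exponential_pdf(x, q, lam)

# For large x the density ratio between x and 10x approaches 10**(1/(q-1)),
# confirming power-law decay with exponent 1/(q-1) (about 3.33 for q = 1.3)
print("observed ratios:", pdf[:-1] / pdf[1:])
print("asymptotic ratio:", 10 ** (1 / (q - 1)))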
|
68 |
Computational Simulation and Machine Learning for Quality Improvement in Composites Assembly. Lutz, Oliver Tim. 22 August 2023.
In applications spanning aerospace, marine, automotive, energy, and space travel domains, composite materials have become ubiquitous because of their superior stiffness-to-weight ratios as well as corrosion and fatigue resistance. However, from a manufacturing perspective, these advanced materials have introduced new challenges that demand the development of new tools. Due to the complex anisotropic and nonlinear material properties, composite materials are more difficult to model than conventional materials such as metals and plastics. Furthermore, there exist ultra-high precision requirements in safety-critical applications that are yet to be reliably met in production. Towards developing new tools addressing these challenges, this dissertation aims to (i) build high-fidelity numerical simulations of composite assembly processes, (ii) bridge these simulations to machine learning tools, and (iii) apply data-driven solutions to process control problems while identifying and overcoming their shortcomings. This is accomplished in case studies that model the fixturing, shape control, and fastening of composite fuselage components. Therein, simulation environments are created that interact with novel implementations of modified proximal policy optimization, based on a newly developed reinforcement learning algorithm. The resulting reinforcement learning agents are able to successfully address the optimization problems that underpin the process and quality requirements. / Doctor of Philosophy / Within the manufacturing domain, there has been a concerted effort to transition towards Industry 4.0. To a large degree, this term refers to Klaus Schwab's vision presented at the World Economic Forum in 2015, in which he outlined fundamental systemic changes that would incorporate ubiquitous computing, artificial intelligence (AI), big data, and the internet-of-things (IoT) into all aspects of productive activities within the economy. Schwab argues that rapid change will be driven by fusing these new technologies in existing and emerging applications. However, this process has only just begun and there still exist many challenges to realize the promise of Industry 4.0. One such challenge is to create computer models that are not only useful during early design stages of a product, but that are connected to its manufacturing processes, thereby guiding and informing decisions in real-time. This dissertation explores such scenarios in the context of composite structure assembly in aerospace manufacturing. It aims to link computer simulations that characterize the assembly of product components with their physical counterparts, and provides data-driven solutions to control problems that cannot typically be solved without tedious trial-and-error approaches or expert knowledge.
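For reference, a minimal sketch of the clipped surrogate objective at the core of standard proximal policy optimization, which the dissertation modifies; this is the textbook PPO-clip loss evaluated on toy inputs, not the modified algorithm or the assembly simulation environments.

import numpy as np

def ppo_clip_objective(ratio, advantage, eps=0.2):
    """Clipped PPO objective per sample.

    ratio:     pi_new(a|s) / pi_old(a|s), the policy probability ratio
    advantage: estimated advantage A(s, a)
    eps:       clipping parameter limiting the effective policy update step
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1 - eps, 1 + eps) * advantage
    return np.minimum(unclipped, clipped)   # objective to be maximized

# With a positive advantage, ratios beyond 1 + eps stop increasing the
# objective, which discourages destructively large policy updates.
print(ppo_clip_objective(np.array([0.5, 1.0, 1.5, 3.0]), advantage=1.0))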
|
69 |
Wireless Network Dimensioning and Provisioning for Ultra-reliable Communication: Modeling and Analysis. Gomes Santos Goncalves, Andre Vinicius. 28 November 2023.
A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services such as ultra-reliable low-latency communication (URLLC) and hyper-reliable low-latency communication (HRLLC), the staple mission-critical services in IMT-2020 (5G) and IMT-2030 (6G), for which reliable and resilient communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. A natural way of increasing reliability and reducing latency is to provision additional network resources to compensate for uncertainty in wireless networks caused by fading, interference, mobility, and time-varying network load, among others. Thus, an important step to enable mission-critical services is to identify and quantify what it takes to support ultra-reliable communication in mobile networks -- a process often referred to as dimensioning. This dissertation focuses on resource dimensioning, notably spectrum, for ultra-reliable wireless communication. This dissertation proposes a set of methods for spectrum dimensioning based on concepts from risk analysis, extreme value theory, and meta distributions. These methods reveal that each "nine" in reliability (e.g., five-nines in 99.999%) roughly translates into an order of magnitude increase in the required bandwidth. In ultra-reliability regimes, the required bandwidth can be in the order of tens of gigahertz, far beyond what is typically available in today's networks, making it challenging to provision resources for ultra-reliable communication. Accordingly, this dissertation also investigates alternative approaches to provide resources to enable ultra-reliable communication services in mobile networks. Particularly, this dissertation considers multi-operator network sharing and multi-connectivity as alternatives to make additional network resources available to enhance network reliability and proposes multi-operator connectivity sharing, which combines multi-operator network sharing with multi-connectivity. Our studies, based on simulations, real-world data analysis, and mathematical models, suggest that multi-operator connectivity sharing -- in which mobiles multi-connect to base stations of operators in a sharing arrangement -- can reduce the required bandwidth significantly because underlying operators tend to exhibit characteristics attractive to reliability, such as complementary coverage during periods of impaired connectivity, facilitating the support for ultra-reliable communication in future mobile networks. / Doctor of Philosophy / A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services in 5G and 6G, for which ultra-reliable communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. Reliability often comes at the cost of additional network resources to compensate for uncertainty in wireless networks. Thus, an important step to enable ultra-reliable communication is to identify and quantify what it takes to support mission-critical services in mobile networks -- a process often denoted as dimensioning.
This dissertation focuses on spectrum dimensioning and proposes a set of methods to identify suitable spectrum bands and required bandwidth for ultra-reliable communication.
These methods reveal that the spectrum needs for ultra-reliable communication can be beyond what is typically available in today's networks, making it challenging to provide adequate resources to support ultra-reliable communication services in mobile networks. Alternatively, we propose multi-operator connectivity sharing: mobiles simultaneously connect to multiple base stations of different operators. Our studies suggest that multi-operator connectivity sharing can significantly reduce the spectrum needs in ultra-reliability regimes, making it an attractive alternative for enabling ultra-reliable communication in future mobile networks.
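A back-of-the-envelope illustration of the scaling reported above, in which each additional "nine" of reliability costs roughly an order of magnitude of bandwidth; the baseline bandwidth is a purely hypothetical figure chosen for demonstration and is not a result from the dissertation.

# Hypothetical baseline: bandwidth needed at one "nine" (90%) of reliability
base_bandwidth_mhz = 5.0

for nines in range(1, 6):
    reliability = 1 - 10 ** (-nines)
    # One order of magnitude more bandwidth per additional nine
    required = base_bandwidth_mhz * 10 ** (nines - 1)
    print(f"{reliability:.5f} reliability -> roughly {required:,.0f} MHz")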
|
70 |
Estimating Peaking Factors with Poisson Rectangular Pulse Model and Extreme Value Theory. Zhang, Xiaoyi. 27 September 2005.
No description available.
|