
[pt] A TEORIA DOS VALORES EXTREMOS: UMA ABORDAGEM CONDICIONAL PARA A ESTIMAÇÃO DE VALOR EM RISCO NO MERCADO ACIONÁRIO BRASILEIRO / [en] EXTREME VALUE THEORY: A CONDITIONAL APPROACH FOR VALUE AT RISK ESTIMATION IN THE BRAZILIAN STOCK MARKET

FLAVIA COUTINHO MARTINS 03 November 2009 (has links)
[en] The existence of fat tails is one of the most pronounced stylized facts of financial return distributions. This makes traditional parametric Value at Risk (VaR) models unsuitable for estimating VaR at low probability levels (1% or less), since they rely on the assumption of conditionally normal returns and therefore cannot capture the actual probabilities of atypical returns. The purpose of this dissertation is to investigate the performance of VaR models based on Extreme Value Theory (EVT) and to compare them with traditional models. Two classes of models are investigated. The first is an unconditional model, which characterizes the long-term behavior of the return series. The second is a conditional model, suggested by McNeil and Frey (1999), which incorporates the short-term behavior of the return series, characterized by the strong dependence observed in the conditional variance of returns. Both models were applied to four representative return series of the Brazilian stock market: the Bovespa index, Bovespa index futures, Telesp stocks, and Petrobrás stocks. The results indicate that EVT-based models are better suited to modeling the tails, and hence to VaR estimation, when the probability levels of interest are low. Moreover, the conditional model is more appropriate in crisis periods because, unlike the unconditional model, it responds quickly to changes in volatility. Risk measures such as the mean and median losses are also proposed, to provide estimates of the loss magnitude in the case of a VaR violation.
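The peaks-over-threshold idea behind EVT-based VaR can be sketched in miniature. The following is a minimal unconditional illustration — a moment-based GPD fit on simulated Student-t losses — not the dissertation's conditional McNeil-Frey model; the threshold choice, sample, and parameters are all hypothetical.

```python
import math
import random

def fit_gpd_moments(excesses):
    """Fit a Generalized Pareto Distribution to threshold excesses
    by the method of moments (a rough but closed-form estimator)."""
    n = len(excesses)
    m = sum(excesses) / n
    v = sum((x - m) ** 2 for x in excesses) / (n - 1)
    xi = 0.5 * (1.0 - m * m / v)        # shape
    beta = 0.5 * m * (m * m / v + 1.0)  # scale
    return xi, beta

def pot_var(losses, threshold_q, p):
    """Peaks-over-threshold VaR at probability p: fit a GPD to losses
    above an empirical quantile u and extrapolate into the tail."""
    n = len(losses)
    u = sorted(losses)[int(threshold_q * n)]
    excesses = [x - u for x in losses if x > u]
    xi, beta = fit_gpd_moments(excesses)
    nu = len(excesses)
    # GPD tail formula: VaR_p = u + (beta/xi) * ((n/nu * p)^(-xi) - 1)
    return u + (beta / xi) * ((n / nu * p) ** (-xi) - 1.0)

random.seed(42)
# Hypothetical heavy-tailed daily losses (Student-t with 4 d.o.f.)
def student_t(df):
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

losses = [student_t(4) for _ in range(5000)]
var_5 = pot_var(losses, 0.95, 0.05)  # 5% VaR
var_1 = pot_var(losses, 0.95, 0.01)  # 1% VaR, further out in the tail
```

Because the GPD is fitted only to tail excesses, the 1% estimate extrapolates beyond the data's center, which is exactly what normality-based models fail to do.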

Computational Simulation and Machine Learning for Quality Improvement in Composites Assembly

Lutz, Oliver Tim 22 August 2023 (has links)
In applications spanning the aerospace, marine, automotive, energy, and space travel domains, composite materials have become ubiquitous because of their superior stiffness-to-weight ratios as well as their corrosion and fatigue resistance. However, from a manufacturing perspective, these advanced materials have introduced new challenges that demand the development of new tools. Due to their complex anisotropic and nonlinear material properties, composite materials are more difficult to model than conventional materials such as metals and plastics. Furthermore, there exist ultra-high precision requirements in safety-critical applications that are yet to be reliably met in production. Towards developing new tools addressing these challenges, this dissertation aims to (i) build high-fidelity numerical simulations of composite assembly processes, (ii) bridge these simulations to machine learning tools, and (iii) apply data-driven solutions to process control problems while identifying and overcoming their shortcomings. This is accomplished in case studies that model the fixturing, shape control, and fastening of composite fuselage components. Therein, simulation environments are created that interact with a newly developed reinforcement learning algorithm based on modified proximal policy optimization. The resulting reinforcement learning agents successfully address the optimization problems that underpin the process and quality requirements. / Doctor of Philosophy / Within the manufacturing domain, there has been a concerted effort to transition towards Industry 4.0. To a large degree, this term refers to Klaus Schwab's vision presented at the World Economic Forum in 2015, in which he outlined fundamental systemic changes that would incorporate ubiquitous computing, artificial intelligence (AI), big data, and the internet-of-things (IoT) into all aspects of productive activities within the economy.
Schwab argues that rapid change will be driven by fusing these new technologies in existing and emerging applications. However, this process has only just begun and there still exist many challenges to realize the promise of Industry 4.0. One such challenge is to create computer models that are not only useful during early design stages of a product, but that are connected to its manufacturing processes, thereby guiding and informing decisions in real-time. This dissertation explores such scenarios in the context of composite structure assembly in aerospace manufacturing. It aims to link computer simulations that characterize the assembly of product components with their physical counterparts, and provides data-driven solutions to control problems that cannot typically be solved without tedious trial-and-error approaches or expert knowledge.

Wireless Network Dimensioning and Provisioning for Ultra-reliable Communication: Modeling and Analysis

Gomes Santos Goncalves, Andre Vinicius 28 November 2023 (has links)
A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services such as ultra-reliable low-latency communication (URLLC) and hyper-reliable low-latency communication (HRLLC), the staple mission-critical services in IMT-2020 (5G) and IMT-2023 (6G), for which reliable and resilient communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. A natural way of increasing reliability and reducing latency is to provision additional network resources to compensate for uncertainty in wireless networks caused by fading, interference, mobility, and time-varying network load, among others. Thus, an important step to enable mission-critical services is to identify and quantify what it takes to support ultra-reliable communication in mobile networks -- a process often referred to as dimensioning. This dissertation focuses on resource dimensioning, notably spectrum, for ultra-reliable wireless communication. This dissertation proposes a set of methods for spectrum dimensioning based on concepts from risk analysis, extreme value theory, and meta distributions. These methods reveal that each ``nine'' in reliability (e.g., five-nines in 99.999%) roughly translates into an order of magnitude increase in the required bandwidth. In ultra-reliability regimes, the required bandwidth can be in the order of tens of gigahertz, far beyond what is typically available in today's networks, making it challenging to provision resources for ultra-reliable communication. Accordingly, this dissertation also investigates alternative approaches to provide resources to enable ultra-reliable communication services in mobile networks. 
Particularly, this dissertation considers multi-operator network sharing and multi-connectivity as alternatives to make additional network resources available to enhance network reliability and proposes multi-operator connectivity sharing, which combines multi-operator network sharing with multi-connectivity. Our studies, based on simulations, real-world data analysis, and mathematical models, suggest that multi-operator connectivity sharing -- in which mobiles multi-connect to base stations of operators in a sharing arrangement -- can reduce the required bandwidth significantly because underlying operators tend to exhibit characteristics attractive to reliability, such as complementary coverage during periods of impaired connectivity, facilitating the support for ultra-reliable communication in future mobile networks. / Doctor of Philosophy / A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services in 5G and 6G, for which ultra-reliable communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. Reliability often comes at the cost of additional network resources to compensate for uncertainty in wireless networks. Thus, an important step to enable ultra-reliable communication is to identify and quantify what it takes to support mission-critical services in mobile networks -- a process often denoted as dimensioning. This dissertation focuses on spectrum dimensioning and proposes a set of methods to identify suitable spectrum bands and required bandwidth for ultra-reliable communication. 
These methods reveal that the spectrum needs for ultra-reliable communication can be beyond what is typically available in today's networks, making it challenging to provide adequate resources to support ultra-reliable communication services in mobile networks. Alternatively, we propose multi-operator connectivity sharing: mobiles simultaneously connect to multiple base stations of different operators. Our studies suggest that multi-operator connectivity sharing can reduce the spectrum needs in ultra-reliability regimes significantly, being an attractive alternative to enable ultra-reliable communication in future mobile networks.
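The "order of magnitude of bandwidth per nine of reliability" observation can be reproduced in a toy model. The sketch below assumes a single Rayleigh-faded link that is in outage whenever instantaneous capacity falls below the target rate — far simpler than the dissertation's risk-analysis, EVT, and meta-distribution methods, and the rate and SNR values are hypothetical — but it exhibits the same scaling.

```python
import math

def required_bandwidth_hz(rate_bps, snr_linear, reliability):
    """Bandwidth W such that outage probability <= 1 - reliability on a
    Rayleigh-faded link: P_out = P(W*log2(1 + SNR*h) < R) with h ~ Exp(1)."""
    eps = 1.0 - reliability
    # Outage condition: h < (2^(R/W) - 1)/SNR, so P_out = 1 - exp(-(2^(R/W)-1)/SNR).
    # Setting P_out = eps and solving for W:
    return rate_bps / math.log2(1.0 + snr_linear * (-math.log(1.0 - eps)))

rate = 1e6   # 1 Mb/s target rate (illustrative)
snr = 10.0   # 10 dB average SNR (illustrative)
for nines in range(2, 7):
    rel = 1.0 - 10.0 ** (-nines)
    print(nines, "nines ->", required_bandwidth_hz(rate, snr, rel) / 1e6, "MHz")

w4 = required_bandwidth_hz(rate, snr, 1.0 - 1e-4)
w5 = required_bandwidth_hz(rate, snr, 1.0 - 1e-5)
```

In this toy model, six nines at a 1 Mb/s target already demand on the order of tens of gigahertz, and each additional nine multiplies the required bandwidth by roughly ten — consistent with the dissertation's qualitative finding.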

ESTIMATING PEAKING FACTORS WITH POISSON RECTANGULAR PULSE MODEL AND EXTREME VALUE THEORY

ZHANG, XIAOYI 27 September 2005 (has links)
No description available.

Actuarial modelling of extremal events using transformed generalized extreme value distributions and generalized Pareto distributions

Han, Zhongxian 14 October 2003 (has links)
No description available.

Brown-Resnick Processes: Analysis, Inference and Generalizations

Engelke, Sebastian 14 December 2012 (has links)
No description available.

Monte Carlo Simulation Based Response Estimation and Model Updating in Nonlinear Random Vibrations

Radhika, Bayya January 2012 (has links) (PDF)
The study of randomly excited nonlinear dynamical systems forms the focus of this thesis. We discuss two classes of problems: first, the characterization of the nonlinear random response of a system before it comes into existence and, second, the assimilation of measured responses into the mathematical model of the system after it comes into existence. The first class of problems constitutes forward problems, while the latter belongs to the class of inverse problems. An outstanding feature of these problems is that they are almost never amenable to exact solutions. In the present study we tackle these two classes of problems using Monte Carlo simulation tools in conjunction with Markov process theory, Bayesian model updating strategies, and particle filtering based dynamic state estimation methods.

It is well recognized in the literature that any successful application of Monte Carlo simulation methods to practical problems requires the simulation methods to be reinforced with effective means of controlling sampling variance. This can be achieved by incorporating any problem-specific qualitative and (or) quantitative information that one might have about system behavior into the estimators for response quantities of interest. In the present thesis we outline two such approaches for variance reduction. The first employs a substructuring scheme, which partitions the system states into two sets such that the probability distribution of the states in one set, conditioned on the other set, becomes amenable to an exact analytical solution. In the second approach, results from data-based asymptotic extreme value analysis are employed to tackle problems of time-variant reliability analysis and the updating of this reliability.
We exemplify in this thesis the proposed approaches for response estimation and model updating by considering wide-ranging problems of interest in structural engineering, namely, nonlinear response and reliability analyses under stationary and (or) nonstationary random excitations, response sensitivity model updating, force identification, residual displacement analysis in instrumented inelastic structures under transient excitations, problems of dynamic state estimation in systems with local nonlinearities, and time-variant reliability analysis and reliability model updating.

We have organized the thesis into eight chapters and three appendices; a résumé of their contents follows. In the first chapter we provide an overview of the mathematical tools that form the basis for the investigations reported in the thesis. The starting point of the study is taken to be a set of coupled stochastic differential equations, obtained after discretizing the spatial variables, typically based on the application of finite element methods. Accordingly, we provide a summary of the following topics: (a) the Markov vector approach for characterizing the time evolution of transition probability density functions, which includes the forward and backward Kolmogorov equations, (b) the equations governing the time evolution of response moments and first passage times, (c) numerical discretization of the governing stochastic differential equations using the Ito-Taylor expansion, (d) the partial differential equation governing the time evolution of transition probability density functions conditioned on measurements, for the study of existing instrumented structures, (e) the time evolution of response moments conditioned on measurements, based on the governing equations in (d), and (f) functional recursions for the evolution of the multidimensional posterior probability density function and the posterior filtering density function when the time variable is also discretized.
The objective of the description here is to provide an outline of the theoretical formulations within which the problems of response estimation and model updating are formulated in the subsequent chapters. We briefly state the class of problems that are amenable to exact solutions, and we also list the major textbooks, research monographs, and review papers relevant to nonlinear random vibration analysis and dynamic state estimation.

In Chapter 2 we review the literature on solutions of problems of response analysis and model updating in nonlinear dynamical systems. The main focus of the review is on Monte Carlo simulation based methods. The review accordingly covers numerical methods for approximate solutions of Kolmogorov equations and the associated moment equations, variance reduction in simulation based analysis of Markovian systems, dynamic state estimation methods based on the Kalman filter and its variants, particle filtering, and variance reduction based on Rao-Blackwellization. We chiefly cover papers that have contributed to the growth of the methodology, and briefly cover the efforts made in applying these ideas to structural engineering problems. Based on this review, we identify variance reduction using substructuring schemes and data-based extreme value analysis, and their incorporation into response estimation and model updating strategies, as problems requiring further research attention, and we identify a range of problems where these tools could be applied.

In Chapter 3 we consider the development of a sequential Monte Carlo scheme that incorporates a substructuring strategy for the analysis of nonlinear dynamical systems under random excitations. The proposed substructuring ensures that a part of the system states, conditioned on the remaining states, becomes Gaussian distributed and is amenable to an exact analytical solution.
The use of Monte Carlo simulation is subsequently limited to the analysis of the remaining system states. This clearly reduces sampling variance, since a part of the problem is tackled analytically in an exact manner. The successful performance of the proposed approach is illustrated through the response analysis of a single degree of freedom nonlinear oscillator under random excitations. Arguments based on the variance decomposition result and the Rao-Blackwell theorem are presented to demonstrate that the proposed variance reduction is indeed effective.

In Chapter 4, we modify the sequential Monte Carlo simulation strategy of the preceding chapter to incorporate questions of dynamic state estimation when measured response data become available. Here too, the system states are partitioned into two groups such that the states in one group become Gaussian distributed when conditioned on the states in the other group. The conditioned Gaussian states are analyzed exactly using the Kalman filter, and this is interfaced with the analysis of the remaining states using a sequential importance sampling based filtering strategy. The development of this combined Kalman and sequential importance sampling filtering method constitutes one of the novel elements of this study. The proposed strategy is validated on problems of dynamic state estimation in linear single and multi-degree of freedom systems for which exact analytical solutions exist.

In Chapter 5, we apply the tools developed in Chapter 4 to a class of wide-ranging problems in the nonlinear random vibrations of existing systems. The nonlinear systems considered include single and multi-degree of freedom systems, systems with memoryless and hereditary nonlinearities, and stationary and nonstationary random excitations.
The specific applications considered include nonlinear dynamic state estimation in systems with local nonlinearities, estimation of residual displacement in instrumented inelastic dynamical systems under transient random excitations, response sensitivity model updating, and identification of transient seismic base motions based on measured responses in inelastic systems. Solutions from the proposed substructuring scheme are compared with corresponding results from the direct application of particle filtering, and a satisfactory mutual agreement is demonstrated.

We next consider questions of time-variant reliability analysis and the corresponding model updating in Chapters 6 and 7, respectively. The research effort in these studies focuses on the application of data-based asymptotic extreme value analysis to the problems at hand. Accordingly, in Chapter 6 we investigate the reliability of nonlinear vibrating systems under stochastic excitations using a two-stage Monte Carlo simulation strategy. For systems with white noise excitation, the governing equations of motion are interpreted as a set of Ito stochastic differential equations. It is assumed that the probability distribution of the maximum response over a specified time duration in the steady state belongs to the basin of attraction of one of the classical asymptotic extreme value distributions. The first stage of the solution strategy consists of selecting the form of the extreme value distribution based on hypothesis testing, and the next stage involves estimating the parameters of the relevant extreme value distribution. Both stages are implemented using data from limited Monte Carlo simulations of the system response. The proposed procedure is illustrated with examples of linear/nonlinear systems with single/multiple degrees of freedom driven by random excitations.
The predictions from the proposed method are compared with results from large scale Monte Carlo simulations, and also with the classical analytical results, when available, from the theory of out-crossing statistics. Applications of the proposed method to vibration data obtained under laboratory conditions are also discussed.

In Chapter 7 we consider the problem of time-variant reliability analysis of existing structures subjected to stationary random dynamic excitations. Here we assume that samples of the dynamic response of the structure, under the action of external excitations, have been measured at a set of sparse points on the structure. We consider the utilization of these measurements in updating reliability models postulated prior to making any measurements. This is achieved by using dynamic state estimation methods that combine results from Markov process theory and Bayes' theorem. The uncertainties present in the measurements as well as in the postulated model for the structural behaviour are accounted for. The samples of external excitations are taken to emanate from known stochastic models, and allowance is made for the ability (or lack of it) to measure the applied excitations. The future reliability of the structure is modeled using the expected structural response conditioned on all the measurements made. This expected response is shown to have a time-varying mean and a random component that can be treated as weakly stationary. For linear systems, an approximate analytical solution for the problem of reliability model updating is obtained by combining the theories of the discrete Kalman filter and level crossing statistics. For nonlinear systems, the problem is tackled by combining particle filtering strategies with data-based extreme value analysis. The possibility of using conditional simulation strategies, when the applied external actions are measured, is also considered.
The proposed procedures are exemplified by considering the reliability analysis of a few low-dimensional dynamical systems based on synthetically generated measurement data. The performance of the procedures developed is also assessed based on a limited amount of pertinent Monte Carlo simulations.

A summary of the contributions made and a few suggestions for future work are presented in Chapter 8. The thesis also contains three appendices. Appendix A provides details of the order 1.5 strong Taylor scheme that is employed extensively throughout the thesis. The formulary pertaining to the bootstrap and sequential importance sampling particle filters is provided in Appendix B. Some of the results on characterizing conditional probability density functions that are used in the development of the combined Kalman and sequential importance sampling filter in Chapter 4 are elaborated in Appendix C.
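The "data based asymptotic extreme value analysis" step described above can be sketched in miniature: run a limited number of simulated response episodes, fit an extreme value distribution to the episode maxima, and extrapolate a failure probability over a longer exposure. The sketch below assumes a Gumbel basin of attraction and uses a crude AR(1) process as a stand-in for the oscillator response; every modeling choice here is illustrative, not the thesis's actual procedure.

```python
import math
import random

def gumbel_fit_moments(maxima):
    """Moment-based Gumbel fit to a small sample of episode maxima:
    scale from the standard deviation, location from the mean."""
    n = len(maxima)
    m = sum(maxima) / n
    s = math.sqrt(sum((x - m) ** 2 for x in maxima) / (n - 1))
    beta = s * math.sqrt(6.0) / math.pi
    return m - 0.5772 * beta, beta   # (location, scale)

random.seed(1)
# Stand-in for limited Monte Carlo runs of a randomly excited system:
# each "episode" is 500 steps of a stationary AR(1) response.
episode_maxima = []
for _ in range(50):
    x, peak = 0.0, 0.0
    for _ in range(500):
        x = 0.9 * x + random.gauss(0.0, 1.0)
        peak = max(peak, abs(x))
    episode_maxima.append(peak)

mu, beta = gumbel_fit_moments(episode_maxima)

def failure_probability(barrier, episodes):
    """P(max response exceeds the barrier at least once over `episodes`
    independent episodes), from the fitted Gumbel maximum distribution."""
    f_one = math.exp(-math.exp(-(barrier - mu) / beta))  # P(peak <= barrier)
    return 1.0 - f_one ** episodes

p_short = failure_probability(12.0, 10)
p_long = failure_probability(12.0, 100)
```

Only 50 short simulations are used, yet the fitted distribution yields exceedance probabilities for exposures far beyond the simulated duration — the essence of the variance-reduction role extreme value analysis plays in the thesis.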

Modeling Extreme Values / Modelování extrémních hodnot

Shykhmanter, Dmytro January 2013 (has links)
Modeling extreme events is a challenging statistical task: the number of observations is always limited, and there is therefore little experience against which to back-test the results. One way of estimating higher quantiles is to fit a theoretical distribution to the data and extrapolate into the tail. The shortcoming of this approach is that the tail estimate is driven by observations in the center of the distribution. An alternative approach is to split the data into two sub-populations and model the body of the distribution separately from the tail. This methodology is applied to non-life insurance losses, where extremes are particularly important for risk management. Nevertheless, even this approach is not a conclusive solution to heavy-tail modeling: in either case, the estimated 99.5% percentiles have such high standard errors that their reliability is very low. On the other hand, the approach is theoretically valid and deserves to be considered as one of the possible methods of extreme value analysis.
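The body/tail split can be sketched with a Hill-type tail estimator: the empirical distribution handles the body, while only the largest order statistics drive the tail quantile. This is an illustrative choice of tail estimator (the thesis fits parametric distributions), and the claim data below are simulated.

```python
import math
import random

def hill_quantile(losses, k, p):
    """Estimate the (1-p) quantile by modeling only the top-k order
    statistics with a Pareto tail (Hill estimator); the body of the
    distribution is left to the empirical CDF."""
    x = sorted(losses, reverse=True)
    xk = x[k]  # threshold: the (k+1)-th largest observation
    alpha_inv = sum(math.log(x[i] / xk) for i in range(k)) / k
    # Weissman-type tail extrapolation beyond the threshold
    return xk * (k / (len(losses) * p)) ** alpha_inv

random.seed(7)
# Hypothetical claim severities with a Pareto(alpha = 2) tail
claims = [(1.0 - random.random()) ** (-0.5) for _ in range(10000)]
q990 = hill_quantile(claims, 200, 0.010)   # 99.0% percentile
q995 = hill_quantile(claims, 200, 0.005)   # 99.5% percentile
```

The abstract's caveat shows up directly here: the 99.5% estimate rests on only 200 tail observations, so its standard error is large, and the choice of k materially moves the answer.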

Rozdělení extrémních hodnot a jejich aplikace / Extreme Value Distributions with Applications

Fusek, Michal January 2013 (has links)
The thesis is focused on extreme value distributions and their applications. Firstly, the basics of extreme value theory for one-dimensional observations are summarized. Using the limit theorem for the distribution of the maximum, the three extreme value distributions (Gumbel, Fréchet, Weibull) are introduced and their domains of attraction are described. Two models for parametric function estimation are introduced, based on the generalized extreme value distribution (block maxima model) and the generalized Pareto distribution (threshold model). Parameter estimates for these distributions are derived using the method of maximum likelihood and the probability weighted moment method. The described methods are used for the analysis of rainfall data in the Brno Region. Further attention is paid to the Gumbel class of distributions, which is frequently used in practice. Methods for statistical inference from multiply left-censored samples from the exponential and Weibull distributions under type I censoring are developed and subsequently used in the analysis of synthetic musk compound concentrations. The last part of the thesis deals with extreme value theory for two-dimensional observations. Demonstration software for the extreme value distributions was developed as part of this thesis.
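The block maxima model with probability-weighted-moment estimation mentioned above can be sketched as follows. The annual-maximum series is a synthetic stand-in (a stratified Gumbel grid, so the fitted shape should be near zero), and Hosking's rational approximation is used for the shape parameter; none of this reproduces the thesis's actual Brno data.

```python
import math

def fit_gev_pwm(maxima):
    """Fit a GEV distribution by probability-weighted moments
    (Hosking's estimators; the shape k uses his rational approximation,
    with the convention k = -xi)."""
    x = sorted(maxima)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((j / (n - 1)) * x[j] for j in range(n)) / n
    b2 = sum((j * (j - 1)) / ((n - 1) * (n - 2)) * x[j] for j in range(n)) / n
    c = (2 * b1 - b0) / (3 * b2 - b0) - math.log(2) / math.log(3)
    k = 7.8590 * c + 2.9554 * c * c
    g = math.gamma(1.0 + k)
    sigma = (2 * b1 - b0) * k / (g * (1.0 - 2.0 ** (-k)))
    mu = b0 + sigma * (g - 1.0) / k
    return mu, sigma, k

def return_level(mu, sigma, k, T):
    """Level exceeded on average once every T blocks (e.g. years)."""
    y = -math.log(1.0 - 1.0 / T)
    return mu + sigma / k * (1.0 - y ** k)

# Hypothetical annual-maximum series: Gumbel(loc=20, scale=5) quantiles
# on a deterministic stratified grid (stands in for observed maxima).
maxima = [20.0 - 5.0 * math.log(-math.log((i + 0.5) / 100)) for i in range(100)]
mu, sigma, k = fit_gev_pwm(maxima)
z10 = return_level(mu, sigma, k, 10)    # 10-year return level
z100 = return_level(mu, sigma, k, 100)  # 100-year return level
```

The same fitted parameters then give any return level by inverting the GEV quantile function, which is exactly how block maxima fits are used for design rainfall estimates.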

Updating Rainfall Intensity-Duration-Frequency Curves in Sweden Accounting for the Observed Increase in Rainfall Extremes / Uppdatering av Intensitets-Varaktighetskurvor i Sverige med hänsyn till observerade ökande trender av extrem nederbörd

Eckersten, Sofia January 2016 (has links)
Increased extreme precipitation has been documented in many regions around the world, including central and northern Europe. Global warming increases average temperature, which in turn enhances the atmospheric water-holding capacity. These changes are believed to increase the frequency and/or intensity of extreme precipitation events. In determining the design storm, or worst probable storm, for infrastructure design and failure risk assessment, experts commonly assume that the statistics of extreme precipitation do not change significantly over time. This so-called notion of stationarity assumes that the statistics of future extreme precipitation events will be similar to those of historical observations. This study investigates the consequences of the stationarity assumption as well as the alternative: a non-stationary framework that considers temporal changes in the statistics of extremes. Here we evaluate stationary and non-stationary return levels for 10-year to 50-year extreme precipitation events of different durations (1-day, 2-day, ..., 7-day precipitation events), based on observed daily precipitation from Sweden. Non-stationary frequency analysis is considered only for stations with statistically significant trends over the past 50 years at 95% confidence (15% to 39% of 139 stations, depending on duration). We estimate non-stationary return levels using the Generalized Extreme Value distribution with time-dependent parameters, inferred using a Bayesian approach. The estimated return levels are then compared in terms of duration, recurrence interval and location. The results indicate that a stationary assumption might, when a significant trend exists, underestimate extreme precipitation return levels by up to 40% in Sweden. This report highlights the importance of better methods for estimating the recurrence interval of extreme events in a changing climate. This is particularly important for infrastructure design and risk reduction.
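The stationary-versus-non-stationary contrast can be illustrated with a deliberately simplified sketch: a Gumbel fit (rather than the study's Bayesian GEV inference in NEVA) with a least-squares linear trend in the location parameter, applied to synthetic maxima that contain a built-in upward trend. All series and parameter values are hypothetical.

```python
import math

def gumbel_fit_moments(xs):
    """Method-of-moments Gumbel fit: scale from std dev, location from mean."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
    beta = s * math.sqrt(6.0) / math.pi
    return m - 0.5772 * beta, beta   # (location, scale)

def gumbel_quantile(mu, beta, F):
    return mu - beta * math.log(-math.log(F))

# Hypothetical annual maxima with an upward trend in the location;
# deterministic stratified Gumbel noise keeps the example reproducible.
years = list(range(100))
noise = [-math.log(-math.log(((37 * t) % 100 + 0.5) / 100)) for t in years]
maxima = [30.0 + 0.06 * t + 1.0 * noise[t] for t in years]

# Stationary view: one Gumbel fit to the whole record.
mu_s, beta_s = gumbel_fit_moments(maxima)
z_stationary = gumbel_quantile(mu_s, beta_s, 1.0 - 1.0 / 20.0)  # 20-year level

# Non-stationary view: least-squares linear trend in the location,
# Gumbel fit to the detrended residuals, evaluated at the last year.
tbar = sum(years) / len(years)
xbar = sum(maxima) / len(maxima)
m1 = (sum((t - tbar) * (x - xbar) for t, x in zip(years, maxima))
      / sum((t - tbar) ** 2 for t in years))
m0 = xbar - m1 * tbar
resid = [x - (m0 + m1 * t) for t, x in zip(years, maxima)]
mu_r, beta_r = gumbel_fit_moments(resid)
z_nonstationary = m0 + m1 * years[-1] + gumbel_quantile(mu_r, beta_r, 1.0 - 1.0 / 20.0)
```

Because the stationary fit averages over the whole record, its return level lags behind current conditions at the end of an upward-trending series — the underestimation mechanism the study quantifies.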
