91

Kvantifikace vícerozměrných rizik / Quantification of multivariate risk

Hilbert, Hynek January 2013 (has links)
In the present work we study multivariate extreme value theory. Our main focus is on exceedances over linear thresholds; a smaller part is devoted to exceedances over elliptical thresholds. We consider extreme values to be those belonging to remote regions and investigate the convergence of their distribution to the limit distribution. The regions are either halfspaces or ellipsoids. Working with halfspaces we distinguish between two setups: we either assume that the distribution of extreme values is directionally homogeneous and let the halfspaces diverge in any direction, or we assume that there are irregularities in the sample cloud which indicate the fixed direction in which we should let the halfspaces drift out. In the first case there are three limit laws, and the domains of attraction contain unimodal and rotund-exponential distributions. In the second case there exist many limit laws without a general form, and the domains of attraction also fail to have a common structure. A similar situation occurs for exceedances over elliptical thresholds, where the task is to investigate the convergence of random vectors living in the complements of ellipsoids. In all cases, the limit distributions are determined by affine transformations and the distribution of the spectral measure.
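The halfspace setup above can be illustrated with a small simulation; the bivariate Gaussian cloud (an elliptical, unimodal law), the direction w, and the 0.999 cutoff are illustrative assumptions, not choices made in the thesis.

```python
import numpy as np

rng = np.random.default_rng(9)

# Sample cloud from a bivariate Gaussian -- an elliptical, unimodal law.
n = 100_000
cov = np.array([[1.0, 0.5], [0.5, 2.0]])
pts = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Exceedances over a linear threshold: points in the halfspace {x : <x, w> > t}.
w = np.array([1.0, 1.0]) / np.sqrt(2.0)   # direction in which the halfspace drifts out
proj = pts @ w                            # projecting reduces the problem to one dimension
t = np.quantile(proj, 0.999)
exceedances = pts[proj > t]

print(exceedances.shape, round(t, 2))
```

For a Gaussian cloud the projection is itself Gaussian, so the halfspace exceedances reduce to univariate exceedances of a normal variable, which is what makes the limit laws tractable in this case.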
92

Modelování operačního rizika / Operational risk modelling

Mináriková, Eva January 2013 (has links)
In the present thesis we first familiarize ourselves with the term operational risk and its definition in the Basel II and Solvency II directives, and then with the methods these directives set out for calculating capital requirements for operational risk. In the second part of the thesis we concentrate on methods for modelling operational loss data. We introduce Extreme Value Theory, which describes possible approaches to modelling data with significant values that occur infrequently, the typical characteristic of operational risk data. We mainly focus on the model for threshold exceedances, which uses the Generalized Pareto Distribution to model the distribution of the excesses. The theoretical knowledge of this theory and the appropriate modelling are applied to simulated loss data. Finally we test the ability of the presented methods to model loss data distributions.
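The threshold-exceedance model with a Generalized Pareto Distribution can be sketched as follows; the lognormal losses, the sample size, and the 95% threshold are invented stand-ins for the simulated loss data used in the thesis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated heavy-tailed operational losses (lognormal is a common stand-in).
losses = rng.lognormal(mean=10.0, sigma=2.0, size=5000)

# Peaks-over-threshold: keep the excesses above a high empirical quantile.
threshold = np.quantile(losses, 0.95)
excesses = losses[losses > threshold] - threshold

# Fit a Generalized Pareto Distribution to the excesses
# (location fixed at 0, as is standard for threshold excesses).
shape, loc, scale = stats.genpareto.fit(excesses, floc=0)
print(f"n_exc={len(excesses)}  shape={shape:.3f}  scale={scale:.1f}")
```

Fixing the location at zero (`floc=0`) reflects that the GPD models excesses over the threshold, not the losses themselves.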
93

Étude de l'application de la théorie des valeurs extrêmes pour l'estimation fiable et robuste du pire temps d'exécution probabiliste / Study of the extreme value theory applicability for reliable and robust probabilistic worst-case execution time estimates

Guet, Fabrice 13 December 2017 (has links)
Software tasks are time-constrained in real-time computing systems. To ensure the safety of the critical system controlled by the real-time system, it is of paramount importance to safely estimate the worst-case execution time of each task. Modern commercial processors reduce the average task execution time, but their optimisation components make the worst-case execution time hard to determine. Many approaches for estimating a task's worst-case execution time exist, but they are usually segregated and hardly generalizable, or come at the cost of very complex models. Existing measurement-based probabilistic timing analysis approaches are seen as fast and easy to apply, but they suffer from a lack of systematism and of confidence in the estimates they provide. This thesis studies the conditions under which extreme value theory can be applied to a sequence of execution time measurements to estimate the probabilistic worst-case execution time, leading to the development of the diagxtrm tool. The capabilities and limits of the tool are highlighted using a large panel of measurement sequences from different real-time systems. Finally, methods are proposed for determining the measurement conditions that favour the application of the theory and give more confidence in the estimates.
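A minimal sketch of the measurement-based approach (not diagxtrm itself, whose diagnostics are more involved) fits a Generalized Pareto tail to execution-time exceedances and inverts it at a target exceedance probability; the gamma-distributed timings and the probability 1e-6 are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated execution-time measurements (microseconds): bulk plus a slow tail.
times = rng.gamma(shape=9.0, scale=10.0, size=20_000)

u = np.quantile(times, 0.99)              # high threshold
exc = times[times > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0)

# pWCET at exceedance probability p, inverting the POT tail estimator
# P(T > u + y) ~ (n_u / n) * (1 + xi * y / beta)^(-1/xi).
p = 1e-6
n, n_u = len(times), len(exc)
pwcet = u + (beta / xi) * ((p * n / n_u) ** (-xi) - 1.0)
print(f"pWCET at p={p:g}: {pwcet:.1f} us (threshold u={u:.1f})")
```

The conditions studied in the thesis (e.g. independence and stationarity of the measurements) govern whether such an extrapolation is trustworthy at all.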
94

Modelling heavy rainfall over time and space

Khuluse, Sibusisiwe Audrey 06 June 2011 (has links)
Extreme Value Theory finds application in problems concerning low-probability but high-consequence events. In hydrology the study of heavy rainfall is important in regional flood risk assessment. In particular, the N-year return level is a key output of an extreme value analysis, hence care needs to be taken to ensure that the model is accurate and that the level of imprecision in the parameter estimates is made explicit. Rainfall is a process that evolves over time and space. Therefore, it is anticipated that at extreme levels the process would continue to show temporal and spatial correlation. In this study interest is in whether any trends in heavy rainfall can be detected for the Western Cape. The focus is on obtaining the 50-year daily winter rainfall return level and investigating whether this quantity is homogeneous over the study area. The study is carried out in two stages. In the first stage, the point process approach to extreme value theory is applied to arrive at return level estimates at each of the fifteen sites. Stationarity is assumed for the series at each station, thus an issue to deal with is the short-range temporal correlation of threshold exceedances. The proportion of exceedances is found to be smaller (approximately 0.01) for stations towards the east such as Jonkersberg, Plettenbergbay and Tygerhoek. This can be attributed to rainfall values being mostly low, with few instances where large amounts of rainfall were observed. Looking at the parameters of the point process extreme value model, the location parameter estimate appears stable over the region, in contrast to the scale parameter estimate, which increases in a south-easterly direction. While the model is shown to fit the exceedances at each station adequately, the degree of uncertainty is large for stations such as Tygerhoek, where the maximum observed rainfall value is approximately twice as large as the other high rainfall values.
This situation was also observed at other stations, and in such cases removal of these high rainfall values was avoided to minimize the risk of obtaining inaccurate return level estimates. The key result is an N-year rainfall return level estimate at each site. Interest is in mapping an estimate of the 50-year daily winter rainfall return level; however, to evaluate the adequacy of the model at each site the 25-year return level is considered, since a 25-year return period is well within the range of the observed data. The 25-year daily winter rainfall return level estimate for Ladismith is the smallest at 22.42 mm. This can be attributed to the station's generally low observed winter rainfall values. In contrast, the return level estimate for Tygerhoek is high, at 119.16 mm almost six times larger than that of Ladismith. Visually the design values show differences between sites, therefore it is of interest to investigate whether these differences can be modelled. The second stage is the geostatistical analysis of the 50-year 24-hour rainfall return level. The aim here is to quantify the degree of spatial variation in the 50-year 24-hour rainfall return level estimates and to use that association to predict values at unobserved sites within the study region. A tool for quantifying spatial variation is the variogram model. Estimation of the parameters of this model requires a sufficiently large sample, which is a challenge in this study since there are only fifteen stations and therefore only fifteen observations for the geostatistical analysis. To address this challenge, observations are expanded in space and time and then standardized to create a larger pool of data from which the variogram is estimated. The obtained estimates are used in ordinary and universal kriging to derive the 50-year 24-hour winter rainfall return level maps.
It is shown that the 50-year daily winter design rainfall over most of the Western Cape lies between 40 mm and 80 mm, but rises sharply as one moves towards the east coast of the region, largely due to the influence of the large design values obtained for Tygerhoek. In ordinary kriging, prediction uncertainty is lowest around observed values and grows with distance from those points. Overall, prediction uncertainty maps show that ordinary kriging performs better than universal kriging, where a linear regional trend in design values is included.
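The return-level computation in the first stage can be sketched with a peaks-over-threshold fit (a simplification of the point process approach used in the study); the gamma-distributed rainfall amounts, the 40 winters of 92 days, and the 95% threshold are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated daily winter rainfall (mm) for 40 winters of 92 days each.
n_years, days = 40, 92
rain = rng.gamma(shape=0.6, scale=8.0, size=n_years * days)

u = np.quantile(rain, 0.95)
exc = rain[rain > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
zeta = len(exc) / len(rain)               # daily exceedance rate

def return_level(n):
    """Level exceeded on average once every n years."""
    m = n * days                          # expected number of winter days in n years
    return u + (sigma / xi) * ((m * zeta) ** xi - 1.0)

print(f"25-year: {return_level(25):.1f} mm   50-year: {return_level(50):.1f} mm")
```

Checking the 25-year level against the observed record, as the study does, is a sensible validation since the 50-year level lies beyond the data.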
95

Value at Risk no mercado financeiro internacional: avaliação da performance dos modelos nos países desenvolvidos e emergentes / Value at Risk in international finance: evaluation of the models performance in developed and emerging countries

Gaio, Luiz Eduardo 01 April 2015 (has links)
Given the requirements stipulated by regulatory agencies in international agreements, and in view of the numerous financial crises of recent centuries, financial institutions have developed several tools to measure and control the risk inherent in their business. Despite the ongoing evolution of risk calculation and measurement methodologies, Value at Risk (VaR) has become the reference tool for estimating market risk. In recent years new techniques for calculating Value at Risk (VaR) have been developed, but none has been accepted as the one that best fits the risks of different markets at different times; the literature offers no concise model consistent with the diversity of markets. Thus, this work aims to evaluate the market risk estimators generated by models based on Value at Risk (VaR), applied to the indices of the major stock exchanges of developed and emerging countries, in both normal and financial-crisis periods, in order to identify the most effective ones for this role. The study considered unconditional VaR models, both traditional (Historical Simulation, Delta-Normal and Student's t) and based on Extreme Value Theory; conditional VaR, comparing ARCH-family models and RiskMetrics; and multivariate VaR, with bivariate GARCH models (VECH, BEKK and CCC), copula functions (Student's t, Clayton, Frank and Gumbel) and Artificial Neural Networks. The database consists of daily returns of the main stock indices of developed countries (Germany, United States, France, United Kingdom and Japan) and emerging countries (Brazil, Russia, India, China and South Africa) from 1995 to 2013, covering the crises of 1997 and 2008. The results were somewhat different from the initial premises established by the research hypotheses. Across the more than one thousand models fitted, conditional models were superior to unconditional ones in the majority of cases. In particular the GARCH(1,1) model, traditional in the literature, fitted adequately in 93% of the cases. For the multivariate analysis it was not possible to single out a best model: the VECH, BEKK and Clayton copula models had similar performance, with good fits in 100% of the tests. Contrary to expectations, no significant differences could be seen between the fits for developed and emerging countries, or between crisis and normal periods. The study contributes the insight that the models used by financial institutions are not those that perform best in estimating market risk, even though they are recommended by renowned institutions. A deeper analysis of the performance of risk estimators, using simulations with the portfolios of each financial institution, is warranted.
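The unconditional EVT branch of such a comparison can be sketched with the peaks-over-threshold estimators of VaR and Expected Shortfall (in the style of McNeil and Frey); the Student-t returns and the 95% threshold are illustrative assumptions, not the thesis's actual index data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Simulated fat-tailed daily returns; risk is measured on the loss side.
losses = -stats.t.rvs(df=3, size=5000, random_state=rng) * 0.01

u = np.quantile(losses, 0.95)
exc = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0)
n, n_u = len(losses), len(exc)

def var_es(alpha):
    """POT estimators of Value at Risk and Expected Shortfall at level alpha."""
    var = u + (beta / xi) * (((1 - alpha) * n / n_u) ** (-xi) - 1.0)
    es = var / (1 - xi) + (beta - xi * u) / (1 - xi)
    return var, es

var99, es99 = var_es(0.99)
print(f"VaR 99%: {var99:.4f}   ES 99%: {es99:.4f}")
```

Backtesting these estimates against realized exceedance counts is what distinguishes the effective models in a study like this one.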
96

Estimation de mesures de risque pour des distributions elliptiques conditionnées / Estimation of risk measures for conditioned elliptical distributions

Usseglio-Carleve, Antoine 26 June 2018 (has links)
This PhD thesis focuses on the estimation of some risk measures for a real random variable Y in the presence of a covariate vector X. For that purpose, we consider that the random vector (X,Y) is elliptically distributed. We first deal with the quantiles of Y given X=x, investigating a quantile regression model that is widespread in the literature, for which we obtain and discuss theoretical results. Such a model has limitations, especially when the quantile level is extreme, so we propose another, better-adapted approach. Asymptotic results are given, illustrated by a simulation study and a real-data example. In a second chapter, we focus on another risk measure called the expectile. The structure of the chapter is essentially the same as that of the previous one: we first use a regression model that is not adapted to extreme expectiles, for which a methodological and statistical approach is proposed. Furthermore, by highlighting the link between extreme quantiles and expectiles, we show that other extreme risk measures are closely related to extreme quantiles. We focus on two families, the Lp-quantiles and the Haezendonck-Goovaerts risk measures, for which we propose extreme estimators; a simulation study is also provided. Finally, the last chapter is devoted to the case where the covariate vector X is high-dimensional. Noticing that our previous estimators perform poorly in this case, we draw on high-dimensional estimation methods to propose other estimators, whose performance is assessed in a simulation study.
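An expectile, the risk measure studied in the second chapter, minimizes an asymmetrically weighted squared loss; a minimal sample version can be computed by a fixed-point iteration on its first-order condition. The Gaussian sample and the levels used below are illustrative assumptions.

```python
import numpy as np

def expectile(x, tau, max_iter=200, tol=1e-10):
    """Sample tau-expectile: fixed point of the asymmetric least-squares
    first-order condition sum(w * (x - e)) = 0, w = tau if x > e else 1 - tau."""
    e = float(np.mean(x))                 # the 0.5-expectile is the mean
    for _ in range(max_iter):
        w = np.where(x > e, tau, 1.0 - tau)
        e_new = float(np.sum(w * x) / np.sum(w))
        if abs(e_new - e) < tol:
            return e_new
        e = e_new
    return e

rng = np.random.default_rng(3)
x = rng.standard_normal(100_000)
print(f"e(0.5)={expectile(x, 0.5):.3f}   e(0.99)={expectile(x, 0.99):.3f}")
```

Unlike a quantile, an expectile depends on the whole distribution, which is one reason extreme expectile estimation requires the dedicated approach the thesis develops.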
97

Estimation des limites d'extrapolation par les lois de valeurs extrêmes. Application à des données environnementales / Estimation of extrapolation limits based on extreme-value distributions. Application to environmental data.

Albert, Clément 17 December 2018 (has links)
This thesis is set in the extreme value statistics framework and makes three main contributions. Extreme quantile estimation is a two-step approach: first, an extreme-value-based quantile approximation is proposed; then, estimators of the unknown quantities are plugged into this approximation, leading to an extreme quantile estimator. This decomposition produces two errors of different natures: a systematic model error, called the approximation or extrapolation error, and a random estimation error. The first contribution of this thesis is the theoretical study of the poorly understood extrapolation error. These investigations are carried out for two different kinds of estimators, both based on the well-known Generalized Pareto approximation: the Exponential Tail estimator, dedicated to the Gumbel maximum domain of attraction, and the Weissman estimator, dedicated to the Fréchet one. It is shown that the extrapolation error can be interpreted as the remainder of a first-order Taylor expansion. Necessary and sufficient conditions are then provided such that this error tends to zero as the sample size increases. Interestingly, in the case of the Exponential Tail estimator, these conditions lead to a subdivision of the Gumbel maximum domain of attraction into three subsets. In contrast, the extrapolation error associated with the Weissman estimator has a common behavior over the whole Fréchet maximum domain of attraction. First-order equivalents of the extrapolation error are then derived and their accuracy is illustrated numerically. The second contribution is the proposition of a new extreme quantile estimator. The problem is addressed in the framework of the so-called "log-Generalized Weibull tail limit", where the logarithm of the inverse cumulative hazard rate function is supposed to be of extended regular variation. Based on this model, a new estimator of extreme quantiles is proposed; its asymptotic normality is established and its behavior in practice is illustrated on both real and simulated data. The third contribution of this thesis is the proposition of new mathematical tools allowing the quantification of the extrapolation limits associated with a real dataset. To this end, we propose estimators of the extrapolation errors associated with the Exponential Tail and Weissman approximations, study their performance on simulated data, and use them on two real datasets of daily environmental measurements to show that, depending on the climatic phenomenon, the extrapolation limits can be more or less stringent.
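The Weissman extrapolation named above can be sketched as follows, with a Hill estimate of the tail index on a Pareto sample where the true quantile is known in closed form; the sample size, the number of order statistics k, and the target probability are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Pareto(alpha=2) sample: true tail index gamma = 1/alpha = 0.5.
n = 100_000
x = rng.pareto(2.0, size=n) + 1.0

xs = np.sort(x)[::-1]                     # descending order statistics
k = 1000                                  # number of top order statistics used

# Hill estimator of the tail index gamma.
gamma = np.mean(np.log(xs[:k]) - np.log(xs[k]))

# Weissman estimator: extrapolate the (1 - p)-quantile far beyond the sample.
p = 1e-6                                  # exceedance probability << 1/n
q_hat = xs[k] * (k / (n * p)) ** gamma
q_true = (1.0 / p) ** 0.5                 # exact Pareto(2) quantile
print(f"gamma_hat={gamma:.3f}   q_hat={q_hat:.0f}   q_true={q_true:.0f}")
```

The extrapolation factor (k/(np))^gamma here is 100, so small errors in the tail-index estimate are amplified; quantifying that amplification is precisely the extrapolation-limit question the thesis studies.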
98

Microstructure-sensitive extreme value probabilities of fatigue in advanced engineering alloys

Przybyla, Craig Paul 07 July 2010 (has links)
A novel microstructure-sensitive extreme value probabilistic framework is introduced to evaluate material performance and variability for damage evolution processes (e.g., fatigue, fracture, creep). This framework employs newly developed extreme value marked correlation functions (EVMCF) to identify the coupled microstructure attributes (e.g., phase/grain size, grain orientation, grain misorientation) that have the greatest statistical relevance to the extreme value response variables (e.g., stress, elastic/plastic strain) that describe the damage evolution processes of interest. This improves on previous approaches, which characterized the distributed extreme value response variables using the extreme value distribution of a single microstructure attribute and gave no consideration to how coupled microstructure attributes affect the distributions of extreme value response. The framework also utilizes computational modeling techniques to identify correlations between microstructure attributes that significantly raise or lower the magnitudes of the damage response variables of interest, through the simulation of multiple statistical volume elements (SVE). Each SVE for a given response is constructed to be a statistical sample of the entire microstructure ensemble (i.e., bulk material); therefore, the response of interest in each SVE is not expected to be the same. This is in contrast to computational simulation of a single representative volume element (RVE), which is often untenably large for response variables dependent on extreme value microstructure attributes. The framework is demonstrated in the context of characterizing microstructure-sensitive high cycle fatigue (HCF) variability due to the processes of fatigue crack formation (nucleation and microstructurally small crack growth) in polycrystalline metallic alloys.
Specifically, the framework is exercised to estimate the local driving forces for fatigue crack formation, to validate these with limited existing experiments, and to explore how the extreme value probabilities of certain fatigue indicator parameters (FIPs) affect overall variability in fatigue life in the HCF regime. Various FIPs have been introduced and used previously as a means to quantify the potential for fatigue crack formation based on experimentally observed mechanisms. Distributions of the extreme value FIPs are calculated for multiple SVEs simulated via the FEM with crystal plasticity constitutive relations. By using crystal plasticity relations, the FIPs can be computed based on the cyclic plastic strain on the scale of the individual grains. These simulated SVEs are instantiated such that they are statistically similar to real microstructures in terms of the crystallographic microstructure attributes that are hypothesized to have the most influence on the extreme value HCF response. The polycrystalline alloys considered here include the Ni-base superalloy IN100 and the Ti alloy Ti-6Al-4V. In applying this framework to study the microstructure dependent variability of HCF in these alloys, the extreme value distributions of the FIPs and associated extreme value marked correlations of crystallographic microstructure attributes are characterized. This information can then be used to rank order multiple variants of the microstructure for a specific material system for relative HCF performance or to design new microstructures hypothesized to exhibit improved performance. This framework enables limiting the (presently) large number of experiments required to characterize scatter in HCF and lends quantitative support to designing improved, fatigue-resistant materials and accelerating insertion of modified and new materials into service.
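The extreme-value step of such a framework, fitting the distribution of per-SVE extreme FIPs, can be sketched as follows; the lognormal grain-level FIP values and the SVE counts are invented stand-ins for the crystal-plasticity outputs described above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Each simulated SVE yields grain-level fatigue indicator parameters (FIPs);
# its damage response is driven by the extreme (maximum) FIP it contains.
n_sve, grains = 2000, 500
fips = rng.lognormal(mean=-6.0, sigma=0.8, size=(n_sve, grains))
extreme_fips = fips.max(axis=1)

# Fit a generalized extreme value distribution to the per-SVE maxima.
shape, loc, scale = stats.genextreme.fit(extreme_fips)
print(f"GEV shape={shape:.3f}  loc={loc:.5f}  scale={scale:.5f}")
```

The fitted extreme value distribution is what lets an ensemble of modest SVEs stand in for a single, untenably large RVE when ranking microstructure variants.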
99

金融風險測度與極值相依之應用─以台灣金融市場為例 / Measuring financial risk and extremal dependence between financial markets in Taiwan

劉宜芳 Unknown Date (has links)
This paper links two applications of Extreme Value Theory (EVT) to the analysis of Taiwanese financial markets: (1) computation of Value at Risk (VaR) and Expected Shortfall (ES), and (2) estimation of cross-market dependence under extreme events. Daily data from the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the USD/NTD foreign exchange rate are employed to analyze the behavior of each return series and the dependence structure between the foreign exchange market and the equity market. In the univariate case, EVT provides a more accurate way to estimate VaR. In the bivariate case, when measuring extremal dependence, the results for the whole sample period show that the two markets are asymptotically independent, while the subperiod analyses illustrate that the relation is slightly dependent in specific periods. Therefore, there is no significant evidence that extreme events appearing in one market (the equity market or the foreign exchange market) will affect the other in Taiwan.
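The extremal-dependence measurement can be sketched with the empirical tail-dependence coefficient chi(q) = P(V > v_q | U > u_q); the simulated independent and perfectly dependent pairs and the level q = 0.99 are illustrative assumptions, not the TAIEX/USD-NTD data.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

# Two stylized cases: independent pairs and perfectly dependent pairs.
x = rng.standard_normal(n)
y_indep = rng.standard_normal(n)
y_dep = x.copy()

def chi(u, v, q=0.99):
    """Empirical tail-dependence coefficient chi(q) = P(V > v_q | U > u_q)."""
    u_q, v_q = np.quantile(u, q), np.quantile(v, q)
    return float(np.mean(v[u > u_q] > v_q))

print(f"independent: chi={chi(x, y_indep):.3f}   dependent: chi={chi(x, y_dep):.3f}")
```

A chi(q) that stays near 1 - q as q approaches 1 points towards asymptotic independence, the conclusion the paper reaches for the two Taiwanese markets.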
100

Peak Sidelobe Level Distribution Computation for Ad Hoc Arrays using Extreme Value Theory

Krishnamurthy, Siddhartha 25 February 2014 (has links)
Extreme Value Theory (EVT) is used to analyze the peak sidelobe level distribution for array element positions with arbitrary probability distributions. Computations are discussed in the context of linear antenna arrays using electromagnetic energy. The results also apply to planar arrays of random elements that can be transformed into linear arrays. / Engineering and Applied Sciences
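The Gumbel limit underlying such peak sidelobe analyses can be sketched as follows: the maximum of many independent exponential power samples converges to a Gumbel law with location log N. The exponential sidelobe-power model and the sizes below are illustrative assumptions, not the paper's array model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Peak of N independent unit-mean exponential sidelobe power samples:
# the exponential lies in the Gumbel domain of attraction, so the peak
# is approximately Gumbel(loc = log N, scale = 1).
N, trials = 1000, 5000
peaks = rng.exponential(size=(trials, N)).max(axis=1)

loc, scale = stats.gumbel_r.fit(peaks)
print(f"loc={loc:.2f} (log N = {np.log(N):.2f})   scale={scale:.2f}")
```

For ad hoc arrays the sidelobe samples are correlated and non-identically distributed, which is why the paper needs EVT machinery beyond this i.i.d. sketch.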
