11

Determinação do índice de disponibilidade de umidade para a Região Oeste do Paraná / Determination of the moisture availability index for the western region of Paraná

Maggi, Cacea Furlan 21 February 2006 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / The objective of this study was to compare probabilistic models (the Gamma and Lognormal distributions and the generalized extreme value (GEV) distribution) for reference evapotranspiration (ETo), calculated with the Camargo method, in the western region of Paraná, and to determine the moisture availability index (IDU). Climate data were obtained from IAPAR (Instituto Agronômico do Paraná) and SIMEPAR (Sistema Meteorológico do Paraná). Monthly mean temperature series were used for stations located between latitudes 24º17'00" and 25º27'00" S and longitudes 53º07'00" and 54º24'00" W of Greenwich, with record lengths of 6 to 32 years. The monthly mean temperatures were used to calculate ETo, and the probabilistic models were then fitted to the monthly ETo values at the 75% occurrence level. The Gamma and Lognormal fits were validated with the Kolmogorov-Smirnov goodness-of-fit test at the 5% significance level; the quality of the GEV fits was assessed with the Wang test, also at the 5% level. The resulting P75 values were used to compute the moisture availability index. The Gamma and Lognormal models behaved similarly in estimating ETo: all 144 series studied were accepted by the Kolmogorov-Smirnov test, even for stations with few years of record, so both models fit the ETo data adequately. For the GEV model, 22 of the 144 series evaluated were rejected by the Wang test at the 5% level. The moisture availability index was analysed against the reference range 0.33 ≤ IDU ≤ 1.33, with values below 0.33 indicating water deficit and values above 1.33 indicating water excess. June showed the highest IDU values and the largest number of stations with water excess: of the 12 stations evaluated, 8 showed excess in that month. The highest IDU values occurred in neighbouring areas, São Miguel do Iguaçu and Foz do Iguaçu. None of the series evaluated showed water deficit.
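The distribution-fitting step described above can be sketched as follows. This is a minimal illustration under assumed inputs, not the thesis's own code: the synthetic monthly ETo values, the use of scipy, and the fixed zero location parameters are assumptions, and whether P75 denotes the 75th percentile or the value exceeded in 75% of years depends on the thesis's definition.

```python
import numpy as np
from scipy import stats

# Hypothetical monthly ETo values (mm) for one station and one calendar month.
eto = np.array([92.1, 101.4, 88.7, 110.3, 97.6, 105.2, 93.8, 99.0,
                108.9, 95.4, 102.7, 90.2])

# Fit Gamma and Lognormal distributions (location fixed at zero, an assumption).
ga, gloc, gscale = stats.gamma.fit(eto, floc=0)
ls, lloc, lscale = stats.lognorm.fit(eto, floc=0)

# Kolmogorov-Smirnov goodness-of-fit test at the 5% significance level.
for name, dist, args in [("Gamma", "gamma", (ga, gloc, gscale)),
                         ("Lognormal", "lognorm", (ls, lloc, lscale))]:
    d, p = stats.kstest(eto, dist, args=args)
    print(f"{name}: D = {d:.3f}, p = {p:.3f}, "
          f"{'accepted' if p > 0.05 else 'rejected'} at 5%")

# ETo associated with the 75% occurrence level (P75) under the Gamma model.
p75 = stats.gamma.ppf(0.75, ga, loc=gloc, scale=gscale)
print(f"P75 (Gamma): {p75:.1f} mm")
```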
12

[en] VALUE AT RISK: A COMPARISON OF METHODS TO CHOOSE THE SAMPLE FRACTION IN TAIL INDEX ESTIMATION OF GENERALIZED EXTREME VALUE DISTRIBUTION / [pt] VALOR EM RISCO: UMA COMPARAÇÃO ENTRE MÉTODOS DE ESCOLHA DA FRAÇÃO AMOSTRAL NA ESTIMAÇÃO DO ÍNDICE DE CAUDA DE DISTRIBUIÇÕES GEV

CHRISTIAM MIGUEL GONZALES CHAVEZ 28 August 2002 (has links)
[en] Value at Risk (VaR) is now part of the standard toolkit that financial analysts use to assess market risk. Implementing VaR requires estimating low-probability quantiles of the conditional distribution of portfolio returns. The traditional methodology combines a GARCH-type model with a conditional normal distribution. The assumption of conditional normality, however, is not always adequate, especially when VaR must be estimated in atypical periods characterized by extreme events; in these situations the conditional distribution must exhibit excess kurtosis. Distributions derived from Extreme Value Theory (EVT), collectively known as the Generalized Extreme Value (GEV) family, combined with GARCH-type models, make it possible to compute VaR in this setting. A key parameter of the GEV family is the tail index, which can be estimated with Hill's estimator; this estimator is, however, very sensitive, in terms of bias and variance, to the sample fraction used in its estimation. The main objective of this dissertation is to compare three methods of choosing the sample fraction recently suggested in the literature: the double bootstrap method (Danielsson, de Haan, Peng and de Vries, 1999), the threshold method (Guillou and Hall, 2001) and the alternative Hill plot (Drees, de Haan and Resnick, 2000). The methods were evaluated with the conditional coverage test of Christoffersen (1998), applied to the return series of the NASDAQ, NIKKEI, MERVAL and IBOVESPA indices. Our results indicate that the three methods perform roughly equally well, with a slight advantage of the double bootstrap and threshold methods over the alternative Hill plot, since the latter has a normative component in the determination of the optimal tail index.
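As a rough illustration of the sample-fraction sensitivity discussed above, a minimal Hill estimator is sketched below; the synthetic Pareto losses and the use of numpy are assumptions, and none of the three selection methods compared in the dissertation is implemented here.

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the extreme value index gamma (= 1/alpha) from the k largest observations."""
    xs = np.sort(x)[::-1]                   # descending order statistics
    logs = np.log(xs[:k]) - np.log(xs[k])   # log-excesses over the (k+1)-th largest value
    return logs.mean()

rng = np.random.default_rng(0)
# Hypothetical heavy-tailed losses: classical Pareto with alpha = 3, i.e. gamma = 1/3.
losses = rng.pareto(3.0, size=2000) + 1.0

# The estimate drifts as the sample fraction k/n changes: the choice of k is exactly
# what the double bootstrap, threshold and alternative Hill plot methods try to automate.
for k in (20, 50, 100, 200, 500):
    print(k, round(hill_estimator(losses, k), 3))
```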
13

Développement d'un modèle statistique non stationnaire et régional pour les précipitations extrêmes simulées par un modèle numérique de climat / A non-stationary and regional statistical model for the precipitation extremes simulated by a climate model

Jalbert, Jonathan 30 October 2015 (has links)
Floods are the predominant natural hazard worldwide and cause the greatest damage among natural disasters. One of the main factors behind flooding is extreme precipitation, whose occurrence and intensity are very likely to increase because of climate change, so the flood risk itself may well intensify. The evolution of extreme precipitation is therefore an important issue for public safety and for the durability of infrastructure. Flood risk management strategies for the future climate are essentially based on simulations from numerical climate models. A climate model provides, among other things, a precipitation time series for each grid point of its spatial simulation domain; the simulated series may be daily or sub-daily and span the whole simulation period, typically 1961 to 2100. The spatial continuity of the simulated physical processes induces spatial coherence among the series: series from neighbouring grid points often share similar characteristics. Extreme value theory is generally applied to these simulated series to estimate the quantiles corresponding to a given risk level, such as the T-year return levels. Most of the time the estimation variance is considerable, notably because of the limited number of available precipitation extremes, and it can play a decisive role in the design of risk management strategies. It is therefore relevant to reduce the estimation variance of simulated return levels. To this end, this thesis develops a non-stationary and regional statistical model, especially suited to climate model output, for estimating precipitation extremes. The model exploits the information contained in the continuous daily series to improve the estimation of non-stationary quantiles without imposing restrictive assumptions on the nature of the non-stationarity (the non-stationarity is first handled by a preprocessing step), and it exploits the spatial coherence of precipitation extremes, which is modelled with a Bayesian hierarchical model whose prior distributions on the parameters are spatial processes, namely intrinsic Gaussian Markov random fields. Applying the model to a simulation from the Canadian Regional Climate Model substantially reduced the estimation variance of the 100-year return levels over North America.
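For reference, the T-year return level mentioned above is a quantile of the generalized extreme value (GEV) distribution fitted to block maxima. A standard textbook form is given below; it is not taken from the thesis, which builds a Bayesian hierarchical model on top of this basic setup.

```latex
% GEV distribution function for block maxima, with location \mu, scale \sigma > 0 and shape \xi
G(z) = \exp\left\{-\left[1 + \xi\,\frac{z-\mu}{\sigma}\right]^{-1/\xi}\right\},
\qquad 1 + \xi\,\frac{z-\mu}{\sigma} > 0 .

% T-year return level: the level exceeded on average once every T years (annual maxima),
% obtained by inverting G at probability 1 - 1/T (case \xi \neq 0)
z_T = \mu - \frac{\sigma}{\xi}\left[1 - \left(-\log\!\left(1 - \tfrac{1}{T}\right)\right)^{-\xi}\right].
```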
14

Extreme value analysis of non-stationary time series: Quantifying climate change using observational data throughout Germany

Müller, Philipp 11 March 2019 (has links)
The overall subject of this thesis is the massively parallel application of extreme value analysis (EVA) to climatological time series. This branch of statistics is concerned with the tails of a distribution and its upper quantiles, such as the so-called 50-year return level, an event realized on average only once during its return period of 50 years. Since most studies focus on average statistics, while it is the extreme events that have the biggest impact on our lives, such an analysis is key to a proper understanding of climate change. In EVA a time series is separated into blocks, whose maxima can be described by the generalized extreme value (GEV) distribution for sufficiently large block sizes. Unfortunately, estimating its parameters on a massively parallel scale is not possible with any available software package, since they are all affected by conceptual problems in the maximum likelihood fit: both the logarithms in the negative log-likelihood of the GEV distribution and the theoretical limitations on one of its parameters give rise to regions of the parameter space that are inaccessible to the optimization routines, causing them to produce numerical artifacts. I resolved this issue by incorporating all constraints into the optimization using the augmented Lagrangian method. With my implementation in the open-source package **climex** it is now possible to analyze large climatological data sets. In this thesis I used temperature and precipitation data from measurement stations provided by the German weather service (DWD) and the ERA-Interim reanalysis data set, and analyzed them using both a qualitative method based on time windows and a more quantitative one relying on the class of vector generalized linear models (VGLM). Because of climate change, a general shift of the temperature distribution towards higher values, and thus more hot and fewer cold extremes, would be expected. Indeed, I found the location parameters of the GEV distributions, which can be thought of as the mean event size at a return period of approximately the block size of one year, to increase for both the daily maximum and minimum temperatures. But the overall changes are far more complex and depend on the geographical location as well as the considered return period, which is quite unexpected. For example, for the 100-year return levels of the daily maximum temperatures a decrease was found in the east and the center of Germany for both the raw series and their anomalies, as well as a quite strong reduction for the raw series in the very south of Germany. The VGLM-based non-stationary EVA resulted in significant trends in the GEV parameters for the daily maximum temperatures of almost all stations, and for about half of the stations in the case of the daily minima. So there is statistically sound evidence for a change in the extreme temperatures and, surprisingly, it is not exclusively towards higher values: the analysis yielded several significant trends featuring a negative slope in the 10-year return levels. The analysis of the temperature data of the ERA-Interim reanalysis data set also yielded quite surprising results. While in some parts of the globe, especially on land, the 10-year return levels were found to increase, they in general decrease in most parts of the earth and almost entirely over the sea. But since we found a huge discrepancy between the results of the analysis using the station data within Germany and the results obtained for the corresponding grid points of the reanalysis data set, we cannot be sure whether the patterns in the return levels of the ERA-Interim data are trustworthy.
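The support constraint referred to above can be made explicit in the GEV negative log-likelihood. The sketch below fits annual block maxima under that constraint; it uses scipy's SLSQP solver rather than the augmented Lagrangian method implemented in climex, and the synthetic maxima and starting values are assumptions of this illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def gev_nll(params, x):
    """Negative log-likelihood of the GEV distribution (shape xi != 0)."""
    mu, sigma, xi = params
    t = 1.0 + xi * (x - mu) / sigma      # must stay positive on the whole sample
    if sigma <= 0 or np.any(t <= 0):
        return np.inf                    # outside the support: reject
    return (len(x) * np.log(sigma)
            + (1.0 + 1.0 / xi) * np.sum(np.log(t))
            + np.sum(t ** (-1.0 / xi)))

rng = np.random.default_rng(1)
# Synthetic annual maxima; scipy's shape c corresponds to -xi in the convention used above.
maxima = genextreme.rvs(c=-0.1, loc=30.0, scale=3.0, size=60, random_state=rng)

start = np.array([np.mean(maxima), np.std(maxima), 0.1])
constraints = [
    {"type": "ineq", "fun": lambda p: p[1] - 1e-6},                          # sigma > 0
    {"type": "ineq", "fun": lambda p: 1.0 + p[2] * (maxima - p[0]) / p[1]},  # support constraint
]
res = minimize(gev_nll, start, args=(maxima,), method="SLSQP",
               constraints=constraints)
mu, sigma, xi = res.x

# 50-year return level implied by the fitted parameters.
T = 50
z_T = mu - sigma / xi * (1.0 - (-np.log(1.0 - 1.0 / T)) ** (-xi))
print(res.x, z_T)
```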
15

A search for scalar electrons and muons using the DELPHI detector at LEP2

Hughes, Gareth James January 2000 (has links)
No description available.
16

Multi-strange hyperon production in relativistic heavy-ion collisions

Barton, Robert Allan January 2001 (has links)
No description available.
17

非齊質變異下尾端風險的衡量 / Measuring tail risk under heteroscedasticity

陳俊宏 Unknown Date (has links)
Thesis title: Measuring tail risk under heteroscedasticity. Department: Graduate Institute of International Trade, National Chengchi University. Advisor: Dr. 饒秀華. Graduate student: 陳俊宏. Keywords: Value at Risk, extreme value theory, fat tails, GPD, GEV, Hill, GARCH models. Abstract: After Taiwan's accession to the World Trade Organization (WTO), domestic firms moved a step further towards internationalization and liberalization, and banks, importers and exporters deal with exchange rates ever more frequently, so exchange rate risk cannot be avoided. This study uses the spot exchange rate of the US dollar against the New Taiwan dollar and compares the basic historical simulation and variance-covariance methods with methods based on extreme value theory, asking whether differences exist between them; it also compares the commonly used unconditional models with conditional GARCH models, to see whether conditional models estimate Value at Risk more accurately and whether conditional models remain applicable when estimating multi-day Value at Risk. The empirical results show that, overall, for one-day VaR the conditional models do perform better than the unconditional models. For one-day VaR, conditional extreme value theory gives better results than the conditional variance-covariance method or historical simulation. For multi-day VaR, however, the conditional extreme value model that performed best for one-day VaR no longer performs well, possibly because the GARCH model cannot keep producing the best estimates as the horizon lengthens; in that case, estimating VaR with the unconditional Hill model using the λ_t approach may already achieve good results.
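A minimal sketch of the conditional (GARCH-filtered) one-day VaR idea compared in the thesis might look as follows; the arch package, the normal innovations, and the simulated returns are assumptions of this illustration, and the thesis additionally considers EVT-based tails (GPD, GEV, Hill) rather than only the normal quantile used here.

```python
import numpy as np
from arch import arch_model
from scipy.stats import norm

rng = np.random.default_rng(2)
# Hypothetical daily USD/TWD log-returns in percent.
returns = rng.normal(0.0, 0.5, size=1500)

# Conditional model: GARCH(1,1) with normal innovations and a constant mean.
am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1, dist="normal")
res = am.fit(disp="off")

# One-day-ahead conditional mean and volatility.
fc = res.forecast(horizon=1)
mu_1 = fc.mean.values[-1, 0]
sigma_1 = np.sqrt(fc.variance.values[-1, 0])

# One-day 99% VaR as the 1% conditional quantile (a loss, reported as a positive number).
var_99 = -(mu_1 + sigma_1 * norm.ppf(0.01))

# Unconditional comparison: historical simulation from the empirical 1% quantile.
var_hist = -np.quantile(returns, 0.01)
print(round(var_99, 3), round(var_hist, 3))
```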
18

Cross Section of $b\bar{b}$ Production in p+p Collisions at $\sqrt{s}$=500 GeV Using Like-Sign Dimuons at PHENIX

Patel, Laura B 01 August 2013 (has links)
Lepton pairs resulting from the decay of heavy flavor mesons are an important tool to probe the hot and dense matter created in nucleus-nucleus collisions at the Relativistic Heavy Ion Collider. Due to their large mass, heavy quarks are produced in the earliest stages of the collision and will, therefore, experience the full evolution of the system. The yield of heavy flavor mesons can be measured through their semi-leptonic decay channel by constructing like-sign and unlike-sign lepton pairs. Cross section measurements in p+p collisions provide a test of perturbative quantum chromodynamics (pQCD) theory in addition to a crucial baseline measurement to study the hot and cold nuclear matter effects present in heavy ion collisions. For the first time, the $b\bar{b}$ cross section in p+p collisions at √s = 500 GeV is measured. The results are based on the yield of high-mass, like-sign dimuons measured in the PHENIX muon arm acceptance (1.2 < |y| < 2.2). The extrapolated total cross section is 25.2 ± 3.2 (stat) +11.4/−9.5 (sys) µb. The cross section is comparable to pQCD calculations within uncertainties.
19

A $J/\psi$ Polarization Measurement with the PHENIX Muon Arms in Proton+Proton Collisions at a Center-of-Mass Energy of 200 GeV at RHIC

Qu, Hai 20 November 2008 (has links)
A measurement of $J/\psi$ polarization has been performed for 200 GeV proton+proton collisions with the PHENIX Muon Arms at RHIC. The results from the current data show no polarization within the PHENIX acceptance range. The results are consistent with current model predictions and other experimental measurements.
20

Cross Section of Bottom Quark Production in p+p Collisions at √s = 500 GeV Using Like-Sign Dimuons at PHENIX

Patel, Laura B. 01 August 2013 (has links)
Lepton pairs resulting from the decay of heavy flavor mesons are an important tool to probe the hot and dense matter created in nucleus-nucleus collisions at the Relativistic Heavy Ion Collider. Due to their large mass, heavy quarks are produced in the earliest stages of the collision and will, therefore, experience the full evolution of the system. The yield of heavy flavor mesons can be measured through their semi-leptonic decay channel by constructing like-sign and unlike-sign lepton pairs. Cross section measurements in p+p collisions provide a test of perturbative quantum chromodynamics (pQCD) theory in addition to a crucial baseline measurement to study the hot and cold nuclear matter effects present in heavy ion collisions. For the first time, the $b\bar{b}$ cross section in p+p collisions at √s = 500 GeV is measured. The results are based on the yield of high-mass, like-sign dimuons measured in the PHENIX muon arm acceptance (1.2 < |y| < 2.2). The extrapolated total cross section is 25.2 ± 3.2 (stat) +11.4/−9.5 (sys) µb. The cross section is comparable to pQCD calculations within uncertainties.
