  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
101

Intervalos de confiança para altos quantis oriundos de distribuições de caudas pesadas / Confidence intervals for high quantiles from heavy-tailed distributions.

Michel Helcias Montoril 10 March 2009 (has links)
In this work, confidence intervals for high quantiles from heavy-tailed distributions are computed. Four methods are used: the normal approximation method, the likelihood ratio method, the data tilting method and the generalised gamma method. A simulation study with data generated from the Weibull distribution shows that the generalised gamma method attains coverage probabilities closest to the nominal confidence level, with smaller average interval lengths than the other three methods. For data generated from the Fréchet distribution, however, the likelihood ratio method gives the best intervals. The methods are also applied to a real data set of 1758 Brazilian fire insurance claim payments, in reais, from a group of insurers in 2003.
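The tail-extrapolation step that underlies high-quantile inference of this kind can be sketched with the classical Hill/Weissman construction. This is background, not one of the four interval methods compared in the abstract, and the sample size, the number of order statistics k, and the tail probability p below are illustrative choices:

```python
import math
import random

def hill_estimator(sample, k):
    """Hill estimator of the extreme value index gamma from the k largest observations."""
    xs = sorted(sample, reverse=True)
    return sum(math.log(xs[i] / xs[k]) for i in range(k)) / k

def weissman_quantile(sample, k, p):
    """Weissman extrapolation of the (1 - p)-quantile beyond the sample range."""
    n = len(sample)
    xs = sorted(sample, reverse=True)
    return xs[k] * (k / (n * p)) ** hill_estimator(sample, k)

random.seed(1)
# Frechet(alpha = 2) sample via inversion; its true extreme value index is gamma = 0.5
sample = [(-math.log(random.random())) ** -0.5 for _ in range(5000)]
gamma_hat = hill_estimator(sample, 200)
q999 = weissman_quantile(sample, 200, 0.001)   # estimated 99.9% quantile
```

Confidence intervals such as those compared above are then built around point estimates of this kind, with each of the four methods quantifying the estimate's uncertainty differently.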
102

Míry závislosti extrémů v časových řadách / Measures of extremal dependence in time series

Popovič, Viktor January 2017 (has links)
In the present thesis we deal with dependence among extremal values within time series. For this type of relation the commonly used autocorrelation function does not provide sufficient information; moreover, it is suited to Gaussian processes, whereas nowadays we often work with heavy-tailed time series. This thesis covers two measures of extremal dependence that are used for this type of data. The first is the coefficient of tail dependence, a measure of extremal dependence based on tail characteristics of the joint survival function. The second is the extremogram, which depends only on the extreme values in the sequence. In addition to the theoretical part, a simulation study and an application of both measures to real data, including their comparison, are performed. Results are presented together with tables and graphical output.
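The sample extremogram has a very direct empirical form. The sketch below (toy data, illustrative 95% threshold) contrasts a persistent series, whose extremes cluster, with an i.i.d. one, where exceedances at consecutive lags are unrelated:

```python
import random

def extremogram(series, lag, q=0.95):
    """Sample extremogram at a given lag: the fraction of exceedances of the
    empirical q-quantile that are followed, lag steps later, by another one."""
    u = sorted(series)[int(q * len(series))]
    pairs = [(series[t] > u, series[t + lag] > u) for t in range(len(series) - lag)]
    n_exceed = sum(a for a, _ in pairs)
    return sum(a and b for a, b in pairs) / n_exceed if n_exceed else 0.0

random.seed(0)
iid = [random.expovariate(1) for _ in range(20000)]   # no extremal clustering
ar = [0.0]
for _ in range(19999):
    ar.append(0.7 * ar[-1] + random.expovariate(1))   # persistent series: extremes cluster

e_ar = extremogram(ar, 1)
e_iid = extremogram(iid, 1)
```

For the i.i.d. series the lag-1 extremogram stays near 1 - q, while persistence pushes it well above that level.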
103

Image Alignment

Wagner, Katharina 31 May 2006 (has links)
Aligning two images by point-to-point correspondence is a hard optimization problem. It can be solved using t-Extremal Optimization or with a modification of this method called fitness threshold accepting. In this work these two methods are tested and compared to determine whether one of them should be preferred for image alignment. Since real image data are almost always noisy, the performance of the methods under conditions such as noisy and outlying data is analyzed as well.
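The extremal-optimization idea, repeatedly perturbing the worst-fitting components rather than accepting whole-state moves, can be sketched for point matching. This is a hypothetical toy setup, not the thesis's algorithm or test protocol; the power-law rank selection is only crudely approximated, and all parameters are illustrative:

```python
import random

def tau_eo_match(A, B, tau=1.5, steps=3000, seed=0):
    """EO-style sketch for point-to-point correspondence: each assignment's
    fitness is its squared match error, and a rank-biased random choice picks
    a (usually bad) assignment to perturb by swapping."""
    rng = random.Random(seed)
    n = len(A)
    err = lambda i, j: (A[i][0] - B[j][0]) ** 2 + (A[i][1] - B[j][1]) ** 2
    perm = list(range(n))
    rng.shuffle(perm)
    start_cost = sum(err(i, perm[i]) for i in range(n))
    best, best_cost = list(perm), start_cost
    for _ in range(steps):
        ranked = sorted(range(n), key=lambda i: -err(i, perm[i]))  # worst assignment first
        r = int(n * rng.random() ** tau)      # crude power-law bias toward low (bad) ranks
        i = ranked[min(r, n - 1)]
        j = rng.randrange(n)
        perm[i], perm[j] = perm[j], perm[i]   # swap the two assignments
        cost = sum(err(k, perm[k]) for k in range(n))
        if cost < best_cost:
            best_cost, best = cost, list(perm)
    return best, best_cost, start_cost

rng = random.Random(42)
A = [(rng.random(), rng.random()) for _ in range(12)]
shuffled = list(range(12))
rng.shuffle(shuffled)
B = [A[s] for s in shuffled]                  # same points, unknown order
best, best_cost, start_cost = tau_eo_match(A, B)
```

The defining EO feature is visible in the loop: instead of evaluating a global move, the heuristic preferentially breaks up the worst individual assignments.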
104

Analýza výskytu extremálních hodnot v čase a prostoru / Analysis of occurrence of extremal values in time and space

Starý, Ladislav January 2015 (has links)
This thesis describes and compares methods for the statistical modeling of spatio-temporal data. The methods are accompanied by examples and numerical studies on real-world data. The basic point of interest is the statistical analysis of spatial data with known positions in space but an unknown correlation structure. The analysis then extends to spatial data with a temporal component, i.e. spatio-temporal data. Finally, extremal values and their occurrences are discussed. The main aim of the thesis is to provide statistical tools for spatio-temporal data and for the analysis of extremal values of predictions.
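A standard first tool for spatial data with an unknown correlation structure, offered here as an illustration rather than the thesis's actual toolbox, is the empirical semivariogram, which summarizes how dissimilar observations become as a function of the distance between their locations:

```python
import math
from collections import defaultdict

def empirical_variogram(points, values, bin_width, max_dist):
    """Matheron's classical semivariogram estimator:
    gamma(h) = average of (z_i - z_j)^2 / 2 over pairs at distance ~ h."""
    sums, counts = defaultdict(float), defaultdict(int)
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(points[i], points[j])
            if d <= max_dist:
                b = int(d / bin_width)
                sums[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return {b * bin_width: sums[b] / counts[b] for b in sorted(sums)}

# toy data: a smooth surface z = x + y on a grid, so dissimilarity grows with distance
pts = [(x, y) for x in range(10) for y in range(10)]
vals = [x + y for x, y in pts]
vg = empirical_variogram(pts, vals, 1.0, 6.0)
```

A rising variogram indicates spatial correlation that decays with distance; its shape is what parametric covariance models are then fitted to.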
105

Gaussian Robust Sequential and Predictive Coding

Song, Lin 10 1900 (has links)
Video coding schemes designed based on sequential or predictive coding models are vulnerable to the loss of encoded frames at the decoder end. Motivated by this observation, in this thesis we propose two new coding models: robust sequential coding and robust predictive coding. For the Gauss-Markov source with the mean squared error distortion measure, we characterize certain supporting hyperplanes of the rate region of these two coding problems. The proof is divided into three steps: 1) it is shown that each supporting hyperplane of the rate region of Gaussian robust sequential coding admits a max-min lower bound; 2) the corresponding min-max upper bound is shown to be achievable by a robust predictive coding scheme; 3) a saddle point analysis proves that the max-min lower bound coincides with the min-max upper bound. Furthermore, it is shown that the proposed robust predictive coding scheme can be implemented using a successive quantization system. Theoretical and experimental results indicate that this scheme has a desirable "self-recovery" property. Our investigation also reveals an information-theoretic minimax theorem and the associated extremal inequalities. / Doctor of Philosophy (PhD)
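The predictive-coding loop that such schemes build on can be illustrated with plain DPCM for a first-order Gauss-Markov source. This is a background sketch, not the thesis's robust scheme: the encoder quantizes the prediction residual against the decoder's own reconstruction (the "closed loop"), so quantization error never accumulates across frames:

```python
import random

def dpcm(source, a=0.9, step=0.1):
    """DPCM sketch for a first-order Gauss-Markov source: encode the quantized
    prediction residual, decode by adding it back to the prediction."""
    recon_prev = 0.0
    codes, recon = [], []
    for x in source:
        pred = a * recon_prev            # closed-loop prediction from the decoder state
        q = round((x - pred) / step)     # uniform scalar quantizer on the residual
        codes.append(q)
        recon_prev = pred + q * step     # decoder reconstruction
        recon.append(recon_prev)
    return codes, recon

random.seed(3)
x, src = 0.0, []
for _ in range(1000):
    x = 0.9 * x + random.gauss(0, 1)     # Gauss-Markov source with coefficient 0.9
    src.append(x)
codes, recon = dpcm(src)
max_err = max(abs(s - r) for s, r in zip(src, recon))
```

Because the residual is quantized to the nearest multiple of `step`, every sample's reconstruction error is bounded by `step / 2`; the thesis's robustness question is what happens to this loop when some of the `codes` are lost.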
106

Tail Estimation for Large Insurance Claims, an Extreme Value Approach.

Nilsson, Mattias January 2010 (has links)
In this thesis, extreme value theory is used to estimate the probability that large insurance claims exceed a certain threshold. The expected claim size, given that the claim has exceeded a certain limit, is also estimated. Two different models are used for this purpose. The first model is based on maximum domain of attraction conditions; a Pareto distribution is used in the other model. Different graphical tools are used to check the validity of both models. Länsförsäkring Kronoberg provided the insurance data used to perform the study. The conclusions drawn are that both models appear to be valid and that the results from the two models are essentially equal.
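The Pareto-based approach can be illustrated with a peaks-over-threshold sketch. The moment fit below is a quick stand-in for the usual maximum-likelihood fit, and the simulated claims are illustrative, not the Länsförsäkring data:

```python
import math
import random
from statistics import mean, variance

def gpd_fit_moments(excesses):
    """Method-of-moments fit of the generalized Pareto distribution to the
    excesses over a threshold (a quick POT sketch; MLE is the usual choice)."""
    m, s2 = mean(excesses), variance(excesses)
    xi = 0.5 * (1 - m * m / s2)          # shape: xi = 0 reduces to the exponential
    beta = 0.5 * m * (m * m / s2 + 1)    # scale
    return xi, beta

def tail_prob(x, u, xi, beta, zeta):
    """POT estimate of P(X > x) for x >= u, where zeta = P(X > u)."""
    if abs(xi) < 1e-12:
        return zeta * math.exp(-(x - u) / beta)
    return zeta * max(0.0, 1 + xi * (x - u) / beta) ** (-1 / xi)

random.seed(2)
claims = [random.expovariate(1) for _ in range(20000)]   # toy claim sizes
u = sorted(claims)[int(0.95 * len(claims))]              # threshold at 95% quantile
exc = [c - u for c in claims if c > u]
xi, beta = gpd_fit_moments(exc)
p_large = tail_prob(u + 5, u, xi, beta, len(exc) / len(claims))
```

For exponential claims the excesses are again exponential, so the fitted shape should be near zero and the scale near one; heavy-tailed claims would instead yield a clearly positive shape.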
107

Modelování závislosti mezi hydrologickými a meteorologickými veličinami měřenými v několika stanicích / Modelling dependence between hydrological and meteorological variables measured on several stations

Turčičová, Marie January 2012 (has links)
Title: Modelling dependence between hydrological and meteorological variables measured on several stations Author: Bc. Marie Turčičová Department: Department of Probability and Mathematical Statistics Supervisor: Prof. RNDr. Daniela Jarušková CSc., Czech Technical University in Prague, Faculty of Civil Engineering, Department of Mathematics Abstract: The aim of the thesis is to explore the dependence of daily discharge averages of the Opava river on high daily precipitation values in its basin. Three methods that can be used to analyze the dependence between high values of random variables are presented, together with their application to the studied data. The first is the tail-dependence coefficient, which measures the dependence between high values of two continuous random variables. The model for the high quantiles of the discharge at a given precipitation value was then determined non-parametrically by quantile regression and parametrically through the peaks-over-threshold (POT) method. Keywords: extremal dependence, tail-dependence coefficient, quantile regression, peaks over threshold method
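The tail-dependence coefficient has a simple empirical counterpart: condition on one variable exceeding a high quantile and count how often the other one does too. The sketch below uses toy data and an illustrative 95% level, not the Opava discharge and precipitation series:

```python
import random

def chi_estimator(x, y, q=0.95):
    """Empirical upper tail-dependence coefficient: the fraction of cases where
    y exceeds its q-quantile given that x exceeds its own q-quantile."""
    n = len(x)
    ux, uy = sorted(x)[int(q * n)], sorted(y)[int(q * n)]
    marg = sum(a > ux for a in x)
    joint = sum(a > ux and b > uy for a, b in zip(x, y))
    return joint / marg

random.seed(7)
z = [random.expovariate(1) for _ in range(20000)]
noise = [random.expovariate(1) for _ in range(20000)]
dep = [max(a, b) for a, b in zip(z, noise)]      # shares its largest values with z
indep = [random.expovariate(1) for _ in range(20000)]
chi_dep = chi_estimator(z, dep)
chi_indep = chi_estimator(z, indep)
```

For independent pairs the estimate stays near 1 - q, whereas shared extremes push it far above that baseline, which is the kind of signal the thesis looks for between precipitation and discharge.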
108

Modelagem estatística de extremos espaciais com base em processos max-stable aplicados a dados meteorológicos no estado do Paraná / Statistical modelling of spatial extremes based on max-stable processes applied to environmental data in the Parana State

Olinda, Ricardo Alves de 09 August 2012 (has links)
Most mathematical models developed for rare events are based on probabilistic models for extremes. Although the tools for statistical modeling of univariate and multivariate extremes are well developed, the extension of these tools to spatial extremes is currently a very active area of research. Modeling maxima over a spatial domain, applied to meteorological data, is important for the proper management of risks and environmental disasters in countries whose economies depend heavily on agribusiness. A natural approach to such modeling is the theory of spatial extremes and max-stable processes, the infinite-dimensional extension of multivariate extreme value theory; correlation functions from geostatistics can then be incorporated, and extremal dependence checked through the extremal coefficient and the madogram. This thesis describes the application of such processes to modeling the spatial dependence of maximum monthly rainfall in the Paraná State, based on historical series observed at meteorological stations. The proposed models consider both Euclidean space and a transformation called climatic space, which makes it possible to explain the presence of directional effects resulting from synoptic weather patterns. The methodology is based on the theorem proposed by De Haan (1984) and on the models of Smith (1990) and Schlather (2002); the isotropic and anisotropic behavior of these models is also examined through Monte Carlo simulation. Estimates are obtained by maximum pairwise likelihood, and the models are compared using the Takeuchi information criterion. The fitting algorithm is fast and robust, providing good statistical and computational efficiency in modeling maximum monthly rainfall and allowing the modeling of directional effects resulting from environmental phenomena.
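The madogram mentioned above has a compact empirical version, the F-madogram, which yields the pairwise extremal coefficient theta between two sites. The sketch below uses independent toy "stations" rather than the Paraná rainfall series; theta = 1 means complete extremal dependence and theta = 2 asymptotic independence:

```python
import random

def extremal_coefficient(x, y):
    """F-madogram estimate of the pairwise extremal coefficient theta in [1, 2]:
    nu = 0.5 * E|F(x) - F(y)| with empirical CDFs, theta = (1 + 2 nu) / (1 - 2 nu)."""
    n = len(x)
    rx = {v: (i + 1) / (n + 1) for i, v in enumerate(sorted(x))}  # empirical CDF ranks
    ry = {v: (i + 1) / (n + 1) for i, v in enumerate(sorted(y))}
    nu = sum(abs(rx[a] - ry[b]) for a, b in zip(x, y)) / (2 * n)
    return (1 + 2 * nu) / (1 - 2 * nu)

random.seed(4)
site1 = [random.expovariate(1) for _ in range(10000)]   # toy annual maxima, station 1
site2 = [random.expovariate(1) for _ in range(10000)]   # independent station 2
theta_same = extremal_coefficient(site1, site1)
theta_indep = extremal_coefficient(site1, site2)
```

In a max-stable model such as Smith's or Schlather's, theta(h) computed between station pairs at distance h is exactly the dependence summary the madogram is used to check.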
109

Algoritmos de otimização e criticalidade auto-organizada / Optimization algorithms and self-organized criticality

Castro, Paulo Alexandre de 22 April 2002 (has links)
In order to understand how things work, man has formulated scientific theories, and new methods and techniques have been created not only to increase our understanding but also to develop and even expand those theories. In this thesis, we study several such techniques (here called algorithms) with the objective of obtaining the ground states of some spin systems and eventually revealing possible properties of critical self-organization. In the second chapter, we introduce four fundamental optimization algorithms: simulated annealing, genetic algorithms, extremal optimization (EO) and Bak-Sneppen (BS). In the third chapter we present the concept of self-organized criticality (SOC), using the sandpile model as an example; to underline the importance of self-organized criticality, we show many other situations where the phenomenon can be observed. In the fourth chapter, we introduce the p-state chiral clock model, which serves as our test system. For the one-dimensional case, we first determine the corresponding transfer matrix and then prove the nonexistence of phase transitions at finite temperatures using the Perron-Frobenius theorem. We calculate the ground-state phase diagrams both analytically and numerically for p = 2, 3, 4, 5 and 6, in the numerical case using a Bak-Sneppen algorithm with random drawing (BSS). We also present a brief study of the number of local minima for the cases p = 3 and 4 of the chiral clock model. Finally, in the fifth chapter, we propose a Bak-Sneppen dynamics with noise (BSN) as a new optimization technique for discrete systems. The noise is introduced directly into the spin configuration space; consequently, the fitness takes values in a small continuous interval around its original (discrete) value. The results of this dynamics indicate the presence of self-organized criticality, evidenced by the power-law decay of the spatial and temporal correlations. We also study the EO algorithm and obtain numerical confirmation that it exhibits non-critical behavior, with infinite spatial range and an exponential decay of avalanches. At the end, we compare the efficiency of the three dynamics (EO, BSS and BSN) for the chiral clock model with respect to their ability to find the system's ground state.
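The basic Bak-Sneppen dynamics that BSS and BSN build on fits in a few lines: repeatedly replace the least-fit site and its two neighbors with fresh random fitnesses. The system size, step count, and the 0.5 check below are illustrative; in the one-dimensional model the population self-organizes so that almost all fitnesses end up above a critical threshold of roughly 2/3:

```python
import random

def bak_sneppen(n=200, steps=20000, seed=0):
    """Minimal one-dimensional Bak-Sneppen model on a ring: at every step the
    globally least-fit site and its two neighbors get new random fitnesses."""
    rng = random.Random(seed)
    f = [rng.random() for _ in range(n)]
    for _ in range(steps):
        i = min(range(n), key=f.__getitem__)       # globally worst site
        for j in (i - 1, i, (i + 1) % n):          # it and its ring neighbors
            f[j] = rng.random()
    return f

f = bak_sneppen()
frac_above = sum(v > 0.5 for v in f) / len(f)      # most fitnesses settle high
```

The avalanches of successive replacements triggered by this rule are what exhibit the power-law statistics characteristic of self-organized criticality.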
110

Quasi-random hypergraphs and extremal problems for hypergraphs

Person, Yury 06 December 2010 (has links)
This thesis first generalizes the theorem of Chung, Graham and Wilson on quasi-random graphs to so-called weak quasi-randomness for k-uniform hypergraphs, establishing a series of equivalent properties. As applications, we obtain a conceptually simple strong refutation algorithm for unsatisfiable sparse random k-SAT formulas, and we identify the first non-bipartite graphs that force quasi-randomness of graphs; previously, only bipartite graphs with this property were known. Our focus then shifts from quasi-random objects to applications of different versions of the hypergraph regularity lemmas; all these versions assert decompositions of hypergraphs into a constant number of quasi-random parts, where the meaning of "quasi-random" varies with the context. We study the family of labeled 3-uniform hypergraphs on n vertices not containing the hypergraph of the Fano plane as a subhypergraph, and show that almost all members of this family are bipartite. As a consequence, we obtain an algorithm that properly colors a random Fano-free (and hence a random bipartite 3-uniform) hypergraph in expected polynomial time. Finally, the following combinatorial extremal problem is considered. Suppose one is given r colors and a fixed k-uniform hypergraph F. In at most how many ways can one color the hyperedges of a hypergraph H on n vertices such that no monochromatic copy of F is created, and which hypergraphs H maximize this number of colorings? Here a structural result for a natural family of hypergraphs F is proven. For many hypergraphs F whose extremal hypergraph is known, we show that the extremal hypergraphs maximize the number of edge colorings (for large n) in the case of two or three colors, while for four or more colors other hypergraphs admit more edge colorings.
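The counting problem can be made concrete by brute force on a tiny case, here with graphs (k = 2) and monochromatic triangles as the forbidden pattern. This illustrates the problem statement only, not the regularity-based method of the thesis:

```python
from itertools import combinations, product

def count_colorings(n, r):
    """Brute-force count of r-colorings of the edges of K_n that contain no
    monochromatic triangle (feasible only for very small n)."""
    edges = list(combinations(range(n), 2))
    triangles = [tuple(combinations(t, 2)) for t in combinations(range(n), 3)]
    good = 0
    for colors in product(range(r), repeat=len(edges)):
        col = dict(zip(edges, colors))
        # a coloring is admissible if every triangle uses at least two colors
        if all(len({col[e] for e in tri}) > 1 for tri in triangles):
            good += 1
    return good
```

For K_3 with two colors this gives 6: all 8 colorings minus the 2 monochromatic ones. The thesis asks which host (hyper)graphs H maximize this count as n grows, and how the answer flips once four or more colors are available.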
