511

On some damage processes in risk and epidemic theories

Gathy, Maude 14 September 2010 (has links)
This thesis deals with damage processes in risk theory and biomathematics.

In risk theory, the damage process studied is that of the claims borne by an insurance company.

The first chapter examines the Markov-Polya distribution as a candidate law for modelling the number of claims and establishes links with the Katz/Panjer family of distributions. We construct the Markov-Polya law from a claim-occurrence model and show that it satisfies an elegant recurrence. This recurrence yields, in particular, an efficient algorithm for the corresponding compound distribution. We derive the Katz/Panjer family as a limiting family of the Markov-Polya law.

The second chapter deals with the so-called "Lagrangian Katz" family, which extends the Katz/Panjer family. We motivate its use as a claim-number distribution through a first-passage problem. We characterize all the laws belonging to it and derive an efficient algorithm for the compound distribution. We also examine its index of dispersion and its asymptotic behaviour.

In the third chapter, we study the finite-horizon ruin probability in a discrete model with positive interest rates. We determine an algorithm as well as various bounds for this probability. One particular bound allows us to construct two risk measures. We also examine the use of proportional reinsurance with retention levels that are equal or different over successive periods.

In the epidemic setting, the damage studied is the spread of a disease of SIR type (susceptible - infected - removed). The way an infective contaminates susceptibles is described by particular survival distributions. We derive the distribution of the total number of people infected by the end of the epidemic. We examine in detail the so-called Markov-Polya and hypergeometric epidemics, and we then approximate this law by a branching process. We also study a similar damage process in reliability theory, where the deterioration consists of the propagation of cascading failures in a system of interconnected components. / Doctorat en Sciences / info:eu-repo/semantics/nonPublished
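The "efficient algorithm for the corresponding compound distribution" mentioned above belongs to the family of Panjer-type recursions. As an illustration only — the thesis's own Markov-Polya recurrence is not reproduced in the abstract — here is a minimal Python sketch of the classical Panjer recursion for the Katz/Panjer class, whose claim-count pmf satisfies p_k = (a + b/k) p_{k-1}:

```python
import numpy as np

def panjer_compound_pmf(a, b, p0, severity_pmf, s_max):
    """Panjer recursion for the pmf of S = X_1 + ... + X_N, where the claim
    count N lies in the Katz/Panjer class, p_k = (a + b/k) * p_{k-1}, and the
    severities X_i are iid on {1, 2, ...} with pmf severity_pmf
    (severity_pmf[x] = P(X = x), severity_pmf[0] = 0)."""
    f = np.asarray(severity_pmf, dtype=float)
    g = np.zeros(s_max + 1)
    g[0] = p0  # P(S = 0) = P(N = 0) when P(X = 0) = 0
    for s in range(1, s_max + 1):
        x = np.arange(1, min(s, len(f) - 1) + 1)
        g[s] = np.sum((a + b * x / s) * f[x] * g[s - x])
    return g

# Example: compound Poisson(lambda = 2) with severities uniform on {1, 2, 3}.
# For the Poisson case, a = 0, b = lambda, p0 = exp(-lambda).
lam = 2.0
sev = [0.0, 1/3, 1/3, 1/3]
pmf = panjer_compound_pmf(a=0.0, b=lam, p0=np.exp(-lam), severity_pmf=sev, s_max=20)
print(pmf.sum())  # close to 1 for s_max large enough
```

The recursion evaluates the compound pmf in O(s_max × max-severity) operations, avoiding the convolution sums that make direct evaluation expensive.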
512

Some Contributions to Distribution Theory and Applications

Selvitella, Alessandro 11 1900 (has links)
In this thesis, we present some new results in distribution theory for both discrete and continuous random variables, together with their motivating applications. We start with some results about the multivariate Gaussian distribution and its characterization as a maximizer of the Strichartz estimates. Then, we present some characterizations of discrete and continuous distributions through ideas coming from optimal transportation. After this, we turn to Simpson's Paradox and see that it is ubiquitous, appearing in quantum mechanics as well. We conclude with a group of results about discrete and continuous distributions invariant under symmetries, in particular under the groups $A_1$, an elliptical version of $O(n)$, and $\mathbb{T}^n$. All the results proved in this thesis are motivated by applications in different research areas, and these applications are thoroughly discussed. We have tried to keep each chapter self-contained, recalling results from other chapters when needed. The following is a more precise summary of the results discussed in each chapter.

In Chapter 2, we discuss a variational characterization of the multivariate normal (MVN) distribution as a maximizer of the Strichartz estimates. Strichartz estimates are a fundamental tool in proofs of well-posedness for dispersive PDEs. In contrast to the characterization of the MVN distribution as a maximizer of the entropy functional, the characterization as a maximizer of the Strichartz estimate does not require the constraint of fixed variance. In this chapter, we compute the precise optimal constant for the whole range of Strichartz-admissible exponents, discuss the connection of this problem to restriction theorems in Fourier analysis, and give some statistical properties of the family of Gaussian distributions which maximize the Strichartz estimates, such as Fisher information, index of dispersion and stochastic ordering. We conclude the chapter by presenting an optimization algorithm to compute the maximizers numerically.

Chapter 3 is devoted to the characterization of distributions by means of techniques from optimal transportation and the Monge-Ampère equation. We emphasize methods for statistical inference on distributions that do not possess good regularity, decay or integrability properties — for example, distributions which do not admit a finite expected value, such as the Cauchy distribution. The main tool used here is a modified version of the characteristic function (a particular case of the Fourier transform). An important motivation for developing these tools comes from Big Data analysis, and in particular the Consensus Monte Carlo algorithm.

In Chapter 4, we study Simpson's Paradox: the phenomenon, appearing in some datasets, whereby subgroups sharing a common trend (say, all negative) show the reverse trend (say, positive) when they are aggregated. Although this issue has an elementary mathematical explanation, its statistical implications are deep. Basic examples appear in arithmetic, geometry, linear algebra, statistics, game theory and sociology (e.g. gender bias in graduate school admissions). In our new results, we prove the occurrence of Simpson's Paradox in quantum mechanics. In particular, we prove that it occurs for solutions of the quantum harmonic oscillator, in both the stationary and the non-stationary case, and that the phenomenon is not isolated: it appears (asymptotically) in the context of the nonlinear Schrödinger equation as well. The likelihood of Simpson's Paradox in quantum mechanics and its physical implications are also discussed.

Chapter 5 contains some new results about distributions with symmetries. We first discuss a result on symmetric order statistics: we prove that the symmetry of any of the order statistics is equivalent to the symmetry of the underlying distribution. Then, we characterize elliptical distributions through group invariance and give some of their properties. Finally, we study geometric probability distributions on the torus with applications to molecular biology. In particular, we introduce a new family of distributions generated through stereographic projection, establish several of their properties, and compare them with the von Mises distribution and its multivariate extensions. / Thesis / Doctor of Philosophy (PhD)
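For readers unfamiliar with Simpson's Paradox, the classical kidney-stone treatment data (Charig et al., 1986) give a concrete arithmetic instance — unrelated to the quantum-mechanical results above, but showing the aggregation reversal in four lines of counts:

```python
# Treatment A beats B within each subgroup, yet loses after aggregation,
# because A was given disproportionately to the harder (large-stone) cases.
groups = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},  # (successes, trials)
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

totals = {"A": [0, 0], "B": [0, 0]}
for name, arms in groups.items():
    for arm, (succ, n) in arms.items():
        totals[arm][0] += succ
        totals[arm][1] += n
        print(f"{name:12s} {arm}: {succ / n:.3f}")

for arm, (succ, n) in totals.items():
    print(f"aggregate     {arm}: {succ / n:.3f}")  # A: 0.780, B: 0.826
```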
513

Numerical analysis and multi-precision computational methods applied to the extant problems of Asian option pricing and simulating stable distributions and unit root densities

Cao, Liang January 2014 (has links)
This thesis considers new methods that exploit recent developments in computer technology to address three extant problems in the area of Finance and Econometrics. The problem of Asian option pricing has endured for the last two decades in spite of many attempts to find a robust solution across all parameter values. All recently proposed methods are shown to fail when computations are conducted using standard machine precision because as more and more accuracy is forced upon the problem, round-off error begins to propagate. Using recent methods from numerical analysis based on multi-precision arithmetic, we show using the Mathematica platform that all extant methods have efficacy when computations use sufficient arithmetic precision. This creates the proper framework to compare and contrast the methods based on criteria such as computational speed for a given accuracy. Numerical methods based on a deformation of the Bromwich contour in the Geman-Yor Laplace transform are found to perform best provided the normalized strike price is above a given threshold; otherwise methods based on Euler approximation are preferred. The same methods are applied in two other contexts: the simulation of stable distributions and the computation of unit root densities in Econometrics. The stable densities are all nested in a general function called a Fox H function. The same computational difficulties as above apply when using only double-precision arithmetic but are again solved using higher arithmetic precision. We also consider simulating the densities of infinitely divisible distributions associated with hyperbolic functions. Finally, our methods are applied to unit root densities. Focusing on the two fundamental densities, we show our methods perform favorably against the extant methods of Monte Carlo simulation, the Imhof algorithm and some analytical expressions derived principally by Abadir. Using Mathematica, the main two-dimensional Laplace transform in this context is reduced to a one-dimensional problem.
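A flavor of the multi-precision approach can be reproduced outside Mathematica. For instance, the Python mpmath library performs arbitrary-precision numerical Laplace inversion via Talbot's deformed-contour method, related in spirit to the Bromwich-contour deformation discussed above. A toy sketch on a transform with a known inverse (not the Geman-Yor transform, whose implementation is considerably more involved):

```python
from mpmath import mp, invertlaplace, exp

mp.dps = 50  # work with 50 significant digits to suppress round-off propagation

# Toy transform with known inverse: F(s) = 1/(s + 1)  <->  f(t) = exp(-t)
F = lambda s: 1 / (s + 1)

for t in [0.5, 1.0, 5.0]:
    approx = invertlaplace(F, t, method='talbot')  # deformed Bromwich contour
    print(t, approx, exp(-t))
```

Raising mp.dps is the mpmath analogue of the thesis's central remedy: forcing more arithmetic precision so that contour-based inversion remains stable where double precision breaks down.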
514

Electroproduction de pions neutres dans le Hall A au Jefferson Laboratory / Neutral pion electroproduction in Hall A at Jefferson Laboratory

Fuchey, E. 22 June 2010 (has links) (PDF)
The past decade has seen a strong evolution in the study of hadron structure through exclusive processes, giving access to a more complete description of this structure. Exclusive processes include deeply virtual Compton scattering as well as exclusive meson production at high energy. This document focuses on the latter, and more specifically on exclusive neutral pion production. This thesis describes the analysis of triple-coincidence events H(e,e'gamma gamma)X, an abundant by-product of the DVCS experiment that ran during the fall of 2004 in Hall A at Jefferson Laboratory, in order to extract the cross section of ep → epπ0. This cross section was measured at two values of four-momentum transfer, Q2 = 1.9 GeV2 and Q2 = 2.3 GeV2. The statistical precision achieved for these measurements is better than 5%. The kinematic domain allows the study of the Q2 and W dependence of the cross section. These results were compared with calculations inspired by Regge phenomenology, as well as with the predictions of the generalized parton distribution formalism. An interpretation in the framework of semi-inclusive deep inelastic scattering is also discussed.
515

En undersökning om möjligheterna att använda återanvändningsbara pallband till enhetslaster omlastade till träpallar. / An investigation of the possibility to use reusable pallet strapping for unit loads reloaded to wooden pallets.

Björk, Tomas January 2007 (has links)
An investigation of the possibility of using reusable pallet strapping and of the economic conditions for doing so. A survey of what exists on the market. Calculation of the forces that the unit load can expose the strapping to. Development of original proposals for reusable pallet strapping with a quick buckle and strapping on three sides. An overview of the positive and negative effects of a change from today's strapping with PP-strap to reusable pallet strapping.
516

Logaritmicko-konkávní rozděleni pravděpodobnosti a jejich aplikace / Logarithmic-concave probability distributions and their applications

Zavadilová, Barbora January 2014 (has links)
No description available.
517

Inference for the K-sample problem based on precedence probabilities

Dey, Rajarshi January 1900 (has links)
Doctor of Philosophy / Department of Statistics / Paul I. Nelson / Rank-based inference using independent random samples to compare K>1 continuous distributions, called the K-sample problem, is developed and explored based on precedence probabilities. There are many parametric and nonparametric approaches, most dealing with hypothesis testing, to this important, classical problem. Most existing tests are designed to detect differences among the location parameters of the different distributions. The best known and most widely used of these is the F-test, which assumes normality. A comparable nonparametric test was developed by Kruskal and Wallis (1952). When dealing with location-scale families of distributions, both of these tests can perform poorly if the differences among the distributions are in their scale parameters rather than in their location parameters. Overall, existing tests are not effective in detecting changes in both location and scale. In this dissertation, I propose a new class of rank-based, asymptotically distribution-free tests, built on precedence probabilities, that are effective in detecting changes in both location and scale. Let $X_i$ be a random variable with distribution function $F_i$, and let $\Pi$ be the set of all permutations of $(1,2,\dots,K)$. Then $P(X_{i_1} < \dots < X_{i_K})$ is a precedence probability if $(i_1,\dots,i_K) \in \Pi$. Properties of these tests are developed using the theory of U-statistics (Hoeffding, 1948). Some of these new tests are related to volumes under ROC (Receiver Operating Characteristic) surfaces, which are of particular interest in clinical trials whose goal is to use a score to separate subjects into diagnostic groups. Motivated by this goal, I propose three new index measures of the separation or similarity among two or more distributions; these indices may be used as "effect sizes". In a related problem, properties of precedence probabilities are obtained and a bootstrap algorithm is used to estimate intervals for them.
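A precedence probability has a natural unbiased estimator: the U-statistic given by the proportion of cross-sample K-tuples occurring in the stated order. A minimal sketch (illustrative only; the dissertation's test statistics are built on top of such estimates):

```python
import itertools
import numpy as np

def precedence_probability(samples):
    """Unbiased U-statistic estimate of P(X_1 < X_2 < ... < X_K):
    the fraction of cross-sample K-tuples, one observation drawn from each
    sample, that appear in strictly increasing order."""
    count, total = 0, 0
    for tup in itertools.product(*samples):
        total += 1
        count += all(a < b for a, b in zip(tup, tup[1:]))
    return count / total

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=30)
y = rng.normal(0.5, 1.0, size=30)
z = rng.normal(1.0, 1.0, size=30)
# Under identical distributions the value is 1/K! = 1/6; ordered means push it up.
print(precedence_probability([x, y, z]))
```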
518

Modelos COM-Poisson com correlação / COM-Poisson models with correlation

Pereira, Glauber Márcio Silveira 23 April 2019 (has links)
In this thesis two discrete distributions are proposed: the correlated COM-Poisson (CPC) and the generalized partially correlated COM-Poisson (CPGPC). Regression models are also proposed for the generalized partially correlated Poisson (PGPC) distribution introduced by Luceño (1995). We compute the probability mass function (pmf) for all distributions under two parametrizations. The distributions are constructed using the same expansion employed by Luceño (1995) in the construction of the generalized partially correlated Poisson distribution. The CPC(l;f;r) distribution is the same expansion of the zero-inflated COM-Poisson distribution ZICMP(m;f;r). For the CPGPC(l;f;r;L;K) distribution, the characteristic function, probability-generating function, moments, and maximum likelihood estimators were determined for the two parametrizations. We implemented the pmf, quantile function and random number generator of these distributions in R.
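The COM-Poisson building block named above has pmf proportional to λ^k/(k!)^ν, with a normalizing constant that lacks a closed form and is usually evaluated by truncating its series. A minimal, numerically naive Python sketch (a careful implementation would work on the log scale; the thesis's correlated extensions are not attempted here):

```python
import math

def com_poisson_pmf(k, lam, nu, tol=1e-12, max_terms=10_000):
    """pmf of the standard COM-Poisson distribution:
    P(X = k) = lam**k / (k!)**nu / Z(lam, nu),
    where Z is approximated by truncating its series once terms are negligible."""
    def term(j):
        return lam ** j / math.factorial(j) ** nu
    Z = 0.0
    for j in range(max_terms):
        t = term(j)
        Z += t
        if j > lam ** (1.0 / nu) and t < tol:  # past the mode and negligible
            break
    return term(k) / Z

# nu = 1 recovers the Poisson distribution: compare with exp(-lam) * lam**k / k!
lam = 3.0
print(com_poisson_pmf(2, lam, nu=1.0), math.exp(-lam) * lam ** 2 / 2)
```

The parameter ν controls dispersion: ν < 1 gives overdispersion relative to the Poisson, ν > 1 underdispersion, and ν = 1 recovers the Poisson exactly.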
519

Network mechanisms of memory storage in the balanced cortex / Mécanismes de réseau de stockage de mémoire dans le cortex équilibré

Barri, Alessandro 08 December 2014 (has links)
No French abstract available. / It is generally maintained that one of the cortex's functions is the storage of a large number of memories. In this picture, the physical substrate of memories is thought to be realised in the pattern and strengths of synaptic connections among cortical neurones. Memory recall is associated with neuronal activity that is shaped by this connectivity. In this framework, active memories are represented by attractors in the space of neural activity. Electrical activity in cortical neurones in vivo exhibits prominent temporal irregularity. A standard way to account for this phenomenon is to postulate that recurrent synaptic excitation and inhibition, as well as external inputs, are balanced. In the common view, however, these balanced networks do not easily support the coexistence of multiple attractors, which is problematic in view of memory function. Recently, theoretical studies showed that balanced networks with synapses that exhibit short-term plasticity (STP) are able to maintain multiple stable states. In order to investigate whether experimentally obtained synaptic parameters are consistent with model predictions, we developed a new methodology capable of quantifying both response variability and STP at the same synapse in an integrated and statistically principled way. This approach yields higher parameter precision than standard procedures and allows for the use of more efficient stimulation protocols. However, the findings with respect to STP parameters do not allow conclusive statements about the validity of synaptic theories of balanced working memory. In the second part of this thesis an alternative theory of cortical memory storage is developed. The theory is based on the assumptions that memories are stored in attractor networks, and that memories are represented not by network states differing in their average activity levels but by micro-states sharing the same global statistics; different memories differ with respect to their spatial distributions of firing rates. From this the main result is derived: the balanced state is a necessary condition for extensive memory storage. Furthermore, we analytically calculate the memory storage capacity of rate-neurone networks. Remarkably, crucial properties of neuronal activity and physiology that are consistent with experimental observations are directly predicted by the theory if optimal memory storage capacity is required.
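The short-term plasticity referred to here is commonly formalized by the Tsodyks-Markram model of resource depletion and facilitation. A minimal sketch under one standard textbook convention — the parameter values below are generic placeholders, not the fitted values from the thesis:

```python
import numpy as np

def tsodyks_markram_psc(spike_times, U=0.2, tau_d=0.5, tau_f=1.0):
    """Relative PSC amplitudes u*x at each presynaptic spike under the
    Tsodyks-Markram short-term plasticity model (one common convention:
    facilitation u is incremented before release, resources x depleted after)."""
    u, x = 0.0, 1.0
    last_t = None
    amps = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            u *= np.exp(-dt / tau_f)                   # facilitation decays to 0
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_d)  # resources recover to 1
        u += U * (1.0 - u)      # spike increments utilization
        amps.append(u * x)      # released fraction of available resources
        x *= (1.0 - u)          # deplete resources
        last_t = t
    return np.array(amps)

# A 20 Hz train: amplitudes depress when tau_d dominates, facilitate when tau_f does.
train = np.arange(0.0, 0.5, 0.05)
print(tsodyks_markram_psc(train))
```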
520

Contribuições em inferência e modelagem de valores extremos / Contributions to extreme value inference and modeling.

Pinheiro, Eliane Cantinho 04 December 2013 (has links)
Extreme value theory is applied in research fields such as hydrology, pollution studies, materials engineering, traffic management, economics and finance. The Gumbel distribution is widely used in the statistical modeling of extreme values of natural processes such as rainfall and wind, and it is also important in the context of survival analysis for modeling lifetime on the logarithmic scale. The statistical modeling of extreme values of a natural process such as wind or humidity is important in environmental statistics; for example, understanding extreme wind speed is crucial in catastrophe/disaster protection. Lately this is of particular interest, as extreme natural phenomena have become more common and intense. The majority of papers on extreme value theory for modeling extreme data rely on moderate or large sample sizes. The Gumbel distribution is often considered, but the resulting fit may be poor in the presence of outliers since its skewness and kurtosis are constant. We deal with the statistical modeling of extreme-event data based on extreme value theory. We consider a general extreme-value regression model family introduced by Barreto-Souza & Vasconcellos (2011), who addressed the issue of correcting the bias of the maximum likelihood estimators in small samples. Our first goal is to derive hypothesis test adjustments in this class of models. We derive Skovgaard's adjusted likelihood ratio statistic (Skovgaard, 2001) and five adjusted signed likelihood ratio statistics, which have been proposed by Barndorff-Nielsen (1986, 1991), DiCiccio & Martin (1993), Skovgaard (1996), Severini (1999) and Fraser et al. (1999). The adjusted statistics are approximately distributed as $\chi^2$ and standard normal with high accuracy. The adjustment terms have simple compact forms which may be easily implemented in readily available software. We compare the finite-sample performance of the likelihood ratio test, the signed likelihood ratio test and the adjusted tests obtained in this work, and we illustrate the application of the usual tests and their modified versions on real datasets. The adjusted statistics are closer to their respective limiting distributions than the usual ones when the sample size is relatively small. Simulation results indicate that the adjusted statistics can be recommended for inference in extreme value regression models with small or moderate sample sizes.

Parsimony is important when data are scarce, but flexibility is also crucial, since a poor fit may lead to a completely wrong conclusion. A literature review was conducted to list distributions which nest the Gumbel distribution. Our second goal is to evaluate their parsimony and flexibility. For this purpose, we compare such distributions with regard to moments, skewness, kurtosis and tail index. The larger families obtained by introducing additional parameters, which have the Gumbel distribution embedded in them, present flexible skewness and kurtosis, while the skewness and kurtosis of the Gumbel distribution are constant. Among these distributions, the generalized extreme value distribution is the only one whose tail index can be any positive real number, while the tail indices of the other distributions investigated here are zero. We notice that some generalizations of the Gumbel distribution studied in the literature are not identifiable; hence, for these models, meaningful interpretation and estimation of individual parameters are not feasible. We select the identifiable distributions and fit them to a simulated dataset and to real wind speed data. As expected, such distributions fit the Gumbel-simulated data quite well. The generalized extreme value distribution and the two-component extreme value distribution fit the data better than the others in the non-negligible presence of outliers that cannot be accommodated by the Gumbel distribution, and we therefore suggest that they be applied in this context.
